Last Update 8:56 AM June 05, 2023 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Monday, 05. June 2023

Finema

KERI jargon in a nutshell. Part 2: SAID and ACDC.

Part 1: KERI and AID. Part 2: SAID and ACDC. Part 3: OOBI and IPEX. Part 4: CESR and CESR Proof Signatures. Part 5: TEL and PTEL
TL;DR 1: A SAID is an identifier that is encapsulated by the data content from which the SAID is cryptographically derived. Using a SAID provides enhanced security for managing and reasoning about the identified data.
TL;DR 2: An ACDC is a variant of a W3C Verifiable Credential with extra properties and an enhanced security model derived from the KERI protocol.

This blog continues from the KERI jargon in a nutshell (Part 1) which summarizes the key concepts and terminology related to Key Event Receipt Infrastructure (KERI). The second part here explores SAID (Self-Addressing Identifiers) and ACDC (Authentic Chained Data Container).

These topics are currently undergoing standardization at the Internet Engineering Task Force (IETF). I recommend that avid readers and developers check out the IETF drafts, which can be found here:

KERI Community: https://github.com/WebOfTrust/keri
SAID IETF Draft: https://datatracker.ietf.org/doc/draft-ssmith-said/
ACDC IETF Draft: https://datatracker.ietf.org/doc/draft-ssmith-acdc/

Table of Contents

· SAID (Self-Addressing Identifiers)
Content-Addressability
Self-Reference
SAD (Self-Addressing Data)
SAID Protocol
Advantages of Using SAIDs
· ACDC (Authentic Chained Data Container)
Simple ACDC Example
Naive Conversion to a W3C VC
· Distinguishing Features of ACDCs
1. KERI Security Model
2. Compact ACDC Format
3. Composable JSON Schema
4. ACDC Chaining
5. Ricardian Contracts
6. Chain-Link Confidentiality
7. Graduated Disclosure

Photo by Mediamodifier on Unsplash

SAID (Self-Addressing Identifiers)

A SAID (Self-Addressing Identifier) is an identifier that is content-addressable and self-referential. A SAID is uniquely and cryptographically bound to a serialization of data that includes the SAID as a component (or field) in that serialization.

Note 1: Serialization is defined as a process of translating a data structure or object state into a data format that can be stored, transmitted, and reconstructed at a later time.
Note 2: In KERI, all identifiers are SAIDs. For example, Autonomic Identifiers (AIDs) are SAIDs of their associated key inception events. Identifiers of Authentic Chained Data Containers (ACDCs) and their schemas are also SAIDs.
Content-Addressability

An identifier that is content-addressable is based on an encoded cryptographic digest and is, thus, cryptographically bound to the content that it identifies.

Self-Reference

An identifier that is self-referential appears within the content that it is identifying.

SAD (Self-Addressing Data)

SAD (Self-Addressing Data) of a SAID is defined as a representation of data content from which the SAID is derived. The SAID is both cryptographically bound to (content-addressable) and encapsulated by (self-referential) its SAD.

Note: the example SAID shown in the protocol walkthrough below starts with the character “E”. This is a CESR derivation code that specifies that the SAID is generated using a Blake3–256 digest.
SAID Protocol

A “naive” cryptographic content-addressable identifier is not self-referential. This is because the naive cryptographic derivation process of a content-addressable identifier is a cryptographic digest of the serialized content. Changing one bit of the serialization content will result in a different digest. Therefore, a self-referential content-addressable identifier requires a special derivation protocol.

The generation and verification protocol of a SAID—which makes it both content-addressable and self-referential—is outlined with an example below:

1. Here is an example of a Python dict with an empty “said” field:

{
"said": "",
"first": "Sue",
"last": "Smith",
"role": "Founder"
}

2. Serialize the data into JSON with no extra whitespace, replacing the “said” field value with a dummy string of “#” characters of the same length as the to-be-generated SAID. Here, we use a Blake3–256 digest, which encodes to 44 characters:

{"said":"############################################","first":"Sue","last":"Smith","role":"Founder"}

3. Compute the digest (Blake3–256) of the serialization that includes the dummy value for the “said” field:

EJymtAC4piy_HkHWRs4JSRv0sb53MZJr8BQ4SMixXIVJ

4. Replace the value of the “said” field with the computed and encoded SAID to produce the final serialization with the embedded SAID. The final serialization may be converted back to a Python dict by deserializing the JSON:

{
"said": "EJymtAC4piy_HkHWRs4JSRv0sb53MZJr8BQ4SMixXIVJ",
"first": "Sue",
"last": "Smith",
"role": "Founder"
}
Note: The SAID protocol preserves the insertion ordering of its content. That is, it preserves the order in which elements are added to a data structure.
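The steps above can be sketched in Python. This is a minimal illustration, not a conformant implementation: hashlib.blake2b stands in for Blake3-256 (which is not in the Python standard library), so the resulting digest value differs from the example SAID above, but the dummy-and-replace mechanics and the CESR-style encoding (one pad byte, base64url, derivation code “E”) are the same.

```python
import base64
import hashlib
import json

def saidify(data: dict, label: str = "said") -> dict:
    """Sketch of SAID derivation: fill the field with a dummy of the
    final length, digest the serialization, then embed the SAID."""
    dummy = "#" * 44  # same length as the to-be-generated SAID
    data = dict(data, **{label: dummy})
    # Serialize with no extra whitespace, preserving insertion order.
    raw = json.dumps(data, separators=(",", ":")).encode()
    # Stand-in digest; the real protocol uses Blake3-256.
    digest = hashlib.blake2b(raw, digest_size=32).digest()
    # CESR-style encoding: prepend one pad byte, base64url-encode the
    # 33 bytes (44 chars), then substitute the derivation code "E".
    said = "E" + base64.urlsafe_b64encode(b"\x00" + digest).decode()[1:]
    data[label] = said
    return data

record = saidify({"said": "", "first": "Sue", "last": "Smith", "role": "Founder"})
print(len(record["said"]), record["said"][0])  # prints: 44 E
```

Verification runs the same steps: swap the embedded SAID back out for the dummy, recompute the digest, and compare.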
Advantages of Using SAIDs

Here are some advantages of using SAIDs over other types of identifiers:

SAIDs facilitate greater interoperability, reduced ambiguity, and enhanced security in the management of data.
A cryptographic commitment—such as a signature, digest, or another SAID—to a given SAID is essentially equivalent to a commitment to its associated serialization. Any change to the serialization invalidates its SAID, thereby ensuring secure immutability and evident reasoning with SAIDs about serializations (or, equivalently, their SAIDs).
SAIDs facilitate immutably referenced data serializations for applications such as Verifiable Credentials and Ricardian Contracts.

ACDC (Authentic Chained Data Container)

An ACDC (Authentic Chained Data Container) is a variant of the Verifiable Credential (VC) that is designed to be used with other protocols in the KERI ecosystem.

ACDC or Authentic Chained Data Container has the following meaning:

Authentic: ACDCs are authenticated using digital signatures that are signed and verified by keys associated with AIDs.
Chained: ACDCs may be chained together as a directed acyclic graph.
Data Container: A data container is a serialized data structure of label-value pairs. That is, it is a way of bundling together data items.

A Data Container is a way of bundling together data items. An Authentic Data Container (ADC), therefore, is a data container that provides cryptographic verifiability of the integrity and source of its bundled data. In other words, the data is both tamper-proof and non-repudiably attributable to its source. Authentic Chained Data Containers (ACDCs) are ADCs that are chained/linked together as a directed acyclic graph to provide granular proof-of-authorship (i.e., authenticity) of their contained data.

Note: A directed acyclic graph (DAG) is a directed graph with no directed cycles. That is, it consists of vertices and edges, with each edge directed from one vertex to another, such that following those directions will never form a closed loop.

As a variant of the VC, ACDCs share the same fundamental properties as traditional VCs. However, ACDCs are specifically designed to work with the KERI protocol and consequently benefit from its best-in-class security properties. Unlike W3C VCs, ACDCs also provide normative support for credential chaining. The features that distinguish ACDCs from traditional VCs will be discussed in detail later in this blog.

Basically, you can think of ACDCs as VCs on steroids!
Note: As defined in W3C Verifiable Credential Data Model v1.1, a verifiable credential is a set of tamper-evident claims and metadata that cryptographically prove who issued it. This definition also applies to ACDCs.
Simple ACDC Example

Here is an example of an ACDC from an issuer to an issuee:

{
"v": "ACDC10JSON000113_",
"d": "EFprJkdpcJSet8aLY9PaMicyyUkEFnVjrIRZTmftmhkg",
"i": "EmkPreYpZfFk66jpf3uFv7vklXKhzBrAqjsKAn2EDIPM",
"a": {
"d": "EgveY4-9XgOcLxUderzwLIr9Bf7V_NHwY1lkFrn9y2PY",
"i": "EQzFVaMasUf4cZZBKA0pUbRc9T8yUXRFLyM1JDASYqAA",
"dt": "2021-06-09T17:35:54.169967+00:00"
}
}

Here, each field has the following description:

“v” is a version string that provides the protocol type (ACDC), version (1.0), serialization type (JSON), the size of the ACDC in lowercase hexadecimal notation (00113 = 1*16**2 + 1*16 + 3 = 275 characters), and the terminator character (_).
“d” is the SAID of the ACDC.
“i” is the AID of the issuer.
“a” is the block of attributes (claims) of the holder (i.e., the data subject).

The “a” block contains the following fields:

“d” is the SAID of the nested “a” block.
“i” is the AID of the holder.
“dt” is the date-time of the issuance.
Note 1: The primary field labels in ACDCs are compact, i.e., having only one or two characters. This design choice was made so that ACDC can support resource-constrained applications such as those in supply chains and IoT.
Note 2: an ACDC may be untargeted, that is an ACDC without a holder. An untargeted ACDC provides an undirected verifiable attestation or observation of a piece of data. For example, a thermometer may make verifiable non-repudiable temperature measurements and publish them as ACDCs.
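The version string described above can be unpacked mechanically. The sketch below assumes the fixed-width layout implied by the example (a 4-character protocol type, two version digits, a 4-character serialization kind, six lowercase-hex size digits, and the “_” terminator); parse_version_string is a hypothetical helper name, not part of any KERI library.

```python
def parse_version_string(v: str) -> dict:
    """Unpack an ACDC version string such as "ACDC10JSON000113_"."""
    # Fixed-width layout assumed: 4 + 2 + 4 + 6 + 1 = 17 characters.
    assert len(v) == 17 and v.endswith("_"), "unexpected version string layout"
    return {
        "proto": v[0:4],              # protocol type, e.g. "ACDC"
        "version": f"{v[4]}.{v[5]}",  # major.minor, e.g. "1.0"
        "kind": v[6:10],              # serialization type, e.g. "JSON"
        "size": int(v[10:16], 16),    # serialized size in characters (hex)
    }

info = parse_version_string("ACDC10JSON000113_")
print(info["size"])  # prints: 275
```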
Naive Conversion to a W3C VC

The example ACDC above may be converted to a W3C Verifiable Credential (W3C VC) in a “naive” manner as follows:

{
"@context": [
"https://www.w3.org/2018/credentials/v1",
"https://www.w3.org/2018/credentials/examples/v1"
],
"issuer": {
"id": "EmkPreYpZfFk66jpf3uFv7vklXKhzBrAqjsKAn2EDIPM"
},
"validFrom": "2021-06-09T17:35:54.169967+00:00",
"credentialSubject": {
"id": "EQzFVaMasUf4cZZBKA0pUbRc9T8yUXRFLyM1JDASYqAA"
}
}

This conversion is naive since some security properties of the ACDC are lost during the conversion. For example, the SAIDs of the ACDC and its attribute block do not appear in the W3C VC.

A more secure conversion standard between ACDC and W3C VC is currently being developed by Kevin Griffin from GLEIF in this draft specification:

https://weboftrust.github.io/vc-acdc/
Note: Unlike W3C VC, an ACDC does not have a “proof” section. A digital signature is attached to the ACDC using the CESR Proof Signature protocol.
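The naive conversion above can be written as a short function. The field mapping and both @context URLs are taken verbatim from the example; naive_acdc_to_vc is a hypothetical helper name for illustration only.

```python
def naive_acdc_to_vc(acdc: dict) -> dict:
    """Naive ACDC -> W3C VC mapping from the example above.

    The SAIDs ("d" fields) are dropped entirely, which is exactly the
    loss of security properties discussed in the text.
    """
    return {
        "@context": [
            "https://www.w3.org/2018/credentials/v1",
            "https://www.w3.org/2018/credentials/examples/v1",
        ],
        "issuer": {"id": acdc["i"]},                  # issuer AID
        "validFrom": acdc["a"]["dt"],                 # issuance date-time
        "credentialSubject": {"id": acdc["a"]["i"]},  # holder AID
    }
```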
Distinguishing Features of ACDCs

ACDCs have features that distinguish them from traditional VCs. Some of the major distinguishing features include:

The security model derived from KERI,
Compact formats for resource-constrained applications,
Composable JSON Schema,
Support for credential chaining,
Support for Ricardian contracts,
Support for chain-link confidentiality, and
Support for graduated disclosure.

1. KERI Security Model

An ACDC inherits its security properties from the KERI protocol that enables the ACDC to be used in a portable but securely attributable, fully decentralized manner.

A security property that I would like to emphasize here is the one that emerges from using SAIDs as identifiers. In fact, all identifiers in ACDCs are SAIDs or URIs that contain SAIDs. Hence, a digital signature on an ACDC provides a cryptographic commitment to the ACDC itself as well as to all the data from which the SAIDs in the ACDC were derived. Examples of SAIDs in an ACDC include:

The top-level identifier of the ACDC itself and the identifiers of nested data blocks inside the ACDC: Using SAIDs as identifiers enables a compact but secure representation of the ACDC and its associated blocks.
The identifiers of the ACDC’s issuer and holder: The identifiers of the issuer and the holder (i.e., the issuee) are autonomic identifiers (AIDs), which are SAIDs of the inception events of their key event logs (KELs). A digital signature on an ACDC can be verified using the issuer’s KEL, while the holder may cryptographically prove their holdership of the ACDC using their own KEL.
The identifier of the ACDC’s status registry: The status of an ACDC is determined by a status registry, called a transaction event log (TEL). The issuance and revocation events of an ACDC are recorded as TEL entries that are anchored to the KEL of the ACDC’s issuer.
The identifier of the ACDC’s schema: The schema of an ACDC is a JSON schema whose schema id field ("$id") is the SAID of the schema. This ensures that the ACDC schema is static and verifiable against its SAID.
Note: With the use of the SAID protocol, the schema of an ACDC is static, i.e. immutable. This makes the ACDC protected against the schema malleability attack where an adversary attacks the source that provides schema resources.
2. Compact ACDC Format

An ACDC may be expressed in multiple ways, including compacted and uncompacted formats. Simple examples of the same ACDC expressed in both formats are given below.

# Compacted ACDC variant
{
"v": "ACDC10JSON000119_",
"d": "ECh56mUZGxTZtpiTrxaB7wZlQtRsD4N5pY5adc_B1748",
"i": "EKxICWTx5Ph4EKq5xie2znZf7amggUn4Sd-2-46MIQTg",
"ri": "EMrPQvNVXag7MtTXVNtOBtUmjBWj6QLpPH8QBqBITUH_",
"s": "EAxP7acR20ckPKh581y6bWrqhCMZToBhoGHLoQgqJQtW",
"a": "EBkbsuJIH_8aCUKNFFpRjT5G5_YsQ6_pZrcrCVQFnzC3"
}

# Uncompacted ACDC variant
{
"v": "ACDC10JSON00018c_",
"d": "EDycvNBB5c1cqvgOnCmwBPmcrk80bYyRVfc-G_351kO9",
"i": "EKxICWTx5Ph4EKq5xie2znZf7amggUn4Sd-2-46MIQTg",
"ri": "EMrPQvNVXag7MtTXVNtOBtUmjBWj6QLpPH8QBqBITUH_",
"s": "EAxP7acR20ckPKh581y6bWrqhCMZToBhoGHLoQgqJQtW",
"a": {
"d": "EBkbsuJIH_8aCUKNFFpRjT5G5_YsQ6_pZrcrCVQFnzC3",
"i": "ELjSFdrTdCebJlmvbFNX9-TLhR2PO0_60al1kQp5_e6k",
"dt": "2023-06-05T00:30:16.261184+00:00",
"name": "Jane Doe"
}
}

Here two more fields are introduced:

"s" is the SAID of the JSON Schema that is used to produce the ACDC. "ri" is the identifier of the ACDC’s status registry.

In the compacted variant, the value of "a" is the SAID of the attribute section, which is equal to the value of the digest "d" inside "a" of the uncompacted variant.

Compact ACDCs are useful in resource-constrained applications where the receivers/verifiers of ACDCs may cache pieces of data by their SAIDs. Other uses of compact ACDCs will be discussed in later sections of this blog.
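The relationship between the two variants can be sketched as a compaction step: the uncompacted "a" block is replaced by its own SAID (its "d" field). compact_attributes is a hypothetical helper for illustration; a real implementation would also recompute the size in the "v" version string and the top-level SAID "d", which this sketch deliberately ignores.

```python
def compact_attributes(acdc: dict) -> dict:
    """Replace the uncompacted "a" block with the block's SAID.

    Sketch only: the "v" size field and the top-level SAID "d" would
    also change in a conformant implementation.
    """
    out = dict(acdc)  # shallow copy; the input is left untouched
    if isinstance(out.get("a"), dict):
        out["a"] = out["a"]["d"]  # the block's SAID stands in for the block
    return out
```

Applying this to the uncompacted example above yields "a" equal to "EBkbsuJIH_8aCUKNFFpRjT5G5_YsQ6_pZrcrCVQFnzC3", matching the compacted variant.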

3. Composable JSON Schema

ACDCs use JSON Schema. JSON Schema supports schema composition, which enables the use of any combination of compacted and uncompacted sections and subsections in an ACDC. With this property, the ACDC verification protocol allows the same ACDC to be expressed in multiple forms while still being verifiable against the same digital signature from an issuer.

For the above examples of compacted and uncompacted ACDCs, the schema of the attribute section "a" may be constructed with the keyword "oneOf" as follows:

{
"a": {
"oneOf": [
{
"description": "Attributes block SAID",
"type": "string"
},
{
"$id": "ELpLieummE-O6FzaQ7RW-yPV__x36x6XkmdFBr6ua7ec",
"description": "Attributes block",
"type": "object",
"properties": {
"d": {
"description": "Attributes block SAID",
"type": "string"
},
"i": {
"description": "Holder's AID",
"type": "string"
},
"dt": {
"description": "Issuance date time",
"type": "string",
"format": "date-time"
},
"name": {
"description": "Name",
"type": "string"
}
},
"additionalProperties": false,
"required": [
"d",
"i",
"name"
]
}
]
}
}
Note: If an ACDC schema allows for multiple variants, the issuer of such an ACDC must digitally sign the most compacted variant. A verifier who subsequently receives an uncompacted variant may convert it into the most compacted variant before verifying it against the digital signature.
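The effect of the "oneOf" keyword can be illustrated without a JSON Schema library: a value is valid if it is either the section's SAID (a string) or the uncompacted block with its required fields. The hand-rolled check below mirrors only the example schema above (including "additionalProperties": false and the required list) and is not a general validator.

```python
def valid_attribute_section(a) -> bool:
    """Check a value against the two oneOf branches of the "a" schema."""
    # First branch: the compacted form, i.e. the block's SAID string.
    if isinstance(a, str):
        return True
    # Second branch: the uncompacted block. Required fields "d", "i",
    # "name"; no properties beyond the declared "d", "i", "dt", "name".
    if isinstance(a, dict):
        declared = {"d", "i", "dt", "name"}
        required = {"d", "i", "name"}
        return required <= set(a) <= declared
    return False
```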
4. ACDC Chaining

Unlike the W3C VC data model, the ACDC specification provides normative support for credential chaining. Chaining between ACDCs forms a directed acyclic graph (DAG) where the ACDCs are nodes in the graph. The directed edges between ACDCs are represented by the field "e", which specifies the relationships between the connected ACDCs. A simple example is given below, where an edge represents a mother-son relationship between Ms. Jane Doe and Mr. John Doe.

# An example ACDC that attests to the identity of Ms. Jane Doe.
{
"v": "ACDC10JSON00018c_",
"d": "EDycvNBB5c1cqvgOnCmwBPmcrk80bYyRVfc-G_351kO9",
"i": "EKxICWTx5Ph4EKq5xie2znZf7amggUn4Sd-2-46MIQTg",
"ri": "EMrPQvNVXag7MtTXVNtOBtUmjBWj6QLpPH8QBqBITUH_",
"s": "EAxP7acR20ckPKh581y6bWrqhCMZToBhoGHLoQgqJQtW",
"a": {
"d": "EBkbsuJIH_8aCUKNFFpRjT5G5_YsQ6_pZrcrCVQFnzC3",
"i": "ELjSFdrTdCebJlmvbFNX9-TLhR2PO0_60al1kQp5_e6k",
"dt": "2023-06-05T00:30:16.261184+00:00",
"name": "Jane Doe"
}
}

# An example ACDC that attests to the identity of Mr. John Doe.
# The edge section indicates that Ms. Jane Doe is his mother.
{
"v": "ACDC10JSON000203_",
"d": "EFh8dlxwT2EjhknxNAP5xIhYmiurtGbeDQ-pS5jOaOlE",
"i": "EKxICWTx5Ph4EKq5xie2znZf7amggUn4Sd-2-46MIQTg",
"ri": "ECHkWfh71fLmkeJpeMvic6RQp5P73PizTGQelXB7ZNEE",
"s": "EAxP7acR20ckPKh581y6bWrqhCMZToBhoGHLoQgqJQtW",
"a": {
"d": "EFFD47E5Ev1zJF3zIagGEM7kbTI9DoT3scItzDPq9Jnm",
"i": "EKW8xHilZH8-CetHzBOddWpG0bS_vJlphrK3zh95u0zF",
"dt": "2023-06-05T01:07:44.885411+00:00",
"name": "John Doe"
},
"e": {
"d": "EE12DuT-V4IPumKmRsirulOeroO38aCEb4mjKy8SlE0m",
"mother": {
"n": "EDycvNBB5c1cqvgOnCmwBPmcrk80bYyRVfc-G_351kO9"
}
}
}

The edge section "e" is the representation of the directed edge that originates from the embedding ACDC. Inside the edge section, the fields have the following meanings:

"d" is the SAID of the edge section. "n" is the SAID of the ACDC to which the edge is directed, i.e., the SAID of Ms. Jane Doe’s ACDC.
Note: The edge section may also be compacted using the section’s SAID.
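Walking the DAG amounts to following "n" fields from edge to edge. The sketch below (edge_targets is a hypothetical helper name) extracts each named edge's target SAID, skipping the section's own "d" field since it is not itself an edge:

```python
def edge_targets(acdc: dict) -> dict:
    """Map each edge label in the "e" section to the SAID it points at."""
    return {
        label: edge["n"]
        for label, edge in acdc.get("e", {}).items()
        if isinstance(edge, dict) and "n" in edge  # skip the section SAID "d"
    }

# Trimmed version of Mr. John Doe's ACDC from the example above.
john = {
    "d": "EFh8dlxwT2EjhknxNAP5xIhYmiurtGbeDQ-pS5jOaOlE",
    "e": {
        "d": "EE12DuT-V4IPumKmRsirulOeroO38aCEb4mjKy8SlE0m",
        "mother": {"n": "EDycvNBB5c1cqvgOnCmwBPmcrk80bYyRVfc-G_351kO9"},
    },
}
print(edge_targets(john))  # prints: {'mother': 'EDycvNBB5c1cqvgOnCmwBPmcrk80bYyRVfc-G_351kO9'}
```

The returned SAID is exactly the "d" of Ms. Jane Doe's ACDC, so a verifier can resolve and verify the parent credential before trusting the edge.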
5. Ricardian Contracts

An ACDC may contain a rule section "r" to provide a Ricardian Contract. To satisfy the properties of a Ricardian contract, the rule section may be hierarchically structured into sections and subsections with named or numbered clauses in each section. The labels on the clauses may follow such a hierarchical structure using nested maps or blocks.

Note: A Ricardian contract is a type of digital contract that combines legal and financial terms with computer-readable code. It aims to create a machine-readable representation of a legal agreement by embedding the terms of the contract within a digital document. This approach allows both humans and machines to interpret and execute the contract.

An example of an ACDC with a rule section for warranty and liability disclaimers is given below.

# An example ACDC with a rule section
{
"v": "ACDC10JSON0003e0_",
"d": "EKZ0qdcyz2Mpl9QNuP0p1Sd0vr1Ov4g4wsDeZ7DJqGh6",
"i": "EKxICWTx5Ph4EKq5xie2znZf7amggUn4Sd-2-46MIQTg",
"ri": "ECHkWfh71fLmkeJpeMvic6RQp5P73PizTGQelXB7ZNEE",
"s": "EAxP7acR20ckPKh581y6bWrqhCMZToBhoGHLoQgqJQtW",
"a": {
"d": "EDhg1jZaNPJdYpiNwQRsjlZMiXF6XYAzlNhgkGBfXwMb",
"i": "EKW8xHilZH8-CetHzBOddWpG0bS_vJlphrK3zh95u0zF",
"dt": "2023-06-05T01:14:49.919153+00:00",
"name": "John Doe"
},
"r": {
"d": "ENyB1FGejfsC2MoYXJO9WOFJttmB3lw5NC0y_dSBPlq0",
"usageDisclaimer": {
"l": "Usage of a valid, unexpired, and non-revoked credential does not assert that the holder is trustworthy, honest, reputable in its business dealings, safe to do business with, or compliant with any laws or that an implied or expressly intended purpose will be fulfilled."
},
"issuanceDisclaimer": {
"l": "All information in a valid, unexpired, and non-revoked credential is accurate as of the date the validation process was complete. The credential has been issued to the person named in the credential as the subject."
}
}
}

In the rule section,

"r" is the rule section of the ACDC. "d" is the SAID of the rule section. "l" is the text of a Ricardian contract clause.
Note: A rule section and its subsections may be compacted using the section’s and subsections’ SAIDs, respectively.
6. Chain-Link Confidentiality

One of the most important features of ACDCs (using their rule sections "r") is support for Chain-Link Confidentiality that imposes contractual restrictions and liability on a recipient (i.e. a disclosee) of confidential information. This property provides a powerful mechanism for the protection of personal data and confidential information.

The concept was first coined by Woodrow Hartzog and is defined as follows:

“A chain-link confidentiality regime would contractually link the disclosure of personal information to obligations to protect that information as the information moves downstream. The system would focus on the relationships not only between the discloser of information and the initial recipient but also between the initial recipient and subsequent recipients.”
[Woodrow Hartzog. “Chain-link confidentiality.” Georgia Law Review 46 (2011): 657.]

This approach links recipients of information in a chain, with each recipient bound by the same obligation to protect the information. These chain contracts would contain at least three kinds of terms:

obligations and restrictions on the use of the disclosed information;
requirements to bind future recipients to the same obligations and restrictions; and
requirements to perpetuate the contractual chain.

An exchange of an ACDC compatible with chain-link confidentiality starts with a disclosure offer from a discloser to a recipient. This offer includes a metadata ACDC with the information to be disclosed such that the recipient can agree to those terms. Once the recipient has accepted the terms (with a non-repudiable digital signature) then full disclosure of ACDC may be made.

A simple example of chain-link confidentiality.

7. Graduated Disclosure

A discloser of an ACDC may choose not to reveal the entire content of the ACDC in their initial interaction with the recipient. Instead, the discloser may opt to partially or selectively present the information contained within the ACDC that is needed to further a transaction. Subsequently, the discloser would disclose further information only after the recipient agrees to the terms established by the discloser. This process may involve multiple steps, where the discloser progressively reveals more information as the recipient consents to additional terms. This process is called graduated disclosure.

Note: Here, “graduated” means “arranged in a series” or “divided into different levels.”

ACDC leverages graduated disclosure that enables adherence to the principle of least disclosure, which is expressed as follows:

The system should disclose only the minimum amount of information about a given party needed to facilitate a transaction and no more.

ACDCs employ three types of graduated disclosure mechanisms, including compact, partial, and selective disclosures. Different types of disclosure may be defined as follows:

Compact disclosure: Compact disclosure of an ACDC discloses the SAID of its content (blocks or sections of the ACDC). Both partial and selective disclosure rely on compact disclosure.
Partial disclosure: The content of an ACDC can be partially disclosed via the exchange of only the SAID of a given content block, i.e., using compact disclosure. The SAID of a content block provides a cryptographically equivalent commitment to the yet-to-be-disclosed content, so a later exchange of the uncompacted content is verifiable against the earlier partial disclosure. Another type of partial disclosure is the disclosure of metadata about the content.
Selective disclosure: In selective disclosure, the set of attributes is provided as an array of blinded blocks, where each attribute in the set has its own dedicated blinded block. A selectively-disclosable attribute section appears at the top level using the field label "A", which is distinct from the field label "a" for a non-selectively-disclosable attribute section.

To give an example that demonstrates the difference between partial and selective disclosure, let’s consider the following attribute block "a":

# An example of an attribute block "a"
"a": {
"d": "EA5Fm9mP5tWreQTw2xjiXYDDGzuPVtqyfekYIz6y2jMY",
"i": "ELjSFdrTdCebJlmvbFNX9-TLhR2PO0_60al1kQp5_e6k",
"name": "Jane Doe",
"address": "1600 Pennsylvania Avenue NW, Washington, DC 20500, USA"
}

With selective disclosure, the attribute block may be transformed into a selectively-disclosable attribute block as follows:

# An example of a selectively-disclosable attribute block "A"
"A": [
{
"d": "EJ1FZBKhdVcAiaZS6eHAHMrL_ANysjLkyyOXTyAEA-lb",
"u": "0AAJ1MJgB3aqLHNencIF-CcH",
"i": "ELjSFdrTdCebJlmvbFNX9-TLhR2PO0_60al1kQp5_e6k"
},
{
"d": "EL1EbFKUR_brFCt3knKOW92STd67AuNbRfmUzgplV12G",
"u": "0ABLGaIT_A5yLMDEBUCCPaF7",
"name": "Jane Doe"
},
{
"d": "EN-gWdH5AWQwfdyGmK6CSA5ciXKZbvTYEvKXaSE-74rL",
"u": "0AAlI8guN7866YoSeK7cZR5M",
"address": "1600 Pennsylvania Avenue NW, Washington, DC 20500, USA"
}
]

where each selectively-disclosable block has its own SAID "d" and a random universally unique identifier (UUID) "u".

By using random UUIDs for each selectively-disclosable block, an adversary who possesses both the schema of the attribute block and its SAID is not able to discover the remaining contents of the attribute block in a computationally feasible manner by using, e.g., a rainbow table attack. Therefore, the UUID "u" field of each attribute block enables the associated SAID "d" field to securely blind the block’s contents.
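The blinding effect of the "u" field can be demonstrated with plain digests. As before, hashlib.blake2b stands in for the Blake3-256/CESR encoding of the real protocol, and block_digest is a hypothetical helper; the point is only that a random salt inside the block makes the digest of a correctly guessed value useless to an attacker.

```python
import base64
import hashlib
import json
import secrets

def block_digest(block: dict) -> str:
    """Digest of a serialized block (blake2b as a stand-in for Blake3)."""
    raw = json.dumps(block, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(
        hashlib.blake2b(raw, digest_size=32).digest()
    ).decode()

# Without a salt, an adversary who knows the schema can confirm a
# guessed attribute value by recomputing the digest (rainbow-table style):
published = block_digest({"name": "Jane Doe"})
assert block_digest({"name": "Jane Doe"}) == published  # guess confirmed

# With a random "u" field inside the block, even a correct guess of
# "name" produces a different digest unless the salt is also known:
salted_block = {"u": secrets.token_urlsafe(18), "name": "Jane Doe"}
blinded = block_digest(salted_block)
assert block_digest({"u": secrets.token_urlsafe(18), "name": "Jane Doe"}) != blinded
```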

Note 1: The field labels of the selectively disclosable attributes are also blinded because they only appear within the blinded block. This prevents un-permissioned correlation via contextualized variants of a field label that appear in a selectively disclosable block.
Note 2: The primary difference between partial disclosure and selective disclosure is determined by the correlatability with respect to its encompassing block after its disclosure. A partially disclosable field becomes correlatable to its encompassing block after its disclosure whereas a selectively disclosable field does not. After selective disclosure, the selectively disclosed fields are not correlatable to the so-far undisclosed but selectively disclosable fields in the same encompassing block.

KERI jargon in a nutshell. Part 2: SAID and ACDC. was originally published in Finema on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 04. June 2023

SC Media - Identity and Access

Secure-by-design space systems pushed amid increased cyber threats

Ongoing cyberattacks against satellite communications systems have prompted Space Systems Cybersecurity Standard working group members to call for the development of secure-by-design specifications for space system components, CyberScoop reports.



Over 2.5M individuals impacted by Harvard Pilgrim Health Care ransomware attack

Massachusetts-based non-profit health service firm Harvard Pilgrim Health Care has confirmed that more than 2.55 million of its current and former members had their sensitive data compromised following a ransomware attack in April, BleepingComputer reports.



The most overhyped identity trends, according to cybersecurity investors

Identiverse panelists cite identity solutions and concepts whose short-term trajectories might not live up to the buzz surrounding them.



Three tips for leaders grappling with the cybersecurity workforce challenge

By focusing on the big picture, becoming serious about developing the workforce, and looking for people other than computer science majors, companies can close the skills gap.



FindBiometrics

Acuity Market Intelligence and FindBiometrics Present: The Biometric Digital Identity Prism

Earlier this year, FindBiometrics hosted “The Road Ahead for Biometrics and Digital Identity” Virtual Summit — a full day of online panels and interview style sessions about the hottest topics […]

IBM Blockchain

Accelerating AI & Innovation: the future of banking depends on core modernization


In the rapidly evolving landscape of financial services, embracing AI and digital innovation at scale has become imperative for banks to stay competitive. With the power of AI and machine learning, financial institutions can leverage predictive analytics, anomaly detection and shared learning models to enhance system stability, detect fraud and drive superior customer-centric experiences. As we step into 2023, the focus has shifted to digital financial services, encompassing embedded finance, generative AI and the migration of super apps from China into a global phenomenon. And all this while balancing the adoption of a hybrid multicloud strategy. For banks to stay relevant and competitive in this new world, it is imperative for them to adjust to new trends, understand the importance of open finance and transform their core systems. Ultimately, banks must start with modernizing their core through technologies like hybrid multicloud and AI.  

Generative AI: unleashing new opportunities 

Generative AI, exemplified by the explosion in advanced large language model solutions on the market and seen most recently via the launch of IBM watsonx, offers exciting possibilities in financial advisory and data analysis. While the unexplored future of generative AI poses opportunities in deterministic financial environments, configuring these models properly can simplify complex financial concepts and enable easier understanding for customers. Financial institutions must carefully leverage generative AI to strike the right balance between innovation and ethical usage. This is why IBM puts all of its AI technologies through rigorous processes and protocols to offer trustworthy solutions.  

In such a highly regulated industry like banking, it is that much more important for clients to have this access to the toolset, technology, infrastructure, and consulting expertise to build their own — or fine-tune and adapt available AI models — on their own data and deploy them at scale in a more trustworthy and open environment to drive business success. Competitive differentiation and unique business value will be able to be increasingly derived from how adaptable an AI model can be to an enterprise’s unique data and domain knowledge. 

Embedded finance: redefining customer experiences 

Embedded finance has emerged as a rapidly growing trend, revolutionizing the way customers interact with financial products and services. Banks now have the opportunity to seamlessly integrate financial capabilities into various contexts, such as online commerce or car buying and emerging digital ecosystems, without disrupting customer workflows. By embedding financial services into everyday activities, banks can deliver hyper-personalized and convenient experiences, enhancing customer satisfaction and loyalty. 

The rise of super apps: transforming digital ecosystems 

Super apps, popular in China, have the potential to reshape the financial services landscape globally. By consolidating multiple applications and services under a single entity, super apps offer customers a comprehensive ecosystem that seamlessly integrates digital identity, instant payment, and data-driven capabilities. As embedded finance gains traction and open banking APIs become more prevalent, the vision of super apps is becoming a reality. Financial institutions need to adapt to this emerging trend and actively participate in the evolving digital ecosystems to deliver enhanced value and cater to evolving customer expectations. 

Open finance: accelerating the API-driven economy 

Open banking has been a topic of discussion for several years, with PSD2 regulations driving initial progress. Now open finance, an extension of PSD2, is set to open up even more services and foster an API-driven economy. With open finance, banks are compelled to open up additional APIs beyond payment accounts, enabling greater innovation and competition in the financial sector. This shift toward data-driven economies places embedded finance at the core of financial services. Forward-thinking banks are not only complying with regulatory requirements but also proactively leveraging open finance to distribute their services efficiently and reach customers wherever they are. 

The critical need for modernizing core systems and the role of hybrid cloud 

In this new paradigm of AI-powered digital finance, modernizing core systems becomes imperative for banks to deliver seamless experiences, leverage emerging technologies, and remain competitive. Traditional legacy systems often lack the flexibility, scalability and agility required to support the integration of embedded finance, generative AI and open finance. By transforming core systems, banks can create a solid foundation that enables the seamless integration of new technologies, facilitates efficient API-driven ecosystems and enhances the overall customer experience. 

Hybrid multicloud plays a crucial role in facilitating the shift. It allows banks to leverage the scalability and flexibility of public cloud services while maintaining control over sensitive data through private cloud and on-premises infrastructure. By adopting a hybrid multicloud approach, banks can transform their core systems, leverage AI and machine learning capabilities, ensure data security and compliance and seamlessly integrate with third-party services and APIs. The hybrid cloud provides the agility and scalability necessary to support the rapid deployment of new digital services, while also offering the control and customization required by financial institutions. 

Modernization starts at the core 

However, transforming core systems and transitioning to a hybrid cloud infrastructure is not a one-size-fits-all solution. Each bank has unique requirements, existing technology landscapes and strategic goals. It is crucial to align the technology roadmap of fintech solutions with the overall bank strategy, including the digital strategy. This alignment ensures a competitive advantage, sustainability and a seamless convergence between the two roadmaps. Collaboration between banks, fintech providers and IBM can facilitate this alignment and help banks navigate the complexities of digital transformation. 

The financial services industry is undergoing a profound transformation driven by AI, digital innovation and the shift toward digital financial services. Embedded finance, generative AI, the rise of super apps, and open finance are reshaping customer experiences and creating new opportunities for financial institutions. To fully leverage these transformative trends, banks must transform their core systems and adopt a hybrid multicloud infrastructure. This transformation not only enables seamless integration of new technologies but also enhances operational efficiency, agility and data security. As banks embark on this journey, strategic alignment between the technology roadmap and the overall bank strategy is paramount. 

Learn about IBM’s consulting solutions for the financial services industry

The post Accelerating AI & Innovation: the future of banking depends on core modernization appeared first on IBM Blog.


SEC’s climate disclosure rule proposal explained

The Securities and Exchange Commission (SEC) has issued a rule proposal to standardize the way organizations make climate-related disclosures. The rule proposal would require US publicly traded companies to disclose annually how their businesses are assessing, measuring and managing climate-related risks. This would include disclosure of greenhouse gas emissions as a measure of exposure to climate-related risk.

The Securities and Exchange Commission (SEC) has issued a rule proposal to standardize the way organizations make climate-related disclosures. The rule proposal would require US publicly traded companies to disclose annually how their businesses are assessing, measuring and managing climate-related risks. This would include disclosure of greenhouse gas emissions as a measure of exposure to climate-related risk.

The proposed rule would standardize climate-related disclosures for investors, allowing them to clarify exposure to risk and potential impact on the business operations or financial condition of the organization they are investing in.

Why the SEC’s climate disclosure rule proposal matters

This rule proposal follows global efforts in recent years to standardize climate-related disclosure requirements for organizations.

While many companies already disclose their GHG footprint, there are discrepancies with how this is reported even within the same industries. The SEC’s rule proposal aims to harmonize emissions reporting, ensuring data is comparable and transparent for shareholders, investors and the public.

If enacted, the enforceable nature of the rule proposal will also require companies who have never previously reported on their GHG emissions to do so—increasing the significance of climate-related risks to portfolio managers.

Evidence from other geographies shows the significant impact these mandates can have on emission reduction. Mandates drive action, as seen in Australia when the National Greenhouse and Energy Reporting (NGER) Act was introduced in 2007, which now includes hundreds of registrants reporting on their energy production, consumption and GHG emissions.

The United Kingdom is also taking up the mantle this year with plans to mandate UK-registered companies and financial firms to disclose their emissions, and the European Union is set to force all large companies listed on the European stock exchange to report their emissions beginning in 2024. 

How will organizations be impacted if the SEC’s rule is enacted?

The SEC’s proposed climate disclosure rules are targeted at large, publicly listed US companies. The rule proposal includes some flexibility around Scope 3 emissions reporting including an exemption for smaller reporting companies.

The SEC’s climate-related proposal requirements

The SEC’s proposal is aligned with existing recommendations from the Task Force on Climate-related Financial Disclosures (TCFD).

The SEC’s proposed rule amendments would require organizations to disclose certain climate-related information including:

Greenhouse gas (GHG) emissions, Scopes 1, 2 and 3 (reported to an auditable standard)
Disclosure of climate-related risk, impacts, targets and goals
Systematic management of offsets and RECs
Articulation and management of a transition plan
Finance-grade reporting aligned with TCFD

Next stage in the SEC’s rule proposal

There has been an extensive public comment period since the proposed rules were published on the SEC’s website. The agency will take those comments into consideration before issuing a final rule, which will be voted on by the SEC’s commissioners.

In its fact sheet, the SEC stated that the new requirements would be phased in over several years. The largest companies would need to start disclosing climate risks in 2023, while other firms would have until 2024.

Envizi will continue to closely monitor developments as the SEC’s climate disclosure proposal moves through consultation stages, and as further announcements by the SEC are made.

The SEC supported by ESG reporting software

IBM Envizi’s existing suite of ESG reporting solutions are well placed to support SEC’s proposed rules announced in March 2022, by supporting organizations to meet stringent ESG reporting commitments within an auditable, single system of record built on the GHG Protocol.

Scope 1 & 2 emissions disclosure

Envizi’s Scope 1 and 2 GHG Accounting and Reporting module enables the automatic data capture from a variety of sources, performs robust GHG accounting aligned with the GHG Protocol, captures custom emissions factors, and manages market-based emissions calculations.
Envizi can meet the SEC’s requirement to express these emissions by disaggregated constituent greenhouse gases in the aggregate, in absolute terms, and in terms of intensity (per unit of economic value or production).
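As a rough illustration of the absolute-versus-intensity distinction the SEC proposal calls for, the sketch below uses invented figures (not Envizi data or output) to show the underlying arithmetic:

```python
# Hypothetical figures for illustration only
scope1_t = 12_500.0   # tonnes CO2e from direct (Scope 1) sources
scope2_t = 8_300.0    # tonnes CO2e from purchased energy (Scope 2)
revenue_musd = 450.0  # economic value, in millions of USD

# Absolute emissions: the simple aggregate in tCO2e
absolute_t = scope1_t + scope2_t

# Intensity: emissions per unit of economic value (tCO2e per $M revenue)
intensity = absolute_t / revenue_musd

print(f"Absolute: {absolute_t:.0f} tCO2e")
print(f"Intensity: {intensity:.1f} tCO2e per $M revenue")
```

Disaggregation by constituent gas works the same way, with one line per gas (CO2, CH4, N2O, etc.) converted to CO2e before aggregation.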

Scope 3 emissions disclosure

Envizi’s Scope 3 GHG Accounting and Reporting module enables the capture of upstream and downstream GHG emissions data, calculates emissions using a robust analytics engine and categorizes emissions by value chain supplier, data type, intensities and other metrics to support auditability.

Climate-related risks & impacts

Envizi’s ESG Reporting Frameworks module manages the people, processes, external references and supporting documents required to:

respond to disclosures about climate risk and impacts
respond to disclosures about the governance associated with assessing those climate-related risks and impacts.

Managing offsets and RECs

If carbon offsets or renewable energy certificates (RECs) have been used, information about the carbon offsets or RECs, including the amount of carbon reduction represented by the offsets or the amount of generated renewable energy represented by the RECs, can be tracked in Envizi.

Alignment with TCFD

Envizi’s ESG Reporting Frameworks module includes pre-built templates aligned with frameworks such as TCFD, SASB, and GRI, which can be used as a reference point for managing a set of SEC disclosures. When the proposed disclosures have been finalized by SEC, Envizi will create a standard SEC disclosure template with links to specific disclosures in other frameworks to streamline disclosures to multiple frameworks.

Scenario analysis

Envizi’s Sustainability Program Tracking module supports the ability to track and manage sustainability initiatives and efficiency programs to optimize investment decisions, define a portfolio of projects to meet targets and verify the savings achieved from projects.

The post SEC’s climate disclosure rule proposal explained appeared first on IBM Blog.


Modernizing child support enforcement with IBM and AWS

The IBM and AWS partnership can accelerate your child support enforcement modernization journey. The post Modernizing child support enforcement with IBM and AWS appeared first on IBM Blog.

With 68% of child support enforcement (CSE) systems aging, most state agencies are currently modernizing them or preparing to modernize. More than 20% of families and children are supported by these systems, and as their constituents become more consumer technology-centric, reliance on antiquated technology is unsustainable. At this point, families expect state agencies to have a modern, efficient child support system.

The following are some factors driving these states to pursue modernization:

Current systems are developed on mainframe-based systems using an antiquated, scarcely available technology stack. The unavailability of a skilled workforce makes these systems expensive to operate, manage and enhance.
Child support programs are focusing on pioneering a new, holistic and family-focused delivery perspective to make families feel included, listened to and empowered.
Modernization will free caseworkers’ time to focus on and collaborate with families, rather than focusing on navigating an aging system.

To fulfill these ever-increasing citizen expectations for state-provided child support services, statewide child support systems need to do the following:

Empower families to get help when, where and how they need it, including virtual and real-time communications.
Provide quick and transparent services to ease stress and frustration in times of need.
Be intuitive, user-friendly and cut down on manual effort.
Automate routine tasks to free up time to provide personalized services and build relationships with families.
Empower caseworkers with online tools to collaborate with colleagues and access knowledge repositories.

Existing systems simply are not able to provide the capabilities needed to realize this family-centered approach of service delivery and need to be modernized, immediately.

Every U.S. state is different

Core system requirements—such as new case initiation, providing location and establishment services, enforcing orders, handling financial transactions, etc.—are mandated by the Office of Federal Child Support Enforcement (OCSE), making many state child support enforcement (CSE) systems similar.

Despite this commonality, every child support organization is unique, and each needs a tailored modernization approach that supports its vision and understands the reality “on the ground.” For example, the core systems technology landscape for each state could be a mainframe legacy system with varying degrees of maturity, portability, reliability and scalability.

States’ existing investment in modernizing ancillary systems (e.g., improved document management capabilities, web portals, mobile applications, data warehouses, enhanced location services, etc.) might negate the need for modernization for these systems. This, along with the overall maturity of CSEs, uninterrupted service requirements, state budget, timing, staffing, and other factors mandate a comprehensive modernization approach to pull in these investments.

IBM Consulting’s Accelerated Incremental Mainframe Modernization

Legacy applications, while being the current crown jewels of states’ IT ecosystems, hinder business agility and are expensive and complicated, therefore making any rebuild, refactor or integration attempts risky.

IBM Consulting’s pioneering Accelerated Incremental Mainframe Modernization (AIMM) approach focuses on legacy modernization with a lens of incremental transformation, rather than just translation. Our approach focuses on business data domain-centric initiatives that can deliver value in the short term and create a development ecosystem conducive to incremental optimization—instead of a risky big-bang application and infrastructure update.

As highlighted in the graphic below, AIMM’s approach focuses on starting with a business and technology map alongside the client-specific IT ecosystems, instead of only a code conversion.

This helps us create a client journey with the co-existence of legacy and incrementally migrated functionality on the new digital core, eventually strangling and sunsetting the legacy platforms.

The approach leverages IBM’s zOS connectors, allowing us to shift the focus from application to data, move mainframe data in mainframe format into cloud storage (with near real-time, bidirectional data sync), and leverage new and modern cloud data management services. With its proven tools and processes, AIMM meets clients where they are in the legacy modernization journey, analyzing (auto-scan) legacy code, extracting business rules, converting it to modern language, deploying it to any cloud, and managing technology for transformational business outcomes. Ultimately, this helps states to realize their business goals and deliver value faster.

A proven methodology, IBM assets, tools, cloud provider services and our ecosystem of partners and alliances drive IBM’s end-to-end mainframe modernization approach and deliver business value.

The following are some of the industry-leading methodologies, tools and assets used by AIMM:

IBM Garage brings together industry best practices, technical expertise and knowledge, client collaboration and partnership, cloud service providers, and the IBM Consulting teams to deliver accelerated outcomes.
IBM Consulting Cloud Accelerator (ICCA) recommends client journeys without deep engineering knowledge, covering execution and modernization steps that take a workload from a source to a cloud destination.
IBM Asset Analysis Renovation Catalyst (AARC) automatically extracts the rules from the COBOL application source code and generates a knowledge model, which is then deployed into a rules engine like IBM Operational Decision Manager, Pega, etc.
IBM Operational Decision Manager (ODM) enables businesses to respond to real-time data by applying automated decisions, enabling business users to develop and maintain operational systems decision logic.
IBM’s open-source automated workflows provide cross-cloud, AI-powered software designed to accelerate application modernization with pre-integrated data, automation and security capabilities, enabling business and IT teams to build and modernize applications faster.

Why IBM Consulting and AWS?

IBM is a Premier Consulting Partner for AWS, with 19,000+ AWS-certified professionals across the globe, 16 service validations and 15 AWS competencies—the fastest among the top 16 AWS Premier GSIs to secure that many AWS competencies and certifications, within 18 months. At re:Invent 2022, IBM Consulting was awarded the Global Innovation Partner of the Year and the GSI Partner of the Year for Latin America, cementing client and AWS trust in IBM Consulting as a trusted partner of choice when it comes to AWS.

AWS has the biggest cloud infrastructure services vendor market share worldwide, averaging around 33% as of Q4 2022. AWS provides AWS GovCloud (US) Regions, a flexible, pay-as-you-go pricing model, reliability, FedRAMP compliance and flexibility of compute. This makes AWS a great choice for federal and state agencies looking to achieve operational efficiencies and innovate on demand to advance their mission across the nation.

As more states are finding new ways to leverage mainframes as part of their modernization strategy with hybrid cloud, IBM is leading the way and working with state and local governments across various sectors, helping deliver more responsive services, digitize their operations, and advance equity and access.

As an example, IBM Consulting worked with the Department of Children and Families of a U.S. state to modernize their highly complex legacy application systems—comprising 200 databases and 72 application modules—into a strategic enterprise AWS cloud platform, resulting in 200M lines of Java code.

Leveraging IBM Garage Method for cloud, IBM Consulting teams executed a proven factory-based approach (process, skills, tools) to do the following:

Deliver throughput and consistency in migration and rationalization.
Standardize on a core set of platforms, infrastructure and tools across environments—spanning multiple agencies and service providers.
Empower end users with self-service catalog management.

This resulted in a 50% reduction in annual operating costs, improving application response time by 3.5x, benefiting end-user experience, and moving the state to a “pay-as-you-go” model for a reduced upfront commitment.

In 2022, IBM Consulting leveraged AIMM and worked with another major U.S. city agency serving 19M citizens. They led the modernization and migration of 29 user-facing applications, developing standardized DevOps, technology, and quality-assurance processes while modernizing and migrating three legacy applications to AWS by Q1 2023.

The IBM and AWS partnership can accelerate your CSE modernization journey

For a state considering modernizing the legacy mainframe system, IBM recommends the combination of one or more of the following patterns:

Migration to AWS with no/minimal code changes, using a middleware emulator.
Migration to AWS with refactoring (auto-refactoring of code to Java/.Net).
Rearchitecting and modernizing with new channels of information delivery (leveraging microservices).
Data migration to the cloud for analytics and insights.

For all these patterns, business functional equivalence is essential, as detailed below.

Mainframe modernization patterns and landing zone.

Leveraging our extensive AWS experience, IBM Consulting has solidified these patterns as accelerators to help clients looking to migrate to AWS. Below is a high-level schematic.

Business functional equivalence is essential for automation and project speed.

At AWS re:Invent 2022, IBM shared five hybrid application modernization patterns, complete with reference architectures for DevOps/API lifecycles and demos, and made the IBM Z Cloud Modernization stack available on AWS Marketplace. In addition, we have developed reference architectures for deploying IBM Mainframe z/OS on AWS with IBM ZD&T, an AWS-native development environment reference architecture for modernizing z/OS applications, and a flexible, integrated application modernization and IT automation platform, enabling states to accelerate application modernization, achieve IT automation, reduce risk of application changes, create and deploy secure APIs, and reduce the need for specialized skills.

For code refactoring—in addition to home-grown tools and third-party tools—IBM is training its resources on AWS Blu Age (a recent AWS acquisition) and focusing on simplifying and optimizing the modernization journey by automating the majority of the code-refactoring process.

This approach has yielded immediate benefits for our clients, including providing better ROI, frontloading the savings to power modernization through reinvestment, and providing a gradual ramp up for states’ human workforce—critical for building the skills necessary for operating a mission-critical workload.

Learn more

States and their child support enforcement systems need a technology partner that is committed to listening and understanding the reality of where the state is on the modernization journey. They need a partner that will work in tandem with a state to tailor an approach that supports its vision, constraints and existing investments.

IBM brings deep child-support industry knowledge, leading-edge technologies and expertise on AWS through a unique combination of IBM Consulting, software and as-a-service capabilities. From concept to scale, our end-to-end solutions and proven methods help child support agencies modernize their applications and infrastructure with speed and consistency, so states can stay ahead of the curve.

IBM is proud to be a trusted partner of choice for many states and is eager to help modernize support systems designed to look after the welfare of our most prized asset—our families.

Learn more about AWS Consulting Services


The post Modernizing child support enforcement with IBM and AWS appeared first on IBM Blog.


SC Media - Identity and Access

New milestones prove passkeys are ‘ready for prime time,’ says FIDO Alliance leader

The FIDO Alliance, a nonprofit standards organization, announced new UX guidelines, corporate use case documentation and more at the 2023 Identiverse conference.



Entrust

Digitization, remote signing, and eIDAS, part 1/2

It’s not about signing; it’s about the use case In this two-part blog series we... The post Digitization, remote signing, and eIDAS, part 1/2 appeared first on Entrust Blog.
It’s not about signing; it’s about the use case

In this two-part blog series we provide an overview of the problems around digitization, specifically on the challenges with remote signatures and some of the options available to limit the risks of fraud and increase confidence in your signing process.

The move to electronic signatures over pen-and-paper ones did not start with the COVID-19 pandemic – in fact, most of today’s laws regulating the use of electronic signatures were published in the early 2000s to facilitate the growing needs of online commerce.

But, it’s fair to say that the pandemic did represent an important transition in the world of digitization and electronic signing: Within the last two years, electronic signatures went from a “nice to have” – and organizations who deployed them were considered innovative – to a must-have that’s here to stay. Electronic signature solutions provide so much more traceability and efficiency than the traditional pen-and-paper version that there is little chance you would think about reverting to print & sign.

What you may be wondering, however, is how to digitize the rest of your signature and other approval processes that still rely on in-person, face-to-face interactions. These processes are typically high-value and considered critical to a business; you may be reluctant to digitize them over fear that using a remote signature would increase risks of fraud, impersonation, document repudiation, nullification, etc.

Signatures are evidence of intent, consent… and identity

The admissibility of an electronic signature very much depends on the law(s) and regulation(s) that apply to your organization, and your local attorney will be best positioned to advise you on the specific requirements for your signatures.

Some industries are good at providing frameworks to enable remote business while mitigating the risk of fraud. For example, the banking sector is known for its strong regulations and standards for identity verification (i.e., Anti-Money Laundering, Know-Your-Customer, etc.) and these can easily be reused as part of a signing process.

But your organization’s industry may not have come up with such standards yet. If you are looking for solutions to ensure that your signatories are properly identified before they sign your documents remotely to limit your risks, there are several options for you to consider – and for which Entrust can help:

ID or passport checking software/service
Identity verification services from a publicly trusted certification authority (CA) – or a qualified trust service provider (QTSP) in the European Union
Private or national eID schemes

Therefore, a good first step toward the digitization of your business-critical processes is to work out which of these options would be suitable for your organization from a security, legal, and compliance point of view. Definitions and requirements for electronic signatures vary across countries and states; this is why electronic signature requirements should always be discussed on a case-by-case basis with your legal department and/or an attorney.

Software and services related to identity verification are ideal when you are looking to fully own and control the process. However, not all solutions in the market are designed to be used as part of a remote signing process, so it’s important to pick the solution adapted to your specific use case(s).

Perhaps the most widely used option for remote electronic signatures that provide strong evidence of the signatory’s identity and consent to sign is to work with public CAs (and QTSPs if you are in the European Union). These authorities specialize in verifying the identities of organizations and people as part of a signing process. The verified identity details are issued in special credentials called digital certificates, which in turn can be used to generate digital signatures.
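To sketch the underlying mechanics: a digital certificate binds a verified identity to a public key, and the matching private key produces signatures that anyone holding the certificate can verify. The example below uses the widely available Python `cryptography` library; the freshly generated key pair is an illustrative stand-in, since in a real deployment the public key would be attested by a certificate issued by a CA or QTSP after identity verification.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a key pair; in practice, the public key would be embedded in a
# CA-issued certificate that names the verified signatory.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

document = b"Contract: the undersigned agrees to the terms above."

# Sign the document with the private key (the signatory's act of consent)
signature = private_key.sign(
    document,
    padding.PKCS1v15(),
    hashes.SHA256(),
)

# Any relying party can verify with the public key from the certificate;
# verify() raises InvalidSignature if the document or signature was altered.
private_key.public_key().verify(
    signature, document, padding.PKCS1v15(), hashes.SHA256()
)
print("signature verified")
```

The evidentiary value comes from the CA's identity-verification step, not the math alone: the certificate is what links the key pair to a real person or organization.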

Finally, eID schemes are becoming more and more prominent, especially in the European Union thanks to the eIDAS Regulation, but remain limited to certain industries or countries. In the second part of this blog series, we will provide further insights into digital signatures and eID schemes, but you can already check out our Digital Signing 101 infographic, and learn more about our signing-related solutions here:

Document Signing Certificates
Digital Signing as a Service
Electronic Signing from Evidos, an Entrust company

Taking it further: Watch our latest webinar on document signing challenges and solutions

The post Digitization, remote signing, and eIDAS, part 1/2 appeared first on Entrust Blog.


Tokeny Solutions

The New Financial World: AI Meets Tokenization

The post The New Financial World: AI Meets Tokenization appeared first on Tokeny.
May 2023 The New Financial World: AI Meets Tokenization

Artificial Intelligence (AI) and blockchain are two groundbreaking technologies that have the potential to revolutionize various industries, including finance and investment. When combined, they can create powerful synergies that enable more efficient and optimized decision-making processes. The prospect of these two potent and transformative technologies intertwining is thrilling, yet somewhat daunting, given their virtually limitless potentials.

Tokeny has always been at the forefront of technology. In fact, in 2018 we started to bridge AI with tokenization. Identity checks of individual investors are already greatly accelerated by the KYC software solutions integrated into our onboarding processes. The software checks investors’ identity documents and runs “liveness” tests, using AI-powered face scans to verify that the user matches the photo on authorized documents. Furthermore, our tokenization platform represents these ID verifications on the blockchain with on-chain identities, which makes these certifications actionable. Token issuers, DeFi protocols, and other relevant users can trust these identity proofs to enforce compliance or filter their participants.
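The pattern of gating token transfers on on-chain identity claims can be sketched as follows. This is a simplified illustration, not Tokeny's actual contract interface; the registry structure and claim name are invented for the example.

```python
# Illustrative claim topic; real on-chain identity systems use numeric topics
KYC_VERIFIED = "KYC_VERIFIED"

# Stand-in for an on-chain identity registry mapping addresses to verified claims
claim_registry = {
    "0xAlice": {KYC_VERIFIED},  # Alice passed the AI-assisted ID check
    "0xBob": set(),             # Bob has no verified claims yet
}

def can_receive_tokens(address: str) -> bool:
    """A compliant token contract would consult the registry like this
    before allowing a transfer to the given address."""
    return KYC_VERIFIED in claim_registry.get(address, set())

print(can_receive_tokens("0xAlice"))  # True
print(can_receive_tokens("0xBob"))    # False
```

The key point is that the identity proof lives on-chain, so any issuer or protocol can enforce the same check without re-running KYC itself.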

However, this example is just the tip of the iceberg. We believe that AI’s ability to process and analyze vast amounts of data with speed and accuracy is invaluable in the financial realm. It can help investors and financial institutions make better-informed decisions by identifying patterns, trends, and correlations that may go unnoticed by human analysts, or simply take too much time to find. AI algorithms can analyze market data, news articles, social media sentiment, and other relevant information to generate real-time insights and predictions on markets and investment opportunities.

One area where AI excels already is in portfolio optimization. By utilizing advanced algorithms, AI can identify optimal investment strategies based on an investor’s risk tolerance, investment goals, and market conditions. It can analyze historical market data, perform risk assessments, and make dynamic adjustments to portfolios, ensuring they remain aligned with the investor’s objectives. This already exists and is continually evolving at an outrageous speed.
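A minimal sketch of what such optimization looks like under the hood is classic mean-variance allocation: maximize expected return minus a risk-aversion penalty on variance. The returns, covariances and risk-aversion value below are invented toy inputs, and real systems use far richer models and constraints.

```python
import numpy as np

# Hypothetical inputs: expected annual returns and covariance for three assets
expected_returns = np.array([0.06, 0.09, 0.12])
cov = np.array([
    [0.04, 0.01, 0.00],
    [0.01, 0.09, 0.02],
    [0.00, 0.02, 0.16],
])
risk_aversion = 3.0  # higher value = more conservative investor

# Unconstrained closed-form optimum: w ∝ Σ⁻¹ μ / λ, normalized to sum to 1
raw = np.linalg.solve(risk_aversion * cov, expected_returns)
weights = raw / raw.sum()
print(weights.round(3))
```

Raising `risk_aversion` to model a more cautious investor tilts the weights toward the low-variance asset, which is exactly the "dynamic adjustment to the investor's objectives" described above.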

However, without a programmable financial infrastructure, it would be too complex for AI applications to take action based on their conclusions and achieve the goals requested by their owner. An AI agent cannot easily open a bank account, move money, buy and sell assets, or do anything of that nature.

Here is where blockchain technology comes into play: Blockchain networks provide a programmable and secure infrastructure that enhances the transparency, integrity, and trustworthiness of financial transactions, and enables programs to access valuable financial resources. By leveraging blockchain, AI systems can access real-time, tamper-proof market data and execute trades directly on decentralized protocols.

Indeed, smart contracts, a key feature of blockchain technology, enable the automation of financial agreements and transactions. AI algorithms can leverage smart contracts to autonomously execute predefined investment strategies, eliminating the need for intermediaries and reducing transaction costs. Artificial intelligence decides, and the blockchain executes.
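The "AI decides, blockchain executes" split can be sketched as below. Both the decision rule and the contract class are hypothetical placeholders, not a real model or protocol integration; the point is only that the strategy's guardrails live in the executor, not in the AI.

```python
def ai_strategy(signal: float) -> str:
    """Toy decision rule standing in for an AI model's trading output."""
    return "BUY" if signal > 0 else "SELL"

class SmartContractExecutor:
    """Stand-in for a smart contract enforcing a predefined strategy."""
    def __init__(self, max_trade_size: float):
        self.max_trade_size = max_trade_size  # policy encoded "on-chain"
        self.ledger = []                      # append-only trade record

    def execute(self, action: str, size: float) -> bool:
        if size > self.max_trade_size:  # contract rejects out-of-policy trades
            return False
        self.ledger.append((action, size))
        return True

executor = SmartContractExecutor(max_trade_size=1_000.0)
ok = executor.execute(ai_strategy(0.7), size=250.0)
print(ok, executor.ledger)  # True [('BUY', 250.0)]
```

Because the contract's rules are enforced deterministically, the AI can be given autonomy over *when* to trade without being trusted with *whether* a trade is within policy.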

Our prediction is that the integration of AI and blockchain technologies has the potential to optimize financial and investment decisions in a variety of ways. AI’s data analysis capabilities can provide real-time insights and assist in portfolio optimization, while blockchain networks enhance transparency and automation. Together, they enable more efficient and trustworthy financial services, paving the way for a future where decentralized and AI-powered systems play a central role in optimizing our financial decisions.

Blockchain networks are the programmable financial infrastructure that AI has been missing. Now that the technology is there, the only missing piece is a large selection of tokenized assets.

What do you think about AI: Opportunity, or the path to apocalypse?

Tokeny Spotlight

PARTNERSHIP

Blocktrade partnered with us to tokenize its equity

INTERVIEW

CEO Luc was interviewed by Delano on art tokenization

NEWSLETTER

Product update on our network-agnostic platform

MEMBERSHIP

We joined the INATBA association to accelerate the adoption of tokenization

TOKENY’S TALENT

Our product owner Lautaro shares his story in this month’s Talent Interview

WEBINAR REPLAY

An insightful conversation on art tokenization with Artory and Winston Art Group

Tokeny Events

Money20/20

June 6-8, 2023 | 🇳🇱 Amsterdam

Learn More

IMPower Fundforum

June 27-28, 2023 | 🇲🇨 Monaco

Learn More

ICT Spring

June 29-30, 2023 | 🇱🇺 Luxembourg

Learn More

Market Insights

BNY Mellon deems tokenization ‘next wave of securitization’
BNY Mellon is enhancing its focus on digital assets and has plans of integrating them with all aspects of its business.

CryptoSlate

Read More

 

Luxembourg and Digital Assets: A Perfect Match
The government and financial institutions have embraced the growth of the fintech industry and favourable regulations have been implemented to support the development of new technologies.

Chambers.com

Read More

 

VP Bank: Rising Demand for Art Tokenization in Asia
Liechtenstein-based VP Bank is observing rising demand for the tokenization of collectibles in Asia, following the launch of its blockchain offering in the region.

Finews.asia

Read More

 

World Bank turns to blockchain for tokenizing infrastructure process amid regulatory challenges
The World Bank could be pivoting to blockchain technology in the future following the release of a report exploring tokenization for infrastructural projects.

Coingeek

Read More

 

A Strategic Analysis Of The Tokenization Of Real Estate Landscape
Tokenization is quickly gaining traction in the real estate sector, with traditional institutions partnering with tech providers to explore the tokenization of debt or equity.

KPMG

Read More

Compliance In Focus

Digital finance: Council adopts new rules on markets in crypto-assets (MiCA)
Setting an EU level legal framework for this sector for the first time, the Council adopted a regulation on markets in crypto-assets (MiCA).

European Council

Read More

 

Anti-money laundering: Council adopts rules which will make crypto-asset transfers traceable

Crypto asset service providers are obliged to collect and make accessible certain information about the sender and beneficiary of the transfers of crypto assets they operate.

European Council

Read More

 

Prometheum Approved as Special Purpose Broker for Digital Asset Securities Following ATS Approval
Prometheum Ember Capital LLC, a subsidiary of Prometheum Inc., has received approval from the Financial Industry Regulatory Authority (FINRA) to operate as a special-purpose broker-dealer for digital asset securities.

Crowdfund Insider

Read More

Subscribe Newsletter

A monthly newsletter designed to give you an overview of the key developments across the asset tokenization industry.


The post The New Financial World: AI Meets Tokenization first appeared on Tokeny.


IBM Blockchain

WebSphere Application Server support

An update on WebSphere Application Server support.

IBM remains committed to supporting your journey with the WebSphere platform. There is no planned end-of-support date for WebSphere 8.5.5 and 9.0.5. IBM intends to support these WebSphere releases beyond Oracle’s stated extended support date for Java 8.

For more details, see the WebSphere Application Server traditional Lifecycle.

The post WebSphere Application Server support appeared first on IBM Blog.


BlueSky

Private Beta Update & Roadmap


Since our last update, quite a bit has changed. Just a few months ago in February, we only had a couple hundred users, and in April, we crossed 50,000 users. We recently passed 100,000 users, and are excited that so many people have joined us. Our priorities continue to be a focus on moderation and curation to ensure a safe and pleasant environment for users, and protocol work to launch federation.

Recently Released Features

- Custom feeds. Users can have more ownership over their social media experience by picking the algorithms that power their feeds. So far, third-party developers have created feeds ranging from mutuals only, first posts from new users, cat photos, the #nba hashtag, and more.
- Blocks and mutelists. Blocks on Bluesky prevent interaction, and organizations and community leaders can maintain shared mutelists for others to subscribe to.
- More fine-grained content moderation controls. Users are able to select hide, warn, or show for content categories ranging from nudity to impersonation.
- Custom domains as handles. Both individuals and organizations can set their account handle to their domain, creating a unique identifier across the network. Read our tutorial here for how to set this up.
- Production desktop app. We launched the desktop web app at https://bsky.app/.
- Invite codes per account. We added server-level tooling to distribute invites on a regular cadence. Previously, accounts did not have invite codes of their own.
- Accessibility improvements. We added alt text to images, an improved dark mode, and more.
- App passwords. We released this as a short-term solution for people to use with third-party apps while we continue to work on Single Sign-On (SSO).

The AT Protocol Developer Ecosystem

As the number of beta testers on Bluesky continues to grow, so too does the size of our developer community. Recent updates from our team include:

- Open-sourced client. We published the code that powers the Bluesky social app publicly on GitHub and are accepting community contributions (but please read the rules and guidelines first!).
- Feed generator starter kit. We published a GitHub repository with a template for developers to build custom feeds.
- Community documentation. We’ve published a community page on our protocol documentation site with links to a quick start guide, community projects, and developer groups to join.
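For a rough sense of what the feed generator template does: a feed generator is a service that answers `app.bsky.feed.getFeedSkeleton` requests with an ordered list of `at://` post URIs plus a paging cursor, and the client then hydrates those URIs into full posts. The sketch below is a non-authoritative illustration; the DID, post URIs, and integer-offset cursor scheme are assumptions for the example, not the official template's choices.

```python
# Minimal sketch of a feed generator's core job: return one page of a
# "feed skeleton" (post URIs + cursor) for app.bsky.feed.getFeedSkeleton.
# The cursor scheme (integer offset as a string) is an illustrative assumption.
from typing import List, Optional


def get_feed_skeleton(posts: List[str], cursor: Optional[str] = None,
                      limit: int = 50) -> dict:
    """Return {"feed": [{"post": uri}, ...]} and a "cursor" if more remain."""
    start = int(cursor) if cursor else 0
    page = posts[start:start + limit]
    body = {"feed": [{"post": uri} for uri in page]}
    if start + len(page) < len(posts):   # more pages remain
        body["cursor"] = str(start + len(page))
    return body


# Hypothetical feed of 120 posts, paged 50 at a time.
posts = [f"at://did:plc:example/app.bsky.feed.post/{i}" for i in range(120)]
page1 = get_feed_skeleton(posts)
page2 = get_feed_skeleton(posts, cursor=page1["cursor"])
```

In the real template this function would sit behind an HTTP endpoint and pull its post list from data indexed off the network, but the skeleton-plus-cursor response shape is the essential contract.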

To see projects from community developers, you can visit this page, or if you have a Bluesky account, check out the @atproto.com account. Some highlights include:

- Flipboard added Bluesky integration, so now you can browse Bluesky content from within Flipboard.
- Skylink is a Chrome and Firefox extension that detects if the domain is linked to a Bluesky profile as you browse the web, which really exemplifies the “social internet”.
- Atlas maps the social clusters on the network, and is a fun way of visualizing communities joining the beta.
- SkyFeed surfaced custom feeds even before we released them in the official Bluesky app.

Roadmap and Growth

A few weeks ago, we intentionally slowed our invite roll-out while we built more moderation tooling and capacity for users on the app. We staffed a content moderation team with shifts that cover a 24/7 schedule, and consulted with trust and safety experts to establish new processes and policies to support a growing userbase. We’ve resumed sending out daily invites to the waitlist, which is where the majority of users already on Bluesky received their invites. For those who don’t know someone personally with an invite code, the waitlist is the fastest way to receive a code, though please be patient as we work through the list.

We recently shared our technical architecture for federation and are releasing a sandbox environment for developers to test soon.

Thanks to all who have used the feedback form within the app; we’ve heard your product feedback over the last few months. In the near future, you can expect these features and improvements as we move towards federation:

- Improved labeling, reporting, and related moderation tooling
- Account portability between servers
- Updated community guidelines for the app
- Protocol documentation and specifications

Meanwhile, we’re actively hiring. If you’re eager to join our effort to build the open social web, please apply to one of the listings on our jobs page.

Giving Feedback

When the population of beta testers was even in the tens of thousands, it was still fairly manageable for the team to receive bug reports and requests from tags and mentions directly in the app. However, this method of feedback does not scale, so we’ve implemented some changes to our preferred feedback routes. Please do not tag the entire team’s personal profiles for bug reports or support requests in the app — this makes it difficult to surface important notifications. Instead:

- To flag specific posts or profiles for the team’s attention, please use the in-app reporting feature. Our mods review every report we receive.
- For bug reports or app feedback, please use the feedback form in the app itself (located in the side menu on mobile or the right side on desktop). If you’re familiar with GitHub, you can also file an issue on our repos, though please read our contribution guidelines in the README first.
- For support requests, please email us at support@bsky.app.
- For security reports, please email us at security@bsky.app or visit https://bsky.app/.well-known/security.txt.

Thursday, 01. June 2023

Ocean Protocol

DF39 Completes and DF40 Launches. Delegation is now live.

Stakers can claim DF39 rewards. DF40 runs Jun 1 — Jun 8, 2023.

1. Overview

Data Farming Round 40 is here (DF40).

DF40 is the 12th week of DF Main, the final phase of DF. This week, users can earn rewards up to 150K OCEAN. In DF Main, weekly rewards will grow to 1M+ OCEAN.

In DF40 we’re introducing the ability for participants to engage in veOCEAN Delegation via our df.oceandao.org portal. This enables coordination for users with multiple wallets, supports APY optimizers, and allows new mechanisms to be built on top of Data Farming’s Active Rewards stream.

The article “Ocean Data Farming Main is Here” has the full details of DF Main. In fact, it’s a self-contained description of Ocean Data Farming (DF), including all the details that matter. It is up-to-date with the latest reward function, weekly OCEAN allocation, and estimates of APYs given the current amount of OCEAN staked.

DF is like DeFi liquidity mining or yield farming, but is tuned to drive data consume volume (DCV) in the Ocean ecosystem. It pays OCEAN rewards to stakers who allocate voting power to curate data assets with high DCV.

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF dapp.

DF39 counting started at 12:01am on May 25, 2023 and ended at 12:01am on Jun 1. You can claim your rewards at the DF dapp Claim Portal.

DF40 is live and will conclude on Jun 08, 2023.

DF Round 40 (DF40) is the 12th week of DF Main. Details of DF Main can be found here.

The rest of this post describes how to claim rewards (Section 2), gives an overview of DF40 (Section 3), and introduces veOCEAN Delegation (Section 4).

2. How To Claim Rewards

As a participant, follow these steps to claim rewards:

1. Go to the DF dapp Claim Portal.
2. Connect your wallet. Passive and Active Rewards are distributed on Ethereum mainnet.
3. Click “Claim”, sign the tx, and collect your rewards.

Rewards accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.

3. DF40 Overview

DF40 is part of DF Main, phase 1. This phase emits 150K OCEAN / week and runs for 52 weeks total. (A detailed DF Main schedule is here.)

Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

Some key parameters:

- Total budget is 150,000 OCEAN.
- 50% of the budget goes to passive rewards (75,000 OCEAN) — rewarding users who hold veOCEAN (locked OCEAN).
- 50% of the budget goes to active rewards (75,000 OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).

Active rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: highest-DCV asset gets most OCEAN, etc. Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.

For further details, see the “DF Reward Function Details” in DF Main Appendix.
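The per-holder bounds above can be sketched in Python for a single asset. This is a non-authoritative sketch: it takes the asset's weekly OCEAN budget as given (the rank-based split across assets is not detailed here), and it assumes the 125% APY bound is applied weekly against the holder's own stake; the DF Main Appendix has the authoritative reward function.

```python
# Sketch of the DF active-rewards bounds for one asset and one veOCEAN holder.
# Assumptions (not from the source): the APY bound is applied per week against
# the holder's stake, and total_stake already reflects any publisher doubling.

WEEKS_PER_YEAR = 52

def holder_reward(stake: float, total_stake: float, ocean_for_asset: float,
                  asset_dcv: float, is_publisher: bool) -> float:
    effective = stake * (2.0 if is_publisher else 1.0)   # publishers get 2x stake
    baseline = (effective / total_stake) * ocean_for_asset
    apy_cap = stake * 1.25 / WEEKS_PER_YEAR              # bound rewards by 125% APY
    dcv_cap = asset_dcv * 0.001                          # bound by 0.1% of DCV
    return min(baseline, apy_cap, dcv_cap)               # (prevents wash consume)

# Example: a holder with 10% of the stake on an asset allotted 500 OCEAN this
# week, where the asset's DCV is 20,000 OCEAN.
reward = holder_reward(stake=1_000, total_stake=10_000, ocean_for_asset=500,
                       asset_dcv=20_000, is_publisher=False)
# baseline = 50, APY cap ~ 24.04, DCV cap = 20 -> the DCV bound binds: 20.0
```

The example shows why the DCV bound matters: even with a large stake share, rewards cannot exceed 0.1% of the consume volume the asset actually generated.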

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behavior. Updates are always announced at the beginning of a round, if not sooner.

4. Introducing veOCEAN Delegation

To continue improving accessibility and composability of Data Farming, delegation has been introduced into the UI, and integrated into Active Rewards.

Starting in DF40, participants can delegate their Allocation Power via df.oceandao.org/delegate to facilitate coordination between wallets, support APY optimizers, or engineer novel mechanisms.

This is the same implementation as veCRV’s veDelegate mechanism, which has been available for close to a year. However, its delegation mechanism is now woven into Data Farming’s Active Rewards stream.

Addresses being delegated to will start receiving the rewards generated by participating in Data Farming and helping to curate quality assets that generate DCV (Data Consume Volume).

4.1 Steps to delegate your veOCEAN

1. Go to the DF Portal — Delegate and enter or copy the wallet address you wish to delegate to into the ‘Receiver wallet address’ field.
2. Click the Delegate button and sign the transaction.
3. You can view information about your delegation in the My Delegations component.
4. If needed, you can cancel the delegation to regain your Allocation Power before the delegation expires.

4.2 Steps after delegation

When you delegate, you transfer 100% of your veOCEAN Allocation Power for a limited period. After delegation, you cannot manage your allocations until the delegation expires.

The delegation expiration date is the same as the veOCEAN lock end date at the time of delegation. If necessary, you can extend your lock before delegating.

If you extend your lock, you have to update your delegation.

You can cancel your delegation at any time.

Conclusion

DF39 has completed. To claim rewards, go to DF dapp Claim Portal.

DF40 begins Jun 1, 2023 at 12:01am UTC. It ends Jun 08, 2023 at 12:01am UTC.

DF40 is part of DF Main. For this phase of DF Main, the rewards budget is 150K OCEAN / week.

Starting in DF40, participants can delegate their Allocation Power via df.oceandao.org to improve their UX when engaging with Data Farming and Active Rewards.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF39 Completes and DF40 Launches. Delegation is now live. was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Holochain

hApps Spotlight: Flux

Reinventing Collaboration

Flux is a fully distributed, human-centered platform that facilitates social coordination with greater privacy, agency, and flexibility. With a Discord style design for communities to grow on, Flux is customizable open-source software made for Web3 integrations.

Flux at a Glance

Description: Flux is a fully distributed, Discord style platform.

Website: Fluxsocial.io
Developer Docs: docs.fluxsocial.io/
Discord: discord.com/invite/mP4vQkVgwp
Twitter: @flux_social

Why is Flux being built?

When using traditional social media applications you are always restricted by the feature sets that the developers of that specific app have decided to implement. Discord, Facebook, Slack, and others are all implementing features like events, chat, groups, posts etc. The vision for Flux is to become a general purpose platform for collaboration. Community members themselves decide which features they want to install using our plugin store. And any developer can extend their community’s plugins by developing their own.

Why did they choose Holochain?

Flux is built on AD4M, which is an agent-centric framework for decentralized social applications. The core of AD4M uses Holochain, which makes Flux extendable and customizable while ensuring validation rules are enforced.

Flux, as a project, grew out of Junto, which had set out to build a social media platform using Holochain where people held their own data. As it developed, Junto shifted focus to more intimate communities, becoming Flux. The importance of agent-centricity and data ownership remains, however. So while there are many p2p technologies out there, Holochain continues to be a key tool for delivering on this vision.

Flux Ecosystem Session

Get a hands-on view of Flux as David Atkinson downloads and uses the app with the Flux team in an ecosystem session.

Can you say more about the collaborative potential of Flux?

While the collaborative nature of Flux seems apparent in its basic Discord-like features, its true power lies beneath the surface. First, as an AD4M app, Flux is completely interoperable with any other AD4M app — meaning you can see data from Flux in any other UI built over AD4M. This means you could turn chat messages into “to-do” stickies in a different view, without actually duplicating or siloing any data. Because AD4M decouples data from the front-end, it’s super simple to develop new apps or UIs, deploy them, and invite others to use them with you. In this way, Flux is a fully evolvable and extendable platform in which any user can develop new features and put them into play with others.

How are they using Holochain?

AD4M is a sort of spanning layer that connects lots of different protocols together, from Ethereum to AWS. AD4M uses Holochain to store these connections and ensure their security. While it is designed to connect all of these different protocols together, AD4M needs Holochain to bootstrap everything and make it open, resilient, and scalable.

In many ways AD4M is an extension upon Holochain, taking the same root ideas of agent-centricity, and giving them access to all of these other systems to interact with. Holochain is the key enabler of this social technology.

Can you say a little about your experience developing on Holochain?

The way that Flux accesses Holochain through the AD4M spanning layer makes it even easier to build applications on Holochain as the SDK really feels like working with a traditional real time database like Firebase! This is a different way of interacting with Holochain than most developer teams have, but we really think that this is just the beginning of all the ways this technology will be used. We are learning and building alongside Holochain with excitement for the future.

How can I get involved?

Join the AD4M Discord, where you can connect with us about Flux and other AD4M apps: discord.com/invite/mP4vQkVgwp

Try Flux yourself! First download ad4m (ad4m.dev/download), and then connect to Flux (app.fluxsocial.io/) and join our test community.


FindBiometrics

Is Age Verification the Next Big Biometric Market?

In the realm of digital and mobile identity technologies, selfie-based identity verification has been perhaps the most salient success story of the last few years, as a growing number of […]

IBM Blockchain

IBM Cloud Databases for Elasticsearch End of Life and pricing changes


As part of our partnership with Elastic, IBM is announcing the release of a new version of IBM Cloud Databases for Elasticsearch. We are excited to bring you an enhanced offering of our enterprise-ready, fully managed Elasticsearch. Our partnership with Elastic means that we will be able to offer more, richer functionality and world-class levels of support.

The release of version 7.17 of our managed database service will include support for additional functionality, including things like Role Based Access Control (RBAC) and Index Lifecycle Management (ILM).

Please read this announcement carefully as it contains important information regarding pricing and version lifecycle management.

End of Life for versions 7.9 and 7.10

There is action required before November 30, 2023, for your Elasticsearch version 7.9 and 7.10 deployments. On November 30, these versions will be End Of Life.

We strongly recommend that you upgrade your database deployments to version 7.17 before that date. Our recommended procedure for this is restoring from a backup, as detailed here.

Using this method will give you control of the timing and execution of your upgrade and will guarantee the integrity of your data. You will also be able to ensure that any of the breaking changes between versions detailed by Elastic (see here) do not affect your application(s).

After November 30, 2023, deployments of IBM Cloud Databases for Elasticsearch on versions 7.9 and 7.10 (the IBM Cloud Databases for Elasticsearch Standard Plan) that are still active will be forcibly upgraded to version 7.17 (the IBM Cloud Databases for Elasticsearch Enterprise Plan).

We do not recommend that you wait until then for the following reasons:

- We provide no SLAs for this type of forced migration.
- You may experience some data loss.
- Your application may experience downtime.
- Your application may stop working if it has any incompatibilities with the new version (see above about breaking changes between versions).
- You cannot control the timing of when this upgrade will happen for your deployment.
- There is no rollback process for this forced upgrade.

Pricing changes

The cost of IBM Cloud Databases for Elasticsearch deployments is increasing. All new deployments of Elasticsearch version 7.17 (the Enterprise Plan) will be charged at the new list price.

To give customers an opportunity to upgrade and take advantage of the new service features, we are holding the prices of versions 7.9 and 7.10 (the Standard Plan) at existing prices until June 30, 2023. From July 1, 2023, all deployments of Elasticsearch will be charged at the new prices, without exception.

As mentioned in the previous section, our recommendation is that you upgrade as soon as possible to ensure a stable upgrade while making full use of the service improvements.

Learn more

If you have any questions about your Elasticsearch deployments, please visit our documentation or open a support ticket.

See the complete pricing details for IBM Cloud Databases for Elasticsearch here.

The post IBM Cloud Databases for Elasticsearch End of Life and pricing changes appeared first on IBM Blog.


Entrust

Three best practices to prepare your campus ID credential issuance program for fall


In preparing to welcome students back to campus at the beginning of each year, semester, or quarter, many higher education institutions struggle with managing the process of registering students. With 80% to 90% of school identification cards printed during a short registration period, it’s not uncommon for campuses to experience long lines of students crammed into small registration offices waiting to receive their credentials. And, with a greater focus on school security and safety, campuses are increasingly adopting secure identification credentials to help mitigate unauthorized access into campus facilities.

With our expert guidance, you can learn how to ensure your card printers perform consistently and produce high-quality student ID credentials while minimizing costs and security risks. These best practices will help ensure your student ID program is ready so you can confidently and seamlessly welcome students back to campus.

Here are three essential tips every campus card office needs to know now:

Ensuring proper maintenance of your ID card printers

To ensure optimal printer performance for your student ID program, it’s crucial that the printer is cleaned according to manufacturer recommendations. We recommend cleaning after every 500 cards printed or at every ribbon change. Cleaning regularly will significantly reduce costs, as well as the need for technical cleanings, which require the services of a technician and can be expensive. Additionally, regular cleaning ensures that your printer consistently prints high-quality cards every time, that your credential administrators have a better user experience, that printer downtime and service calls are minimized, and that the life of the printer and printhead are extended.

Minimizing costs

You can do several things before the next school year begins that will help reduce future maintenance and service costs. As mentioned above, if you clean your printer with care, you will reduce the number of service requests and calls, as well as waste. This also helps extend the printhead life, which means fewer replacements to pay for. We also recommend testing your card stock and card technology to ensure they are compatible with your printer. Another thing to look at is your supplies inventory to prevent last-minute purchases. Do you have everything you need to start the school year strong? Finally, follow the printer User Guide’s storage and shelf-life recommendations to extend the printer’s life as long as possible.

Reducing security risks

ID credential issuance manufacturers often release updates regularly to enhance security. Take advantage of each update – don’t ignore them; they are there for a reason. It’s also important to follow through with any recommended software driver and firmware updates to minimize the possibility of security breaches. Ensure you have the latest version of Instant ID software to resolve any existing software and hardware issues.

To learn more about how campus card offices can prepare for back-to-school this fall, download our best practices guide here. Or, visit our website for more tips on reducing costs and waste, as well as enhancing security with every card: https://www.entrust.com/issuance-systems/instant/id-card-issuance/supplies/c/regional-supplies.

The post Three best practices to prepare your campus ID credential issuance program for fall appeared first on Entrust Blog.


Ontology

Ontology Monthly Report — May 2023


Ontology has updated the partners page on the official website, with 70+ partners now listed. We love partnerships and the collaborative spirit that fuels the Web3 movement.

Development/Corporate Updates

Development Progress

- We are 90% done with the EVM bloom bit index optimization.

- We are 85% done with the high ledger memory usage optimization.

- We are 80% done with the optimization of ONT staking liquidity.

Out & About — Event Spotlight

It was all hands on deck this month with a string of news reports and developments:

- Humpty Calderon recently joined Tech Talks Daily, talking about Ontology’s unique approach to addressing blockchain challenges and the exciting potential for content creators to take control of their work!

- Humpty Calderon joined a discussion on Oracles and Self-Sovereign Identity, hosted by UMA Protocol.
- Geoff attended a Twitter Space AMA with Getblock, sharing valuable information about Ontology and its potential use cases.
- OKX Wallet now supports the Ontology Bridge! This collaboration makes it even easier to participate in the Ontology #EVM, ensuring a seamless experience. We’re committed to making Web3 as smooth as possible.
- We celebrated our integration with iZUMi! They now support assets on the Ontology EVM, paving the way for improved liquidity and a smoother experience.
- We made a detailed guide on using iZUMi to swap ONG into ONT.
- We held a Twitter Space chat with Gameta. Discover the future of GameFi, and the power of DID in onboarding millions to Web3.
- We held a Twitter Space to delve into iZUMi’s iZiSwap support, exploring the expanding possibilities of DeFi on Ontology.
- Loyal members Sasen and Furst joined the AMA with Crypto Wallet on behalf of Ontology, discussing the ecosystem of Ontology and how ONT ID empowers the Web3 world.
- On May 25th, we held a Twitter Space and delved into Optimistic Rollups and their application in the Web3 space with other leading figures — MetisDAO, Mantle, and Goshen.
- Geoff attended the AMA on Alchemy Pay Discord and delved deeper into the world of Ontology.
- Our community on CoinMarketCap just hit the 30K mark! Thanks to each and every one of you for your support and engagement.
- Continuing our ‘Meet the Team’ series, we’re very pleased to ask several team members and Ontology Harbingers a few questions.

Product Development

- ONTO has launched a campaign together with Ivy, Zealy, and SWFT.
- ONTO launched a giveaway event with Signtn on Galxe.
- ONTO accepted Near’s invitation and joined a voice AMA in Near’s Discord.
- ONTO invited the CSO of Dreamix to join a Discord AMA.

- ONTO accepted the invitation from FIO and joined a Twitter Space talking about Bitcoin Pizza Day.

- ONTO has published an article about crypto wallets.
- ONTO has published an article about DID in crypto wallets.
- ONTO has launched a Community Quiz campaign with Hamster.
- ONTO has launched partnerships with Umi’s Friend, SWFT, BIBI, Manta Network and SightN.

On-Chain Activity

- 164 total dApps on MainNet as of May 31st, 2023.

- 7,565,981 total dApp-related transactions on MainNet, an increase of 66,250 from last month.

- 18,732,429 total transactions on MainNet, an increase of 106,577 from last month.

Community Growth & Bounties

- This month, several Ontology Community Calls and discussions were held on Discord, Telegram and Twitter, focusing on topics such as “How Web3 intersects people’s daily life”, “Bitcoin Miami Conference” and “The purpose of various layers of protocols”. Community members actively shared their views, and participants also got the chance to win Loyal Member NFTs.

- We held our Monthly Quiz led by Ontology Harbinger Benny. Community members actively raised questions and shared 100 ONG rewards.

- Our Philippines community held a Chess Tournament and members actively engaged.

As always, we’re active on Twitter and Telegram where you can keep up with our latest developments and community updates. To join Ontology’s Telegram group and keep up to date, click here.

Recruitment

At Ontology, we are always looking to expand our team. We currently have a list of open roles and are looking to hire ambitious and hardworking individuals (see below). Check out our website for full details.

Recently, we hired:

- A Marketing Manager

- A Go Engineer

- A Front-end Engineer

Now, we are still looking for:

- Senior Engineer, DevOps

- Community Operation Associate

- UI Designer

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHubDiscord


Ontology Monthly Report — May 2023 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: G20 Keeps its Sights Set on Stablecoins


Stablecoins have been featured as another controversial topic across 2020, and the G20 looks set to keep the debate a hot one into 2021. 

This week, finance ministers from the G20 countries issued a statement that included remarks on stablecoins. The statement notes that technological innovation is critical in light of the COVID-19 crisis, but it stresses that the G20 will remain alert to risks from new financial technologies and singles out stablecoins. 

According to the G20: “No so-called ‘global stablecoins’ should commence operation until all relevant legal, regulatory and oversight requirements are adequately addressed through appropriate design and by adhering to applicable standards.”

While not necessarily a new position from the G20, the statement underscores the concerns that financial policymakers have about innovations such as Facebook's Diem, formerly known as Libra, which news reports suggest may launch as soon as January. As the Financial Action Task Force (FATF) also highlighted in a report this year, financial watchdogs are concerned that the international financial system could face new systemic financial crime risks if stablecoins are launched rapidly at a global scale. The G20’s statement also coincided with the release of a report by the Bank for International Settlements examining the risks of stablecoins and potential policy responses. 

At Elliptic, we feel the financial crime risks from stablecoins are ultimately manageable and don’t require a heavy-handed response. Our blockchain analytics solutions enable cryptoasset businesses to monitor transactions in stablecoins, ensuring regulatory compliance and management of financial crime risks. 

Contact us today to learn more about how we can assist your businesses in issuing or handling stablecoins in a safe and compliant manner. 

Crypto.com Gets the Green Light from Regulators in Malta

Don’t worry, it’s not all controversy out there! This week also saw positive “firsts” for the crypto industry. 

Hong Kong-based Crypto.com became the first cryptoasset business to receive conditional approval for a license in Malta. In 2018, the European island country launched a cryptoasset legal framework that many claimed would make it a sought-after destination for crypto businesses from around the world. In practice, progress in implementing the regime has been slow. The Malta Financial Services Authority has taken its time in rolling out its licensing process – leading some crypto businesses to leave the island rather than wait for license approval.

The MFSA’s conditional approval of Crypto.com is a sign that things may be changing. Importantly, the license it is due to receive could allow Crypto.com to be grandfathered into the European Commission's proposed Markets in Crypto-Assets (MiCA) framework, which would greatly expand regulatory requirements for European crypto businesses in the future. 

We hope the news about Crypto.com’s progress in Malta is a sign of good things to come there. If your crypto business is looking to obtain regulatory approval in Malta, contact us to learn more about how Elliptic's blockchain analytics solutions can assist you in gaining regulators’ confidence. 

France’s Crypto Registration Deadline Looms

In France this week, regulators issued an important warning to cryptoasset businesses: get your registration applications in, or else!

The Autorite des Marches Financiers (AMF) and the Prudential Supervision and Resolution Authority (ACPR) issued a statement this week reminding cryptoasset exchanges and custodians they have until December 18th to register with regulators if they have already been operating in France. Those who fail to register by the deadline face two years’ imprisonment and a fine. 

The statement shows that the AMF and ACPR are serious about enforcing the cryptoasset framework France first proposed in spring 2019, and that formally rolled out in December 2019. At that time, cryptoasset exchanges and custodians already operating were given one year to register with the AMF – and that deadline is fast approaching. 

To date, five cryptoasset exchanges have had their registrations successfully approved by the AMF. Any cryptoasset exchange or custodian operating in France needs to have a robust compliance program in place if it wants its registration approved by the AMF. That includes having in place a strong blockchain analytics solution to enable it to detect high-risk wallets and transactions. 

Contact us to learn more about how we can assist your cryptoasset business in meeting the AMF's requirements. 


Crypto Regulatory Affairs: Iran and Russia Exploring Cross-border Stablecoin Settlements


In a move that could have significant implications for sanctions evasion utilizing cryptocurrencies, local Russian news outlets are reporting that Russia and Iran are considering collaboration to skirt financial and economic restrictions imposed by the US and other countries.

On January 16th, reports emerged from Russia indicating that the country intends to partner with Iran to create a stablecoin for use in cross-border trade settlement. The reports suggest that the stablecoin would be gold-backed, and would enable Russia to pay for shipping imports from Iran in the face of severe banking restrictions that both countries face due to sanctions.

The move would not mark the first time a new cryptocurrency was created to enable sanctions evasion; in 2018, the government of Venezuela launched the petro, a cryptocurrency it created to settle oil imports in the face of sanctions, while reducing reliance on the US dollar. 

The news has not yet been confirmed by sources outside Russia, and it is not clear whether either country’s government has officially blessed the proposed move. The reports also suggest Russia would not undertake such a move until it has finalized a proposed legal and regulatory framework for cryptocurrencies, which is still making its way through the legislative process. 

However, if true, the reports would hardly come as a surprise. As Elliptic’s previous research has shown, Iran has since 2018 used its vast energy reserves to engage in Bitcoin mining to circumvent sanctions, generating as much as $1 billion in revenue in the process. Last year, Iran acknowledged that it has used Bitcoin to pay for imports – a form of settlement that allows it to transact outside the banking system. 

Russia, for its part, has had a mixed history with crypto, and at one point leaned towards banning it. However, following its invasion of Ukraine in February 2022 and the subsequent imposition of expansive sanctions, Moscow has taken a more open stance – setting out a proposed regulatory framework that would allow crypto to be used in international trade settlement. Russian President Vladimir Putin has also stated publicly that the country could establish a competitive advantage in Bitcoin mining owing to its energy reserves, which would allow it to take a page from Iran’s playbook. In April 2022, the US Treasury sanctioned BitRiver, a Russian mining company, in an apparent effort to preempt Russia’s efforts to rely on mining to evade sanctions. 

Based on these developments, a prospective Russia-Iran stablecoin settlement project is hardly far-fetched. Any such attempt would undoubtedly be met with a response from the US, which in March 2018 issued sanctions prohibiting US citizens from dealing with the petro in response to Venezuela’s attempts to evade sanctions with crypto. 

Whether or not a gold-backed, Russia-Iran stablecoin comes to fruition, the news is an important reminder for crypto exchanges and financial institutions of the need to undertake proactive sanctions compliance. At Elliptic, our crypto wallet screening solution Elliptic Lens enables exchanges and FIs to identify wallets associated with entities in sanctioned jurisdictions such as Iran and Russia.

By equipping themselves with blockchain analytics solutions, businesses can protect themselves from exposure to prohibited entities and transactions – an essential risk management tool, particularly in light of increasingly large penalties the US Treasury has levied on crypto companies for failing to ensure sanctions compliance. 

Contact us to learn more about how we can assist your business in sanctions compliance for crypto. In the meantime, read our report on Sanctions Compliance in Cryptocurrencies.

FinCEN Identifies Bitzlato as a Primary Money Laundering Concern

For the first time, the US Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) has identified a crypto exchange as a Primary Money Laundering Concern.

On January 18th, the US Department of Justice and the Department of the Treasury announced a major international enforcement action against Hong Kong-registered crypto exchange Bitzlato, and the arrest of its founder and majority owner Anatoly Legkodymov – a Russian national – for money laundering.

In an order related to the action, FinCEN labeled Bitzlato as a “Primary Money Laundering Concern” under Section 9714 of the Combating Russian Money Laundering Act to disrupt its catering to known criminal activity tied to Russia – in particular, its connections to Russia-connected darknet market Hydra, as well as ransomware attackers. Covered US financial institutions – which includes cryptocurrency exchanges and banks – must cease transacting with Bitzlato, or with any account or crypto addresses it uses. 

The designation was made under Section 9714 of the Combating Russian Money Laundering Act, which has a similar effect to a FinCEN 311 Action under the USA PATRIOT Act – an authority used over the past 20-plus years on only a select number of financial institutions involved in egregious and major money laundering.

A Primary Money Laundering Concern designation is one of the most severe actions the US Treasury can take against an entity it accuses of involvement in financial crime. FinCEN’s action against Bitzlato may be the first time it has labelled a crypto business as a primary money laundering concern, but it is unlikely to be the last. 

In announcing the action, US Treasury Deputy Secretary Wally Adeyemo stated that “we are prepared to take action against any financial institution – including virtual asset service providers – with lax controls against money laundering, terrorist financing, or other illicit finance”.

Immediately after the FinCEN announcement, the Elliptic team was quick to conduct research and update our monitoring tools, so that customers could immediately identify exposure to this entity.

Compliance teams at crypto exchanges and financial institutions should take immediate steps to ensure they can adhere to these requirements set out in FinCEN’s order on Bitzlato. Because the prohibitions on dealing with Bitzlato include not only crypto transfers but also fiat currency transactions, banks and other financial institutions must ensure that they can detect exposure to Bitzlato among their fiat currency transactions, and reject any inbound transactions that involve the exchange.

Using Elliptic Discovery – our dataset of information on thousands of cryptocurrency exchanges – a financial institution can obtain additional information and identifiers related to Bitzlato that they can integrate into their transaction monitoring systems to identify potential exposure. This information is essential for ensuring that payments involving Bitzlato are appropriately rejected.  
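At its core, this kind of exposure check amounts to screening each transaction's counterparty identifiers against a list of prohibited entities. The Python sketch below is a minimal illustration of that idea only: the denylist entries and transaction records are hypothetical, and it does not represent Elliptic's products, data, or APIs.

```python
# Sketch: flagging transactions whose counterparty matches a denylisted entity.
# The identifiers below are illustrative placeholders; a real monitoring system
# would draw them from a curated dataset such as the one described above.
DENYLIST = {"BITZLATO LTD", "bitzlato.example"}  # hypothetical identifiers

def screen(transactions: list[dict]) -> list[dict]:
    """Return the transactions that should be rejected and escalated for review."""
    normalized = {entry.upper() for entry in DENYLIST}
    return [tx for tx in transactions if tx["counterparty"].upper() in normalized]

txs = [
    {"id": "t1", "counterparty": "Example Bank"},
    {"id": "t2", "counterparty": "Bitzlato Ltd"},
]
flagged = screen(txs)
print([tx["id"] for tx in flagged])  # only the second, denylisted transaction is flagged
```

In practice a compliance team would match on many identifier types (legal names, domains, account numbers, crypto addresses) and route flagged payments into a rejection and reporting workflow rather than simply filtering them.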

Crypto exchanges and financial institutions should be prepared for similar actions in the future, and should take steps to ensure they can adhere to FinCEN’s requirements. Contact us to learn more about how Elliptic’s blockchain analytics solutions can assist.

Nexo Settles With SEC For $45 Million Over Lending Product

On January 19th, the US Securities and Exchange Commission (SEC) announced that crypto exchange Nexo had agreed to pay $45 million in penalties related to a crypto lending product it offered. According to the agency, Nexo offered a product it called its Earned Interest Product (EIP), which the SEC deemed to be a security because it allowed users to earn high returns with funds the exchange used at its discretion. 

Under the settlement agreement, Nexo will pay $22.5 million to the SEC for failing to appropriately register EIP as a security, and another $22.5 million to state regulators. The firm has also agreed to stop offering EIP to its users in the US. As we noted in last week’s regulatory update, Nexo is reportedly also the subject of an investigation in Bulgaria related to fraud and money laundering issues. 

The Nexo settlement is the latest in a series of news underscoring the SEC’s enforcement focus on high yield crypto lending products. On January 12th, it charged US crypto firms Gemini and Genesis with engaging in an unregistered offering of securities through their offering of a similar lending product to Nexo’s. As Elliptic’s research has shown, US regulators have issued more than $3.3 billion in enforcement penalties on crypto firms to date – a trend that underscores the importance of proactive regulatory compliance.

MiCA Faces Two-month Delay 

New crypto rules in Europe will face a slight delay. The final vote in the EU Parliament on the Markets in Cryptoasset (MiCA) regulation has been delayed from February to April – a postponement that will shift the implementation timeline for the measures back two months as well. MiCA – a comprehensive piece of regulation aimed at promoting sound market practices among crypto firms and stablecoin issuers in the EU and protecting consumers – has generally been welcomed by the crypto industry because it offers a clear regulatory pathway forward for crypto businesses in the bloc. 

The slight delay means that MiCA’s provisions will likely come into effect between May and November 2024. The two-month shift in timelines, however, should not lead compliance teams to become complacent. It remains critical to start preparing for complying with MiCA now, so that your business can get ready for the significant incoming changes. 

Finnish Lawmaker Calls For EU DAO Framework

In other news from Europe, a senior lawmaker from Finland has called for Europe to pave the rise of the metaverse and web3 by providing a legal framework for decentralized autonomous organizations (DAOs). 

At the recent annual meeting of the World Economic Forum in Davos, Timo Harakka said that the EU should provide a harmonized legal framework for DAOs – a move that would position the EU as a leader in web3 innovation, and would ensure a common approach across the bloc. Regulators around the world have been wrestling with the question of how to address the risks and legal issues around DAOs, including in the US, where the Commodity Futures Trading Commission (CFTC) has petitioned the courts to issue a judgement on a DAO that the CFTC alleges violated US futures trading rules. 

These moves are hardly surprising. In our recent Regulatory Outlook Report, we predicted that regulators would focus increasing scrutiny on DAOs across 2023. 

Thailand Sets Out Crypto Custody Requirements

Thailand has published regulatory requirements for crypto custody businesses in the country. On January 17th, the Thai Securities and Exchange Commission (SEC) issued regulations clarifying standards that businesses engaged in crypto custody must adhere to for securing customer funds. 

The measures include requirements for custodians to have risk management frameworks in place for managing customers’ funds, ensuring they have policies in place to securely store private keys to wallets, and developing contingency plans and testing to manage potential compromise of wallets and keys.

The SEC’s regulations are designed to protect consumers from loss, including the risk of having their funds lost in cybercriminal hacks. To learn more about the country’s regulatory approach to crypto, see our Thailand country guide.


TBD

TBD partners with Benri to host the SSI Console

TBD partners with Benri to host the SSI Console: an interface for easy interaction with SSI services.

In the realm of digital identity, Self-Sovereign Identity (SSI) empowers individuals to control their personal data and online presence. SSI Console is a user-friendly SSI admin console that simplifies the management of SSI services.

With its range of features, including credential issuance, DID management, credential presentation requests, and DID registry access, the SSI Console serves as a gateway to the services hosted at SSI Service.

To learn more about how to use it, check out the SSI Console Guide.
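As a rough illustration of the kind of request an admin console like this might assemble for credential issuance, here is a minimal Python sketch. The DIDs, claim fields, and payload shape are invented for illustration and should not be taken as the SSI Service's actual API schema.

```python
# Sketch: assembling a W3C-style credential-issuance request body of the kind
# an SSI admin console might submit to an issuance endpoint. All field names
# and DID values here are hypothetical examples, not the SSI Service API.
import json
from datetime import datetime, timezone

def build_issuance_request(issuer_did: str, subject_did: str, claims: dict) -> dict:
    """Build a minimal credential-issuance request body."""
    return {
        "issuer": issuer_did,
        "subject": subject_did,
        "data": claims,
        "issuanceDate": datetime.now(timezone.utc).isoformat(),
    }

request_body = build_issuance_request(
    issuer_did="did:key:z6MkExampleIssuer",   # hypothetical issuer DID
    subject_did="did:key:z6MkExampleHolder",  # hypothetical holder DID
    claims={"membershipLevel": "gold"},
)
print(json.dumps(request_body, indent=2))
```

A console front end would send a payload along these lines to the issuance service and surface the signed credential it gets back to the administrator.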

Wednesday, 31. May 2023

SC Media - Identity and Access

CAPTCHA-breaking services gaining traction

More threat actors have been leveraging illicit services aimed at bypassing CAPTCHA checks, according to The Hacker News.



Universal 2FA implemented for PyPI project maintainers

All Python Package Index project maintainers will be required to adopt two-factor authentication by the end of the year in a bid to better prevent account takeover attacks, reports SecurityWeek.



Indicio

Putting the You in UI/UX

The post Putting the You in UI/UX appeared first on Indicio.
Your product or service lives and dies by your brand. So how can you make sure your brand is remembered and leaves a good impression? It all starts with user experience. In this article we’ll look at how UX impacts your brand, a few tips and tricks, and look at Indicio’s approach.

By Tim Spring

It is typically said that you only get one chance at a first impression. Unfortunately, when it comes to new technologies, that often means it can’t just “work”; it also needs to look and feel enticing enough to the end user to get them to try it in the first place — and come back for more.

In the decentralized identity space, many companies are working on creating digital wallets to store digital identities using verifiable credentials that are held on the end user’s mobile device. Unfortunately, given the sheer number of options people have with apps on phones, it’s not a surprise that a recent survey found roughly 25% of apps are used only once after being downloaded.

Percentage of apps used only once by year

Chart credit: https://www.statista.com/statistics/271628/percentage-of-apps-used-once-in-the-us/ 

So how do we make sure that digital wallet apps don’t suffer this fate? Verifiable credentials have to be useful. The more useful they are, and the more people can do with them in everyday life, the more they will be used. But that alone isn’t a guarantee of success. Digital wallets and verifiable credentials have to be easy to understand and easy to use. That’s obvious, right? But even that is not enough: they must be visually attractive.

A 2009 study examining the effects of visual appeal and usability on user satisfaction found that satisfaction was higher when using websites with high visual appeal and low usability than when using a site with low visual appeal but high usability. That first impression counts — but it’s often the last thing teams will think about when creating innovative technology.

If you’re wondering how long that first impression lasts, according to research done by Google you have just 17 milliseconds to “hook” someone. The secret sauce to doing that is providing a combination of low complexity and high prototypicality (basically, what the user expects to see — most users will have an assumption in their mind of what an ecommerce site or blog should look like and how they can interact with it).

When it comes to UI/UX, Indicio employs a simple, clean design. “We want someone to be able to pick up this technology and be able to use it without having to put a lot of thought into it,” says Indicio’s VP of Operations Scott Harris. While we have a step-by-step guide for our Holdr+ app (our version of a digital wallet) and other customer-facing products, we recognize that users are busy or simply don’t want to read an exhaustive manual (only 25% of people read manuals for consumer products); the shortest learning curve is the path to success.

If you’re building a decentralized identity solution and would like to review your UI/UX or discuss best practices, Indicio is happy to help. You can get in contact with our team here. And if you are just looking for a plug-and-play solution we can easily adjust Holdr+ or Proven to fit your needs.

To learn more about Indicio’s approach to UI/UX you can watch a more in-depth interview with Scott on Identity Insights.



SC Media - Identity and Access

Guardrails on AI tools like ChatGPT needed to protect secrets, CISOs say

Identiverse panelists offer tips for developing policies around how employees can safely leverage artificial intelligence tools like ChatGPT.



FindBiometrics

Missouri Launches Mobile ID with Remote Driver’s License Renewal Feature

Missouri has become the fifth state to launch a mobile ID based on technology from IDEMIA — and the first to let end users remotely renew their driver’s license. The […]

Kenya’s Leadership Lays Out Digital ID Roadmap

With this year’s ID4Africa AGM event having taken place in Kenya, the digital identity-focused event appears to have prompted a slew of comments from the country’s leadership concerning Kenya’s own […]

Indicio

Newsletter Vol 54

The post Newsletter Vol 54 appeared first on Indicio.

Now on LinkedIn! Subscribe here

Polly Wants Self-Sovereign Identity — Taking Control of Your Digital Identity w/ Indicio

In this 60-minute video, Indicio CEO Heather Dahl joins the This Week in Enterprise Tech podcast to share her insights into everything decentralized identity — from why it is a foundation for creating verifiable data to how it can be accomplished.

Watch the interview

Identity Insights — The Importance of UI/UX with Scott Harris

In this episode, we are joined by Indicio VP of Operations Scott Harris to discuss the importance of UI/UX to developing successful verifiable credential solutions and driving adoption.

Watch the interview

Digitalizing Vital Records: Do’s and Don’ts

Vital records — such as your birth, death, marriage, and divorce certificates — are going digital. The only question is how. In this article, we explain why verifiable credential technology offers the best solution.

Read more

Data Security with ChatGPT

Indicio VP of Communications and Governance Trevor Butterworth was featured on a recent PhocusWire event to discuss the impact of generative AI on the travel sector and how best to meet data privacy and security challenges.

Read the full article

Want to see more weekly videos? Subscribe to the Indicio YouTube channel!

News from around the community:

IdRamp Orchestration Fabric now available in the MS Azure Marketplace

Anonyome shares seven Insights Into Progress on the Path to Decentralized Identity

Upcoming Events

 

Here are a few events in the decentralized identity space to look out for.

- Identity Implementors Working Group 6/1
- DIF DIDcomm Working Group 6/5
- Aries Bifold User Group 6/6
- TOIP Working Group 6/6
- Hyperledger Aries Working Group 6/7
- Cardea Community Meeting 6/8



UbiSecure

The Monetary Union Deepens as Remote Identity Proofing Becomes Reality

The post The Monetary Union Deepens as Remote Identity Proofing Becomes Reality appeared first on Ubisecure Customer Identity Management.

Finance is one of the most tightly regulated sectors in the Western world, and not without reason. It forms the foundation on which modern welfare states are built. Historically, it has been one of the most fragmented sectors within the European Union, as each of the member states had spent centuries co-evolving with their local finance providers. Each has adapted to local needs and has been shaped by local histories.

European Monetary Union

Slowly but surely, the European Commission is seeking to accomplish a true monetary union within the member states, mimicking the formation of the largest contemporary federal republic – the United States of America. The process has been far from a walk in the park, and there is still considerable work required before the “United States of Europe” becomes a sovereign superstate.

Adopting the Euro as a common currency was a major accomplishment towards the goal. Another important aspect, hidden from plain sight, is unifying the regulation between the member states. For the traditional banking sector, one of the most important regulatory frameworks is called the Basel Framework. Now in the third iteration of the framework, it is often referred to as Basel III. Banks are currently implementing the latest updates after the previous iterations, Basel I and Basel II.

Following the delay caused by the COVID-19 pandemic, the European Commission has published its long-awaited proposal on implementing the final Basel III reforms. These reforms were required to be implemented in European law by January 1st, 2023.

What is the Goal of the Basel Framework?

The end goal has been clearly stated since the Basel I agreement:

“Taking a major step towards European unification by completing an EU-wide finance union and trying to prevent a continent-wide economic collapse by strengthening bank capital requirements, increasing minimum capital requirements, mandating the holdings of high-quality liquid assets, and decreasing bank leverage dynamically.”

This has proved to be controversial from the start. To address the presented concerns, the Commission is expected to launch a public consultation on the review of the Bank Recovery and Resolution Directive (BRRD), and the Deposit Guarantee Schemes Directive (DGSD). The implementation of the Basel III reforms is another major milestone towards the goals of the Commission.

Ongoing Issues in Implementing the Basel Framework

The impact of the COVID-19 pandemic, and of the persistent inflation stemming from the extremely stimulatory monetary policy practiced by the European Central Bank (ECB) since the 2008 financial crisis, is far from over. It is likely to fuel arguments in favour of allowing European law to diverge further from the Basel III standard. For example, Nordic banks are currently operating at noticeably lower risk levels than major French and south European banks.

The Nordic banks have more of their assets tied to mortgages that are already stress-tested to 6% Euribor rates, instead of holding large quantities of sovereign debt from highly indebted southern European countries. In addition, the mortgages are typically valued at 70% of their face value, compared to 100% of the face value of sovereign debt. This is because the current regulation allows the assumption that a sovereign state cannot become insolvent – until, of course, one does, as Sri Lanka did recently. EU-wide stress tests clearly reflect the difference in resilience.

If this difference is not reflected in the regulatory environment, then banks that have historically operated at higher risk than their competitors – by, for example, not clearly diversifying their assets across asset classes – could gain an unfair advantage. We will see whether the inevitable Basel IV will address these concerns.
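To make the effect of those valuation haircuts concrete, here is a minimal Python sketch of how the same nominal exposure is counted differently under the 70% and 100% weights described above. The portfolio figures are invented for illustration.

```python
# Sketch: applying the valuation weights mentioned above to two hypothetical
# portfolios of equal face value. All figures are illustrative only.
def regulatory_value(face_value: float, weight: float) -> float:
    """Value an asset class at a fraction of its face value."""
    return face_value * weight

FACE = 100_000_000.0  # hypothetical face value of each portfolio, in euros

mortgage_value = regulatory_value(FACE, 0.70)   # mortgages counted at 70% of face value
sovereign_value = regulatory_value(FACE, 1.00)  # sovereign debt counted at 100%

# The same nominal exposure looks 30% smaller on the books when held as mortgages.
print(mortgage_value, sovereign_value)
```

Under these assumed weights, a mortgage-heavy balance sheet reports a smaller measured asset base than an identical sovereign-debt-heavy one, which is part of why the stress-test results diverge.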

For private investors, not all of the changes have been positive. For example, artificially limiting access to non-EU markets and financial instruments helps to keep the fees of European brokers at much higher levels than those of their American and Asian counterparts. In addition, some EU countries continue to blatantly ignore both European law and national one-to-one agreements on taxation, instead pushing the entire burden of correcting double taxation onto private investors themselves. They are now faced with complex legal procedures in foreign languages with little to no help from the national authorities – who were supposed to handle everything automatically years ago.

Remote Identity Proofing

Many could therefore argue that the changes have been without a net benefit so far, but there is one positive change on the horizon: allowing one to create a strong digital identity via a remote connection, without already possessing a strong electronic ID (eID). This is called remote identity proofing, and it could dramatically change the fintech landscape throughout the whole European Union by allowing its more than 300 million residents to use financial services in all 27 member states without having to travel physically.

The identity verification process is necessary to fulfil the legal requirements of Know Your Customer (KYC) and Anti-Money Laundering (AML) legislation. Despite a common misunderstanding, identity verification does not, by itself, establish any kind of trustworthy relationship between the end user and the organisation – unlike the verifiable LEI, for example, which does.

How Remote Identity Proofing Works

The verification would traditionally be carried out in person, by verifying a physical identity document and comparing the photograph of the holder of the document to their face in real life. For remote identity proofing, the European Telecommunications Standards Institute (ETSI), an independent organisation that standardises norms at the European level, has created a standard framework for remote identity proofing.

The standard recognises identification through video in a live stream, with the entire process recorded, as the only way for remote identity proofing to reach the same level of legal compliance as physical face-to-face verification. This repeats what is already stipulated in the eIDAS regulation for the European Digital Identity Wallet (EUDI, often referred to as EU ID or EUID). This is in contrast to using still photographs or selfies, which have high fraud rates and are therefore not eIDAS compliant.

The first national launches of the EUDI are scheduled for 2024, and a continent-wide rollout will hopefully follow soon after. That will bring strong individual identities to the masses; after that, it will be time for the other half: strong organisational identities. Discover more about digital identities in finance or strong organisational identities below:

Components of the Digitalisation Journey of International Trade and Finance
Realising the Vision for the Future of Digital Identity
Organisation Identity, Legal Intent, and the Power of Frameworks
The Importance of LEI in Global Trade and Supply Chains
Comparing Organisation Identifiers

The post The Monetary Union Deepens as Remote Identity Proofing Becomes Reality appeared first on Ubisecure Customer Identity Management.


Entrust

Entrust Digital Card Solution launches new In-app Provisioning extension for Apple Pay

Consumer demand for intuitive digital services has changed the way we approach everything, including banking.... The post Entrust Digital Card Solution launches new In-app Provisioning extension for Apple Pay appeared first on Entrust Blog.

Consumer demand for intuitive digital services has changed the way we approach everything, including banking. We are moving from a primarily in-person service model to an anytime, anywhere model where consumers can manage their accounts and cards on their own conveniently on their mobile banking app or online portal. This digital transformation is a journey that requires making decisions that not only provide consumers with the convenience and flexibility of self-service capabilities, but also time to value for the financial institution.

One digital tool gaining in popularity among consumers is digital wallets. In fact, digital wallets made up 48.6% of 2021’s ecommerce transaction value. Banks that want to compete and satisfy consumer expectations must better support these popular digital wallets. Yet, as convenient as these apps are, enrolling a payment card into the wallet can be frustrating if not properly supported. Currently, cardholders can either enroll their card in Apple Pay manually or via bank-enabled push provisioning. Manual entry of card details for wallet enrollment can be a time-consuming and complicated process. Push provisioning can be done right within the banking app with the click of a button; however, not all banks have enabled this process. To offer an additional enrollment option for cardholders, Apple has now issued a mandate requiring participating banks to offer simpler options for provisioning cards to the wallet.

In response to this mandate, Entrust has launched a new in-app provisioning extension to the Digital Card Solution, from within the Apple Pay app. Coupled with the in-app push provisioning feature Entrust already offers, banks are now enabled to offer two ways of automatically enrolling cardholder cards into Apple Pay:

from the banking app
from the Apple Pay app

How the Entrust Digital Card Solution in-app provisioning extension works

If a bank enables the Entrust Digital Card Solution in-app provisioning extension, the cardholder will follow the first steps of manual card entry in Apple Pay, but will benefit from a more seamless experience:

1. Within the Apple Pay app, the cardholder taps the “+” icon in the top right corner.
2. On the “Add to Wallet” screen, they select the section called “From Apps on Your iPhone” and choose their bank.
3. The cardholder then authenticates to their banking app.
4. They select the card they want to enroll and confirm enrollment.
5. The cardholder accepts the terms and conditions.

After this short process, their payment card is enrolled and ready to use.

With this new feature within the Apple Pay app, the cardholder does not have to manually enroll the card by entering sensitive information like the card number (PAN). Instead, the cardholder will see their bank card already proposed for enrollment. This occurs because Apple Pay pulls information from the banking apps installed on the phone to show the cards eligible for provisioning into the wallet. Cardholders also benefit from full tokenization and security, thanks to the authentication process that has already taken place in the bank app.


The experience for the cardholder is similar to enrollment from within the banking app, but is initiated directly in the Apple Pay wallet.

And the benefit to banks? It allows them to promote enrollment in Apple Pay even further. The bank also gains additional visibility within the Apple Pay app. Ultimately, adding this enrollment option will drive higher usage and help banks reach and/or maintain top-of-wallet status.

Existing Entrust Digital Card Solution customers can easily add the new in-app provisioning extension to their offering, as this feature can be enabled thanks to the already implemented Entrust SDK.

Existing customers can reach out to their account executive to learn more. Those who are interested in this new feature, and push provisioning to Apple Pay in general, can contact us here.

The post Entrust Digital Card Solution launches new In-app Provisioning extension for Apple Pay appeared first on Entrust Blog.


IBM Blockchain

Preventive maintenance vs. predictive maintenance

Exploring some of the most commonly used proactive maintenance approaches. The post Preventive maintenance vs. predictive maintenance appeared first on IBM Blog.

Your maintenance strategy may not be the first thing that springs to mind when thinking about the bottom line. Yet, given that machinery, equipment and systems keep businesses running, maintenance strategies have a major role to play. Without due care and attention, things break—regardless of whether that’s a transformer in an electricity grid, an axle bearing on a train or a refrigerator in a restaurant.

When assets malfunction or aren’t performing optimally, there can be safety issues and financial implications – the average manufacturer reportedly loses about 800 hours a year in downtime. Add to that aging infrastructures, workforce retention, budget constraints and sustainability pressures, and it’s easy to see why businesses need to find ever better ways to keep assets in good operating condition.

Understanding and planning for when your equipment is likely to fail can drive greater efficiency in production operations, but how do you decide which strategy is the most cost-effective one for you? The decision isn’t simple. Multiple factors must be considered, such as your industry, the type and usage of the asset, how expensive it is to replace, how much of the right kind of data you have, and how much impact failure would have on your business and customers. There is no one-size-fits-all solution, and most companies opt for a combination of different maintenance strategies across their asset portfolios.

Reactive, preventive and predictive maintenance

Reactive, preventive and predictive maintenance strategies are the most commonly used maintenance approaches. Reactive maintenance (also called corrective maintenance) is exactly that—reacting to breakdowns when they occur. It is suited to low-cost, non-critical assets that don’t pose safety or operational risks if a run-to-failure strategy is deployed.

Preventive and predictive maintenance are proactive maintenance strategies that use connectivity and data to help engineers and planners fix things before they break. Predictive strategies take this even further, using advanced data techniques to forecast when things are likely to go wrong in the future. Both strategies aim to reduce the risk of catastrophic or costly problems.

Let’s take a deeper look at these proactive approaches.

What is preventive maintenance?

Preventive maintenance uses regular maintenance plans to reduce the chances of an asset breaking down by carrying out routine maintenance tasks at regular intervals. Downtime is planned using best practices and historical averages, such as mean time between failures (MTBF). Preventive maintenance strategies have been around since about 1900 and have been widely used since the late 1950s.
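As a rough sketch of how such a historical average might translate into a schedule, consider the following; the failure history and the safety factor are invented for illustration and are not a standard formula:

```python
# Hypothetical sketch: deriving a time-based preventive-maintenance
# interval from historical failure data via mean time between
# failures (MTBF). All operating hours below are invented.

def mtbf(uptimes_between_failures: list[float]) -> float:
    """Mean time between failures, in the same unit as the inputs."""
    return sum(uptimes_between_failures) / len(uptimes_between_failures)

def maintenance_interval(mtbf_hours: float, safety_factor: float = 0.5) -> float:
    """Schedule maintenance well before the average failure point."""
    return mtbf_hours * safety_factor

history = [900.0, 1100.0, 1000.0]   # hours of uptime before each failure
avg = mtbf(history)                  # 1000.0
print(maintenance_interval(avg))     # 500.0
```

Tightening or loosening the safety factor is exactly the under- vs. over-maintenance trade-off discussed later: a smaller factor wastes maintenance effort, a larger one risks failures before the scheduled service.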

Three major types of preventive maintenance have developed. All involve carrying out maintenance on a regular basis, but they are scheduled differently and tailored to different business operation purposes.

Usage-based preventive maintenance schedules base future maintenance and inspections on asset usage, like changing your car tires after 50,000 miles.
Calendar- or time-based preventive maintenance sets specific time intervals for maintenance, such as having your home furnace serviced annually.
Condition-based maintenance creates schedules based on factors like asset wear and degradation.

In all types of preventive maintenance, machine downtime is planned in advance, and technicians use checklists for checkups, repair, cleaning, adjustments, replacements and other maintenance activities.

What is predictive maintenance?

Predictive maintenance builds on condition-based monitoring by continuously assessing an asset’s condition. Sensors collect data in real time, which is fed into AI-enabled enterprise asset management (EAM), computerized maintenance management systems (CMMS) and other maintenance software. Within this software, advanced data analysis tools and processes like machine learning (ML) can identify, detect and address issues as they occur. Algorithms are also used to build models that predict when potential problems may arise, mitigating the risk of the asset breaking down further down the line. This can result in lower maintenance costs, a reduction of some 35-50% in downtime and a 20-40% increase in lifespan.

Various condition monitoring techniques are used to identify asset anomalies and provide advance warnings of potential problems, including sound (ultrasonic acoustics), temperature (thermal), lubrication (oil, fluids), vibration analysis and motor circuit analysis. A rise in temperature in a component, for example, could indicate a blockage in airflow or coolant; unusual vibrations could indicate misalignment of moving parts or wear and tear; changes in the sound can provide early warnings of defects that can’t be picked up by the human ear.
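To illustrate the kind of check such condition monitoring performs, here is a minimal sketch that flags readings deviating sharply from a rolling baseline. The window size, threshold and vibration values are all invented; real systems use far more sophisticated models:

```python
# Hypothetical sketch of condition-based anomaly detection: flag any
# sensor reading that lies far outside the statistical baseline of the
# preceding readings. Values and thresholds are invented.
from statistics import mean, stdev

def anomalies(readings: list[float], window: int = 5, z_limit: float = 3.0) -> list[int]:
    """Return indices of readings more than z_limit standard deviations
    from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_limit:
            flagged.append(i)
    return flagged

# Steady vibration amplitude with one sudden spike at index 7.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 0.98, 5.0, 1.0]
print(anomalies(vibration))  # [7]
```

In a production setting the flagged index would typically trigger a corrective work order in the CMMS rather than just a printout.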

The oil and gas industry was a pioneering adopter of predictive maintenance as a way to lower the risk of environmental disasters, and other industries are also increasingly seeing the benefits. In the food and beverage industry, for example, undetected food storage issues could have major health consequences, and in shipping, anticipating and preventing equipment failures reduces the number of repairs that have to be made at sea, where it is harder and more expensive than in port.

What’s the difference between predictive and preventive maintenance?

Both types of maintenance strategies increase uptime and reduce unplanned downtime, improving the reliability and lifecycle of assets. The main differences are in timing and the ability to predict the future likely condition of an asset.

Preventive maintenance programs use historical data to anticipate the expected condition of an asset, and they schedule routine maintenance tasks at regular intervals in advance. While this is good for planning, assets may be under- or over-maintained, given that the vast majority of asset failures are unexpected. A problem might be diagnosed too late to prevent damage to an asset, for example, which will likely mean longer downtime while it’s fixed, or time and money may be spent when there’s no need.

Predictive maintenance avoids unnecessary maintenance by understanding the actual condition of the equipment. This means it can flag and fix problems earlier than preventive maintenance can, and prevent more serious issues from developing.

Predictive maintenance leverages new technologies like artificial intelligence, machine learning and the Internet of Things (IoT) to generate insights. Maintenance management systems and software automatically create corrective maintenance work orders, enabling maintenance teams, data scientists and other employees to make smarter, faster and more financially sound decisions.

Inventory management workflows, such as labor and spare-parts supply chains, become more efficient and sustainable by minimizing energy usage and waste. Predictive maintenance can also feed data into other maintenance practices based on real-time analytics, like digital twins, which can be used to model scenarios and other maintenance options with no risk to production.

There are obstacles to overcome for predictive maintenance to be effective or even possible, such as complexity, training and data. Predictive maintenance requires a modern data and systems infrastructure that may make it costly to set up when compared with preventive maintenance. Training the workforce to use the new tools and processes and correctly interpret data can be expensive and time-consuming. Predictive maintenance also relies on the collection of substantial volumes of specific data. And lastly, implementing a predictive maintenance strategy requires a cultural change to accommodate the shift from predetermined to more flexible daily operations, which can be challenging.

In summary, although preventive and predictive maintenance strategies both focus on increasing asset reliability and reducing the risk of failures, they are very different. Preventive maintenance is regular and routine, whereas predictive maintenance focuses on providing the right information about specific assets at the right time. Preventive maintenance is suited to assets where failure patterns are predictable (e.g., recurring or frequent problems) and the impact of failure is comparatively low, whereas predictive maintenance may be more advantageous for strategic assets where failure is less predictable and the business impact of failures is high. Ultimately, if predictive maintenance strategies are successfully deployed and run, they will result in happier customers and substantial cost savings through optimized maintenance and asset performance.

Transform your maintenance management with IBM Maximo® Application Suite

The good news is IBM can help. IBM Maximo Application Suite is a set of applications that enables you to move maintenance planning beyond time schedules to condition-based predictive maintenance based on asset health insights.

Combining operational data, IoT, AI and analytics in a single, integrated cloud-based platform, Maximo will drive smarter, data-driven decisions that improve asset reliability, lengthen asset lifecycles, optimize performance and reduce operational downtime and costs.

The post Preventive maintenance vs. predictive maintenance appeared first on IBM Blog.


Connected products at the edge

A look at the frequently overlooked phenomenon of connected products and how enterprises are using them to their advantage. The post Connected products at the edge appeared first on IBM Blog.

There are many overlapping business usage scenarios involving both the disciplines of the Internet of Things (IoT) and edge computing. But there is one very practical and promising use case that has been commonly deployed without many people thinking about it: connected products. This use case involves devices and equipment embedded with sensors, software and connectivity that exchange data with other products, operators or environments in real-time.

In this blog post, we will look at the frequently overlooked phenomenon of connected products and how enterprises are using them to their advantage. This is especially true in manufacturing and industrial engineering. From strategy to design, development and deployment, there is a lot of thought that goes into connecting physical products. While we examine this from the perspective of edge computing, it also has major implications for Industry 4.0.

We assume readers are familiar with Industry 4.0, which involves the integration of advanced digital technologies and IoT into manufacturing processes and connected devices that transmit and receive instructions and data. This allows for greater automation and optimization of production processes, leading to increased efficiency, productivity and flexibility in manufacturing. For more information about the concept, see the link below.

Learn more about Industry 4.0

Please make sure to check out all the installments in this series of blog posts on edge computing:

Part 1: “Cloud at the Edge”
Part 2: “Rounding out the Edges”
Part 3: “Architecting at the Edge”
Part 4: “DevOps at the Edge”
Part 5: “Policies at the Edge”
Part 6: “Models Deployed at the Edge”
Part 7: “Security at the Edge”
Part 8: “Analytics at the Edge”
Part 9: “5G at the Edge”
Part 10: “Clusters at the Edge”
Part 11: “Automation at the Edge”
Part 12: “Network Slicing at the Edge”
Part 13: “Data at the Edge”
Part 14: “Architectural Decisions at the Edge”
Part 15: “GitOps at the Edge”
Part 17: “Storage Services at the Edge”
Part 18: “Cloud Services at the Edge”
Part 19: “Distributed Cloud: Empowerment at the Edge”
Part 20: “Data Sovereignty at the Edge”
Part 21: “Solutioning at the Edge”

Elements of connected products

There are three core elements when it comes to connected products:

Physical components (industrial, mechanical, automotive, home appliances and electrical parts)
Smart components (phones, tablets, sensors, microprocessors and analytics)
Connectivity components (antennae, ports, protocols and networks that can send data to the cloud)

Figure 1. Core elements of connected products.

As we see in Figure 1, data is the lifeblood of connected products. That said, why are connected products becoming vital to enterprises? With connected products, customers expect to not only buy the latest and greatest cutting-edge devices, but they also expect these devices to continuously work, improve and be updated with newer features over their lifetimes. Over-the-air (OTA) updates of Tesla cars or Apple products are good examples of delivering new software, firmware, features, safeguards, etc. to connected products.

Connected products and services

There is more to connected products than just over-the-air updates. Connected products can enhance support by monitoring and optimizing usage. In industrial settings, they provide the health status of machines and can predict/prevent failures or downtime by anticipating service needs. Robots on the manufacturing floor are programmed to be aware of and work with other robots. Fueled by data, a product can be connected interactively with a broader ecosystem, offering an enhanced customer experience (CX), optimized product performance and services, and an agile supply chain that can deliver new sources of value to the customer.

Figure 2. Connected products and services across a manufacturing enterprise.

Companies have connected products and services that span business and technology strategy, connectivity enablement, intelligent edge connectivity and computing, and more. Figure 2 provides IBM’s view of components that will provide a competitive advantage for today’s manufacturing enterprises. In the Connected Products and Services domain, we see the three main aspects:

Intelligent Automation of IT and Business Operations
Zero-Trust for Manufacturing and Connected Products
Efficiently Manage Engineering Data

IBM also has an offering that includes IBM Cognitive Assistant for connected products, which leverages data and learns from customer interactions across all aspects of the connected-product lifecycle.

With connectivity comes security concerns as products become more and more software-defined and data-driven. Connected products are built on multi-faceted platforms running on multiple operating environments that include the combination of custom compute platforms, traditional compute platforms and cloud. It is paramount to transform current security approaches from point solutions to an end-to-end, cross-platform solution that protects connected products and related services.

Connected products vs. edge computing

Many of the features that we just alluded to are also seen in edge computing solutions.

One might argue that connected products are just a manifestation of an edge computing use case specifically related to the domain of customer experience (CX). The difference lies in the definition of edge computing, which states that data is analyzed at the source where it is generated. Connected products, on the other hand, are driven by responses received after sending the data to the cloud.

In some cases, smart connected products can operate independently. If a smart robotic vacuum cleaner can return to its docking station, is it far-fetched to envision autonomous vehicles driving themselves to a charging station? Vehicles are evolving into connected products with connections to smartphones, other vehicles and surrounding infrastructure via advanced sensor technologies with both remote and on-board processing capabilities. Such capabilities require frequent OTA updates from the manufacturer. We can make the case that edge computing and the supporting ecosystem are paving the way for autonomous vehicles to become the ultimate mobile edge devices and prototypical connected products.

Figure 3. Connected products/edge.

As we have described in previous blogs in this series, IBM Edge Application Manager (IEAM) is best suited to deploy and manage applications on edge devices and far edge devices.

Data as the currency of connected products

One of the past blogs in this series—“Data at the Edge”—talked about handling all the data that is generated at the edge. Those same requirements of data compliance, data privacy, data sovereignty, data governance and data residency are just as relevant with connected products.

Connected products send and receive a lot of data to and from the cloud. There are laws dictating the collection and storage of all this data. When adding connectivity to products, designers and manufacturers should understand and take steps to mitigate the threat radius. The IEC 62443 and UL 2900 families of standards apply to connected products used in home or commercial settings, medical devices, and security and life safety systems. The State of California, for example, has a law called the “Teddy Bear and Toaster Act” that purports to provide increased security to avoid malware attacks and protect consumers who use connected devices.

While connected products vis-à-vis IoT and edge generate a lot of data, neither offers a data plane. To that end, the IBM Cloud Pak for Data offers many options to handle and store data for operational purposes, analytics and auditability. We envision it being a part of the connected-products solution.

Wrapping up

From a customer experience perspective, we see connected products having a major impact in automotive, medical technology, consumer products, Industry 4.0, and energy and utilities. To harness the full potential of connected products and services, organizations must put customer experience front and center while also ensuring data and network security.

We view connected products as devices operating within, and helping humans in, an edge computing domain, with data being the lifeblood of such solutions. Enterprises can offer many personalized after-market services via connected products and expand industry boundaries.

Let us know what you think.

Thanks to Joe Pearson and Charla Stracener for reviewing the article and providing their thoughts.

Please make sure to check out all the installments in this series of blog posts on edge computing:

Part 1: “Cloud at the Edge”
Part 2: “Rounding out the Edges”
Part 3: “Architecting at the Edge”
Part 4: “DevOps at the Edge”
Part 5: “Policies at the Edge”
Part 6: “Models Deployed at the Edge”
Part 7: “Security at the Edge”
Part 8: “Analytics at the Edge”
Part 9: “5G at the Edge”
Part 10: “Clusters at the Edge”
Part 11: “Automation at the Edge”
Part 12: “Network Slicing at the Edge”
Part 13: “Data at the Edge”
Part 14: “Architectural Decisions at the Edge”
Part 15: “GitOps at the Edge”
Part 17: “Storage Services at the Edge”
Part 18: “Cloud Services at the Edge”
Part 19: “Distributed Cloud: Empowerment at the Edge”
Part 20: “Data Sovereignty at the Edge”
Part 21: “Solutioning at the Edge”

Learn more

IBM Edge Application Manager
IBM Cloud Pak for Data
IBM Cloud Pak for Network Automation
Red Hat Edge

Related articles

“How Smart, Connected Products Are Transforming Competition”
“Cybersecurity for Connected Products: Part 2”

The post Connected products at the edge appeared first on IBM Blog.


IDnow

IDnow joins the ICAO Public Key Directory (PKD) pilot

Collaboration increases IDnow’s document checking capabilities in the air travel industry Munich, May 31, 2023 – IDnow, a leading identity proofing platform provider in Europe, announces its participation in the International Civil Aviation Organization (ICAO) Public Key Directory (PKD) pilot program to further enhance its document checking capabilities in commercial air travel.
Collaboration increases IDnow’s document checking capabilities in the air travel industry

Munich, May 31, 2023 – IDnow, a leading identity proofing platform provider in Europe, announces its participation in the International Civil Aviation Organization (ICAO) Public Key Directory (PKD) pilot program to further enhance its document checking capabilities in commercial air travel. First established in 2007, the ICAO PKD is a central repository for exchanging the information required to authenticate electronic Machine Readable Travel Documents (eMRTDs), such as ePassports and electronic ID cards.

Today covering 90 issuing authorities worldwide, the ICAO PKD offers an effective means for countries to upload their own technical data related to ePassports and electronic ID cards and download that of other issuing authorities, thus eliminating the need for bilateral exchange of this data. Essentially, the ICAO PKD takes on the role of a central broker for this technical information while also helping to ensure that the information adheres to the necessary technical standards (i.e., ICAO 9303 specifications) to maintain interoperability.

Security is a top priority in air travel

By joining the time-limited ICAO PKD pilot, IDnow has received temporary permission to read out the data stored in the NFC (Near-field communication) chip of ePassports and electronic ID cards and to use this data in their commercial solutions in the air travel domain.

“Security and facilitation are a top priority in air travel, and identity verification plays a crucial role in any travel security equation nowadays,” says Lovro Persen, Director Document and Fraud at IDnow. “We are therefore very pleased to participate in the ICAO PKD pilot. Being part of this program allows us to verify the authenticity of an electronic ID document and its rightful origin. In addition to our advanced in-house fraud-fighting measures, we are thus able to verify that the data on the document’s chip is authentic and that the issuing authority is the one it claims to be. Furthermore, we can extract the embedded portrait photo of the document holder from the NFC chip and use it for biometric verification, knowing that it has not been tampered with,” he adds.
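As a simplified, illustrative sketch of one part of this chip-data verification (not IDnow’s actual implementation): so-called passive authentication compares the hash of each data group read from the chip against the hash recorded in the signed Document Security Object (SOD). Real verification also checks the SOD signature against the document signer certificate, which chains to a country certificate distributed via the ICAO PKD. All bytes below are invented.

```python
# Illustrative sketch of the hash-comparison step of eMRTD passive
# authentication. A data group (e.g. DG1, the MRZ) read from the NFC
# chip must hash to the value listed in the signed Security Object.
import hashlib

def data_group_matches(dg_bytes: bytes, expected_hash_hex: str) -> bool:
    """Compare the SHA-256 hash of a data group with the SOD entry."""
    return hashlib.sha256(dg_bytes).hexdigest() == expected_hash_hex

dg1 = b"P<UTOERIKSSON<<ANNA<MARIA<<<"           # example MRZ bytes (invented)
sod_entry = hashlib.sha256(dg1).hexdigest()     # hash as listed in the SOD
print(data_group_matches(dg1, sod_entry))       # True
print(data_group_matches(b"tampered", sod_entry))  # False
```

Because the SOD itself is signed, a verifier that trusts the issuing country’s certificate (obtained via the PKD) can detect both tampered data groups and documents from unrecognised issuers.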

Wednesday, 31. May 2023

IBM Blockchain

Join us at PrestoCon Day, a free virtual community event

Learn how companies are using Presto to power their data lakehouses at scale by attending the free virtual PrestoCon Day conference. The post Join us at PrestoCon Day, a free virtual community event appeared first on IBM Blog.

The Presto Foundation is excited to share its upcoming virtual community conference PrestoCon Day, taking place on 7 June 2023.

Register for the free, virtual event

What is Presto?

Presto is an open-source, fast and reliable SQL query engine that provides one simple ANSI SQL interface for all your data analytics and your open lakehouse. Some of the biggest companies in the world are contributing to the Presto open-source project, including Meta, Uber and Intel.

Why Presto and IBM?

IBM recently made an exciting announcement in AI: watsonx, an enterprise-ready AI and data platform designed to multiply the impact of AI across your business, with three core capabilities:

watsonx.ai — to train, validate, and tune AI models
watsonx.data — to scale AI workloads for all your data, anywhere
watsonx.governance — to enable responsible and transparent AI

Built on an open lakehouse architecture, watsonx.data is the open, hybrid and governed fit-for-purpose data store optimized to scale all data, analytics and AI workloads. Watsonx.data is designed with multiple open-source query engines, including Presto and Spark, that can optimize workload costs and performance at scale. Watsonx.data will incorporate the latest performance enhancements to the Presto query engine and continue to optimize the engine through IBM’s recent acquisition of Ahana, the only SaaS for Presto and a strong contributor to the Presto open-source community.

IBM has a significant history contributing to open-source capabilities. The company was one of the earliest champions of open source, partnering with organizations like Linux, Apache, and Eclipse, pushing for open licenses, open governance and open standards. Additionally, Intel is a founding member of the PrestoDB foundation, and has collaborated with IBM to deliver rapid and reliable data processing to watsonx.data, which is also engineered to use Intel’s built-in accelerators on Intel’s new 4th Gen Xeon Scalable Processors.

What is PrestoCon Day?

PrestoCon Day brings together Presto users and developers from organizations around the globe. The virtual show will feature many marquee users sharing how they use Presto to power their data analytics and lakehouses. Session highlights include:

Presto at Adobe: How Adobe Advertising uses Presto for Adhoc Query, Custom Reporting, and Internal Pipelines
How Bolt, the “Uber” of Europe, implemented a Lakehouse architecture with Presto
Presto query performance optimization for Alibaba Cloud’s log analytics service
How HPE Ezmeral uses Presto for unified analytics
Driving innovation with the power of open source and open governance with Presto Foundation Chair Girish Baliga of Uber and Vikram Murali, VP of Hybrid Data Management at IBM

The full agenda is available here. Anyone who wants to learn how some of the world’s largest companies are using Presto to power their data lakehouses at scale is encouraged to attend. At PrestoCon Day, you’ll get to see some of the new features coming to Presto, IBM watsonx.data, as well as learn more about common use cases and best practices.

Register for PrestoCon Day

The post Join us at PrestoCon Day, a free virtual community event appeared first on IBM Blog.


Identosphere Identity Highlights

Identosphere 135: European Pilots Updates • Wallets can't be Transformers • Unique Short History of the LEI

Your weekly guide to the latest news, events, and related info surrounding the development and implementation of decentralized identity and verifiable credentials
Identosphere’s Weekly Highlights We Gather, You Read! We’ll keep aggregating industry info. Show support by PayPal, or Patreon! Upcoming

[Finland] MyData 2023 (the most impactful event for personal data) 5/31-6/1

[Las Vegas] Identiverse 5/31-6/1 ←Kaliya is there in person, reach out.

[online] Hyperledger AnonCreds Workshop: Using ZKP Verifiable Credentials Everywhere 05-31

[online] did:hack - decentralized identity hackathon for people to learn, collaborate, and build a project centered around decentralized identity. 6/5-8

[Zurich] Digital Identity unConference Europe 6/7-9 ← Kaliya co-facilitating.

[Virtual \ Paris] Web3 & Digital Identity: a human-centric technology 06-14 

[NYC] Velocity Network Foundation® 2023 General Assembly 6/19-20 

[Amsterdam] The Closing Conference of the Blockchain & Society Policy Research Lab [Call for Papers] 7/3-4

[New Zealand] Digital Trust Hui Taumata: Registrations now open! 8/1

Industry Commentary No, Wallets Can’t be the Adapters Between Credential Formats 2023-05-26 IdentityWoman

Northern Block published a post last week written by their CEO Mathieu Glaude, putting out a call to action to figure out how we can have a “Digital Universal Credential [sic] Adaptor”. 

It is not possible: deep properties of how the cryptographic technology works, and of how technical trust is confirmed, do not allow the wallet to be a “transformer”.

New Podcast Making Data Better Steve Lockstep

Data quality is fundamental to changing the Internet's economic model from the coin of our attention to something better

Development Finding SDKs for #verifiablecredentials has been a challenge Waylon Kenning

though there are good ones built right here in NZ with MATTR and now some from Microsoft too. Shows some real product maturing occurring.

Reduce fraud and improve engagement using Digital Wallets Microsoft

Decentralized Identity on Hedera Hedera

Applications issuing credentials to an end user, as well as every event in the credential’s lifecycle, are recorded on Hedera. When a credential is presented to an application or business, supporting information is securely retrieved to either validate or lookup related identity information.

IoT Swarms and SSI for Constrained Networks Geovane Fedrecheski

Swarms of IoT devices

SSI for IoT makes sense, but has limitations

Reducing overhead of DIDComm and DID Documents

Standards Securing Verifiable Credentials using JSON Web Tokens 2023-05-25, W3C

This specification defines how to secure Verifiable Credentials with JSON Web Tokens (JWT) [RFC7519], which build on JSON Web Signatures (JWS) [RFC7515]. This enables Verifiable Credentials to be easily integrated into ecosystems that already support JSON Web Tokens.       
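As a rough illustration of the JWT-securing mechanism the spec describes, here is a minimal stdlib sketch that encodes a credential as a signed compact JWT. It uses symmetric HS256 for brevity (real VC-JWTs typically use asymmetric algorithms such as ES256), and `sign_vc_as_jwt` and its claim mapping are hypothetical simplifications, not the specification's exact rules.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # Base64url without padding, as JWS compact serialization requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_vc_as_jwt(vc: dict, key: bytes) -> str:
    """Wrap a credential dict in a signed JWT (hypothetical, simplified claim mapping)."""
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {"iss": vc["issuer"], "vc": vc}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"
```

Any JWT-aware system can then verify the token by recomputing the signature over the first two segments, which is what makes this integration path attractive for existing ecosystems.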

Verifiable Credentials JSON Schema Specification 2023 2023-05-24 W3C

Among other things, the [VC-DATA-MODEL-2] specifies the models used for Verifiable Credentials, Verifiable Presentations, and explains the relationships between three parties: issuers, holders, and verifiers.

Explainer

[explainer] Open Standards: The Building Blocks of Open Innovation 2023-05-24, Oasis Open, Francis Beland

Decentralized Identifiers (DIDs) 2023-05-24, WaltID - Today we explore decentralised identifiers (DIDs), self-sovereign, globally unique digital identifiers for individuals, companies, and devices.

7 Insights Into Progress on the Path to Decentralized Identity 2023-05-24, Anonyome, Dr. Paul Ashley

Verifiable credentials (VCs) have quickly become the “killer feature” of DI because they’re making selective disclosure of PII possible and will drive DI’s ubiquity across the internet in the next decade. 

[video] Webinar: Verifiable Credentials and Digital Wallets CI Compass, Erik Scott

Two pervasive problems throughout distributed computing are the determination of who a user is and their attributes, and the user-controlled, fine-grained sharing of personal information to third parties.

[Whitepaper] Innovating Identity and Access Management with Decentralized Identity AI and Decentralized Identity 2023-02-15, DONALD BULLERS

A complete information transformation, including how we perceive each other online, is underway. Shifting into warp speed, AI innovation, and the companies adopting this new technology, are leaving humanistic information in their dust.

Organization The Dawn of Decentralized Organizational Identity, Part 1: Identifiers 2023-05-24, Timothy Ruff

But first things first: before authority can be strongly verified, its source must first be strongly identified; it does no good to prove representation of an employer without first uniquely identifying who that employer is.

The Unique Short History of the LEI 2023-05-25, Timothy Ruff

In 2014, through its Financial Stability Board (FSB), the G20 carried out one of its decisions from the aftermath of the 2008 Financial Crisis: it formed GLEIF, the Global Legal Entity Identifier Foundation. GLEIF is as neutral as can be

Government Digital Identity - State of the Nations 2023-04-27, The Future of You

What is the state of Digital ID in the UK? Will the wallets offered by Big Tech win out in the USA? And how might eIDAS or the EUDI approach evolve across the EU?

DC4EU [tweet] What about sharing information with private stakeholders? 2023-05-26, DC4EU

Are you concerned about how your data is used and disclosed? #DC4EU is and will be GDPR compliant. You will have control of the attributes shared. #verifiablecredentials #eidas #eudiw #ebsi #w3c

[tweet] How many times have you shown your university diploma? 2023-05-26 DC4EU_project

How many times have you shown your university diploma? #DC4EU doesn't focus on higher education certificates only but on lifelong learning and micro-credentials on knowledge or competencies.

New Massive European Pilots EU Digital identity: 4 projects launched to test EUDI Wallet 2023-05-23, Digital Strategy - EU

involve more than 250 private and public organisations across almost every Member State, as well as Norway, Iceland, and Ukraine, and will run for at least 2 years. They represent a combined investment of over €90 million in the EU digital identity ecosystem

Pilots for European Digital Identity Wallet Consortium POTENTIAL

Access to government services

Opening of a bank account

Registration for a SIM card

Mobile driving licence

eSignatures

ePrescriptions

EU Digital Identity Wallet Consortium EWC

The storage and display of digital travel credentials

The organisation of digital wallets

The organisation of payments

Nordic-Baltic eID Wallet Consortium NOBID

The project will focus on a single use-case: the use of the EUDI wallet for the authorisation of payments for products and services by the wallet user.

Digital Credentials for Europe Consortium DC4EU

The project will test the use of the EUDI wallet in the educational sector and the social security domain. The pilot project will align with the European Social Security Pass and the European Learning Model. It will use the European Blockchain Services Infrastructure (EBSI) in the context of the EUDI wallet.

Company Stories Atala PRISM: pioneering digital identity with decentralized solutions 2023-05-11, Olga Hryniuk

This also includes the W3C DID spec for standardization and interoperability, DIDComm v2 – a chat for communication, and, coming soon, AnonCreds, which ensures credentials’ increased privacy using selective disclosure and zero-knowledge proofs (ZKP).

dApp Dive: Disco & Decentralized Identity 2023-05-18, Zerion

In practice, decentralized identifiers are data and the way to use it. DID by itself isn’t particularly useful. To do something with a DID, you usually need a verifiable credential.

IdRamp Orchestration Fabric now available in the MS Azure Marketplace 2023-05-26

IdRamp combines Microsoft Entra Verified ID with a user-friendly zero-code approach that enables seamless integration and automated service delivery across multiple cloud environments.

[verifiable credentials] Energy Web Launches Certification for Sustainable Bitcoin Mining 2023-05-25 [architecture] [thread]

Switchboard, our decentralized access management system, used for allowing independent auditors to review and issue certifications to Bitcoin miners in verifiable credential format.

[SIWE.eth] used for authenticating the users into platform securely using their Energy Web Chain anchored decentralized identifiers (#DID).

verifiable credentials (#VC) used for storing Bitcoin miner certifications in a secure and verifiable off-chain data format.

verifiable presentations used for secure and verifiable data sharing between applications inside the GP4BTC system.

Funding Worldcoin Approaches $100M in Latest Funding Round to Build Global Digital ID System 2023-05-17

Clearly there is money in this market 👀👀👀

As for its iris-scanning orb, the device is still very much in play. Worldcoin launched a registration tour last month that aims to scan eyeballs across nine cities. In the meantime, interested users can register for Worldcoin with their phone number until they’re in a position to get their eyes scanned.

Business Digitalizing Vital Records: Do’s and Don’ts 2023-05-24, Indicio, Tim Spring

A key takeaway here is that this technology is not difficult to integrate into current systems. All this data is already available in government databases. The addition of verifiable credentials allows people to better manage their own documents and have an extra layer of security and verification.

Use Case Open Recognition: Towards a Practical Utopia: Exploring the Future of Work and Learning 2023-05-23, We Are Open Co-op, Doug Belshaw

Microcredentials are usually based upon a standard known as Open Badges. The original dream for this standard was to solve the problem that lifelong learning happens everywhere but isn’t always recognised.

Why new identity systems that work are needed!

IRS flagged more than 1 million tax returns for identity fraud in 2023 2023-05-17, CNBC, Greg Lacurci

[not (just) Ticketmaster’s fault] A Mexican superstar’s concert tour proves no ticketing company’s tech is a match for fans and fraud. RestofWorld - “last year, when reggaeton superstar Bad Bunny performed to a partially empty stadium in Mexico City, while thousands of ticket holders crowded outside, unable to get in”

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net
Please support our efforts by Patreon or Paypal
Contact \ Submission: newsletter [at] identosphere [dot] net


FindBiometrics

Register for the Online Event: Around the World with Secure Identity

On September 27, 2023, the biometrics and identity conversation continues with our next full-day virtual event: “Around the World With Secure Identity.” Get ready for a day of interview-style sessions […]

IBM Blockchain

SRG Technology drives global software services with IBM Cloud VPC under the hood

SRGT chose IBM Cloud VPC for a new cloud platform that offered both flexibility and security for hosting its client-facing applications. The post SRG Technology drives global software services with IBM Cloud VPC under the hood appeared first on IBM Blog.

Headquartered in Ft. Lauderdale, Florida, SRG Technology LLC. (SRGT) is a software development company supporting the education, healthcare and travel industries. Their team creates data systems that deliver the right data in real time to customers around the globe. Whether those customers are medical offices and hospitals, schools or school districts, government agencies, or individual small businesses, SRGT addresses a wide spectrum of software services and technology needs with round-the-clock innovative thinking and fresh approaches to modern data problems.

The task at hand

In 2019, SRGT needed a new cloud platform that offered both flexibility and security for hosting its client-facing applications. The cloud platform they ultimately would choose needed to scale up and down quickly to align with not only their own growth and usage but also the growth and usage from their application customers across dispersed geographic locations.

To solve for these essential production requirements, SRGT chose IBM Cloud—specifically, the next-generation of IBM Cloud’s infrastructure, called IBM Cloud Virtual Private Cloud (VPC). Within IBM Cloud VPC Infrastructure, SRGT decided to use IBM Cloud Virtual Servers for VPC.

“As we implemented Blender in education, healthcare and travel, we began to realize that we needed a more reliable, secure and scalable platform to meet the growing and various needs of our business and our customers. After a year of evaluating cloud providers, we selected IBM Cloud as the best solution,” said Mike Stone, Vice President, New Markets and Strategic Partnerships, SRGT.

Confidence and capabilities

IBM Cloud Virtual Servers for VPC deliver ultra-fast provisioning and elastic scaling with the highest network speeds and most secure software-defined networking resources available on IBM Cloud today. This was the exact kind of developer-friendly infrastructure SRGT needed to help drive their modern applications.

For example, SRGT hosts its premiere BlenderPass application on IBM Cloud Virtual Servers for VPC. BlenderPass is like the holy grail of smart wallets. It’s a complete mobile solution to organize documents and credentials with personalized notifications and reminders.

Travel businesses implement SRGT’s BlenderPass so their customers can save, organize and access all the IDs, medical information, tickets and documents needed to accurately and safely make someone’s final destination come to fruition as smoothly as possible. BlenderPass also provides real-time recommendations and creates individualized action plans that help users find the most efficient path into and around each country. IBM Cloud Virtual Servers for VPC enables BlenderPass to run on a private, enterprise network that’s streamlined for provisioning on and off.

“There’s no question that IBM Cloud is providing us and our customers the best solution. It’s allowing us to focus on the continued development of Blender without having to also worry about the underlying platform Blender runs on,” said Gail Pierson, Chief Academic Officer, SRGT.

Secure to the core

Security was also crucial to SRGT. Besides BlenderPass, they also deliver an entire fleet of software solutions—including BlenderLearn, BlenderExchange and BlenderConnect—that store and manage personal and sensitive customer data—items like vaccination cards, passports, student credentials and more.

IBM Cloud VPC delivered a level of continuous edge-to-edge cloud protection that each data set and application required since SRGT and their clients must comply with many travel requirements and regulations that are unique to each country and each municipality.

So, what’s really behind the curtain of IBM Cloud VPC that gives companies like SRGT uncompromised reliability and security to count on? The IBM Cloud VPC network is completely isolated, and the isolation takes place at three different levels:

The hypervisor: SRGT’s virtual server instances are isolated by the hypervisor. Their instances can’t directly reach other virtual server instances that are hosted by the same hypervisor if they’re not in the same VPC.

The network: SRGT’s isolation occurs at the network level by using virtual network identifiers (VNIs). These identifiers are assigned to each subnet and scoped to a single zone. A VNI is added to every one of their data packets that enter any zone of the VPC. A packet that leaves a zone has the VNI stripped off. When the packet reaches its destination zone, entering through the implicit routing function, the implicit router always adds the proper VNI for that zone. It’s almost like IBM Cloud’s version of a ‘smart wallet’ — networking style.

The router: The implicit router function provides isolation to each VPC by providing a virtual routing function (VRF) and a VPN with MPLS (multi-protocol label switching) in the cloud backbone. Each VPC’s VRF has a unique identifier, and this isolation allows each VPC to have access to its own copy of the IPv4 address space.

Reinforced partnership
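The VNI tag-and-strip behavior described for zone boundaries can be mimicked in a few lines. This is a hypothetical toy model (the zone names, VNI values, and packet dicts are invented for illustration), not IBM's implementation:

```python
# Hypothetical zone-to-VNI assignment; each subnet's VNI is scoped to one zone.
ZONE_VNI = {"zone-1": 101, "zone-2": 102}

def enter_zone(packet: dict, zone: str) -> dict:
    """The implicit router tags an incoming packet with that zone's VNI."""
    tagged = dict(packet)
    tagged["vni"] = ZONE_VNI[zone]
    return tagged

def leave_zone(packet: dict) -> dict:
    """A packet leaving a zone has its VNI stripped off."""
    stripped = dict(packet)
    stripped.pop("vni", None)
    return stripped
```

Because the tag is added and removed at zone boundaries, a packet's VNI never needs to be meaningful outside the zone it is traversing, which is what keeps each VPC's traffic isolated.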

Prior to hosting on IBM Cloud Virtual Servers for VPC, SRGT was running a majority of services on their own footprint, in their own hosting facility. Through industry research, they knew they could exceed their network, compute and storage performance if they optimized their core systems and applications through adoption of cloud services, particularly from a global hyperscale provider that could deliver the kind of enterprise-grade backbone they, in turn, wanted to deliver to their customers. Enter IBM Cloud.

“In IBM we found an extensive cloud and technology portfolio, plus competitive pricing. Their support also made it extremely smooth and easy for us to migrate our technology and customers to their cloud platform, further reinforcing our selection of IBM Cloud as our technology partner,” said Mike Stone, Vice President, New Markets and Strategic Partnerships, SRGT.

In addition to virtual servers on IBM Cloud VPC, SRGT also uses IBM Cloud Load Balancers, IBM Cloud Block Storage, IBM Cloud Transit Gateway and IBM Cloud Monitoring and Advanced Support. Throughout migration, testing and production deployments, SRGT found IBM Cloud VPC offerings to provide an easy-to-understand view of resources including billing and usage information.

“The IBM Cloud VPC toolset allows us to manage geographically diverse resources and gives us a status dashboard with near real-time reporting of issues. And, because customer workloads fluctuate due to things like seasonality and growth, IBM Cloud VPC offerings dynamically scale in line with customer business needs to meet and manage these needs with a balance between resource costs and application responsiveness,” said Mike Stone, Vice President, New Markets and Strategic Partnerships, SRGT.

Get started

Accelerate your operations with IBM Cloud VPC



SC Media - Identity and Access

Managed IAM: The Quest for an Evolved Identity Experience - Bill Brenner - CFH #23


KuppingerCole

Policy-based Provisioning: Automating Access Entitlements and Access Reviews


by Nitish Deshpande

Manually granting access to entitlements and access reviews is not easy. Organizations are shifting focus towards learning and discovering capabilities. Automation tools such as policy mining are being introduced to reduce the burden of governance for granting access and reviewing access. Policy-based provisioning of access and access reviews can help to reduce the efforts of reviewing fine-grained or coarse-grained access by automating these processes. In this whitepaper, we will take a look at Tuebora’s model for mining provisioning policies, the importance of birthright provisioning, and automatic access to entitlements.

KYC Chain

What is KYC and Why do Crypto Custodians Require It?

Crypto custodians play an essential role in the global crypto ecosystem, securely storing digital assets for their customers. Due to the sensitivity of their business, they need to establish strong and secure anti-fraud measures in order to carefully vet their customers, and they are also subject to regulations that require them to carry out robust KYC and AML. This article takes a look at what th

SC Media - Identity and Access

OneMain pays $4.25M after ignored security flaws caused data breaches

New York’s Department of Financial Services hit OneMain Financial with a massive penalty, reflecting the severity of security failures found during an audit tied to multiple data breaches.



KuppingerCole

Jun 29, 2023: Ensuring a Proper Fit: Trends in CIAM and Strategies for Effective Solution Selection

KuppingerCole would like to invite you to an upcoming webinar on Consumer Identity and Access Management (CIAM). Join us for an enlightening session where we will delve into the world of CIAM, exploring its significance in today's digital landscape and the best practices to ensure a seamless user experience while safeguarding consumer data.

Dock

New Release: Send Credentials Directly to Recipients’ Dock Wallet and Email

Issuers can use Dock Certs to send credentials directly to recipients' emails and Dock Wallet. This streamlines the issuance process which saves time, improves efficiency, and ensures secure delivery.

We’re thrilled to announce the latest Dock Certs feature that allows issuers to conveniently send credentials directly from the platform to recipients’ emails and Dock Wallet. Now, issuers can enjoy a streamlined process that saves time, improves efficiency, and ensures secure delivery of important credentials.

When you issue a credential by using the recipients’ decentralized identifiers (DIDs), they will immediately receive the credential in their Dock Digital Identity Wallet app.

A decentralized identifier is a unique digital identifier that you fully own and control through your ID Wallet. DIDs enable you to securely receive and present your credentials in a privacy-preserving way.

Steps for Sending the Credential by Email

Let’s say the Nursing Licensing Authority wants to send each nursing graduate their credential by email. They will now see a field in Dock Certs to add the recipient’s email whenever they issue a credential.

1. The Nursing Licensing Authority enters Clarissa Olson’s email in Dock Certs.

2. The Nursing Licensing Authority previews the credential and selects Send credentials.

3. Clarissa will receive the credential by email as soon as it is issued from Dock Certs.

4. Clarissa scans the QR code with her Dock Wallet to import it on the app. She then gets a wallet notification saying she received a credential from the Nursing Licensing Authority. The QR Code becomes voided after Clarissa imports the credential.
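The "QR code becomes voided after import" behavior is essentially one-time token redemption. A hypothetical sketch (the class and method names are invented, not Dock's actual service):

```python
import secrets

class CredentialDelivery:
    """Toy model of QR-based delivery: each token works exactly once."""

    def __init__(self) -> None:
        self._pending: dict[str, dict] = {}

    def issue(self, credential: dict) -> str:
        # The token would be encoded in the QR code sent by email.
        token = secrets.token_urlsafe(16)
        self._pending[token] = credential
        return token

    def redeem(self, token: str) -> dict:
        # pop() both returns the credential and voids the token.
        if token not in self._pending:
            raise KeyError("QR code already used or unknown")
        return self._pending.pop(token)
```

Voiding on first redemption is the property that lets a credential travel over a comparatively insecure channel like email: replaying the QR code later yields nothing.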

Free trial users can only send themselves a credential and will need to choose a plan in order to email credentials to other people.
See the complete Dock Certs issuer guide on how to use all of the features.

Send a Credential by DID

With our latest feature, an issuer can send a credential directly to a recipient's Dock Wallet by simply entering the recipient’s DID in the Subject ID field in Dock Certs. If, for example, an employer wants to send an employee a job status credential, the employer would simply enter the staff’s DID in Dock Certs and they will receive the credential directly in their Dock Digital Wallet. This feature is powered by our DID Relay Service.

1. The Nursing Licensing Authority sends the credential by using Clarissa’s DID.

2. Clarissa receives the notification in the Dock Wallet.
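Conceptually, a DID relay of this kind maps a DID to a registered wallet delivery channel and forwards the credential to it. A hypothetical sketch (the registry and function names are invented, not Dock's DID Relay Service API):

```python
from typing import Callable

# Hypothetical registry mapping a DID to a wallet's delivery callback.
WALLET_ENDPOINTS: dict[str, Callable[[dict], None]] = {}

def register_wallet(did: str, deliver: Callable[[dict], None]) -> None:
    """A wallet registers the channel on which it receives credentials."""
    WALLET_ENDPOINTS[did] = deliver

def relay_credential(subject_did: str, credential: dict) -> bool:
    """Forward a credential to the wallet registered for the subject's DID."""
    deliver = WALLET_ENDPOINTS.get(subject_did)
    if deliver is None:
        return False  # no wallet registered for this DID
    deliver(credential)
    return True
```

The issuer only needs the subject's DID; resolving it to an actual delivery endpoint is the relay's job, which is why the Subject ID field alone suffices in the flow above.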

About Dock

Dock is a Verifiable Credentials company that provides Dock Certs, a user-friendly, no-code platform, and developer solutions that enable organizations to issue, manage and verify fraud-proof credentials efficiently and securely. Dock enables organizations and individuals to create and share verified data.

Dock’s Solutions

Dock Certs: Web app to issue and manage digital identity and Verifiable Credentials

Certs API: Easy integration with your system to make your data fraud-proof and instantly verifiable

Dock Wallet: Easily store and manage their digital credentials and identity (individuals) and verify credentials instantly (organizations)

Partner Use Cases

SEVENmile issues fraud-proof verifiable certificates using Dock

BurstIQ Makes Health Data Verifiable, Secure, and Portable With Dock

Gravity eliminates Health & Safety certificate fraud with Dock

Learn More

Decentralized Identity

Verifiable Credentials

Self-Sovereign Identity

Decentralized Identifiers (DIDs)

Web3 Identity

Blockchain Identity Management

Selective Disclosure

How to Prevent Supply Chain Fraud With Blockchain

BurstIQ Use Cases That Leverage Verifiable Credentials

Blockchain Food Traceability: Enhancing Transparency and Safety

How to Prevent Certificate Fraud

Digital Credential Platform

Monday, 29. May 2023

Shyft Network

Japan to Enforce Crypto Travel Rule from June 1st

Japan’s enforcement of the Crypto Travel Rule from June 1st, 2023, marks a significant step in tightening cryptocurrency regulations. The new regulations, while promising increased transparency and trust in the market, may also pose challenges to privacy and operational changes. Shyft Veriscope, with its strategic partnership with Sygna, emerges as the best frictionless Travel Rule Solution, balancing regulatory compliance, cost-efficiency, and user privacy.

In the world of technology and innovation, Japan has always been a front-runner, setting trends that many follow.

With a unique blend of technological adeptness and a robust economy, the country continues to maintain its global significance, and its latest move in the cryptocurrency domain is no exception.

Starting June 1st, 2023, Japan will enforce stringent Anti-Money Laundering (AML) regulations to further refine and intensify the oversight of cryptocurrency transactions while supporting the crypto economy.

Announced by the Japanese parliament on May 23, this decision exemplifies the nation’s proactive efforts to align with global crypto regulations more effectively and exhibits the state’s adaptability to the evolving financial regulatory requirements.

More on the topic: The Most Googled Questions on FATF Travel Rule Answered

Work in Progress Since March 2021

In a regulatory journey spanning over a year, the Japan Virtual Currency Exchange Association (JVCEA) received a request in March 2021 from the Financial Services Agency (FSA) to introduce self-regulatory rules concerning the “crypto Travel Rule.”

This was followed by the JVCEA incorporating these rules in April 2022, which further led to the enforcement of the crypto Travel Rule through an amendment to the Act on Prevention of Transfer of Criminal Proceeds (APTCP) in December 2022.

Later, to clarify how the “crypto Travel Rule” applies to Virtual Asset Service Providers (VASPs), the FSA initiated a consultation process on these guidelines, signifying its intent to uphold transparency and dialogue within the crypto ecosystem.

And by March 2023, the FSA concluded its public consultation on APTCP guidelines, and now, in June, the Travel Rule will soon be officially enforced in Japan. In a country where cryptocurrencies have been adopted legally, the implications of these new AML regulations are far-reaching.

The Impact on Crypto Ecosystem

The enforcement of the Travel Rule in Japan is set to cast a wide net of implications across the cryptocurrency landscape. For starters, it may infuse a greater degree of trust in the cryptocurrency market.

By taking stringent measures against potential illicit activities, the rule can effectively alleviate some of the concerns existing and potential users might have about the crypto industry. This could increase crypto use and investment, lending greater liquidity and stability to the market.

Moreover, the Travel Rule brings the promise of heightened transparency. The mandate for exchange providers to share crypto sender and receiver information allows for a more open view of transaction flows. This could significantly reduce fraudulent activities and make it difficult for malicious actors to exploit the system, fostering a safer trading environment.

Despite these positives, there are some contradictions not to be ignored.

For instance, for many crypto users, the allure of cryptocurrencies lies in the anonymity they offer.

But with the new regulations in place, this veil of anonymity gets partially lifted, potentially dissuading some users and investors. This loss of anonymity could slow down the growth of the crypto market in Japan, at least in the short term.

However, it’s important to note that the effect of the Travel Rule on users’ privacy and businesses’ operations will largely depend on the nature of the platform and the Travel Rule solution they employ.

Choice of Travel Rule Solution

The choice of a Travel Rule solution can substantially impact the user experience and the level of privacy maintained in these transactions.

One promising approach is the adoption of a fully automated, peer-to-peer Travel Rule solution, like Shyft Veriscope. It allows for the necessary data exchange between VASPs without storing any sensitive user information on its servers. This approach not only mitigates the risk of data breaches but also restricts any third-party access, thereby adding an extra layer of privacy protection.

Moreover, Shyft Veriscope’s fully automated nature ensures that user experience remains positive and seamless, an essential factor that could influence users’ decision to continue engaging with the crypto market amidst these regulatory changes.

Ultimately, the choice of the Travel Rule solution becomes a critical determinant in striking the right balance between adhering to regulatory standards and preserving user privacy and experience. This will be a key factor in shaping the impact of the new regulations on both users and businesses in the crypto sphere.

CEX vs DEX: How do They Fare Under Japan’s Travel Rule?

For Centralized Exchanges, which already have strict Know Your Customer (KYC) protocols in place, the impact may not be substantial. After all, these platforms already possess much of the required information about their users, minimizing the effect of these new regulations on their operations and user experience.

On the other hand, Decentralized Finance (DeFi) platforms, known for their anonymity and less stringent KYC procedures, may face a greater challenge. The enforcement of the Travel Rule could disrupt the degree of privacy users enjoy on these platforms, potentially affecting their appeal to a certain user base.

So, while this regulatory change is undoubtedly transformative, it’s also diverse in its impact, affecting different areas of the crypto ecosystem to varying degrees. This difference, in effect, underscores the importance of a Travel Rule solution that can navigate these challenges while preserving user privacy and experience.

Comparatively Higher Threshold

Another noteworthy aspect of Japan’s enforcement of the Travel Rule is the transaction threshold set at $3,000. This means the rule will only apply to transactions exceeding this limit. This is a significant figure, higher than many other countries’ thresholds for applying similar AML regulations.

This relatively high threshold could have important implications for the crypto market in Japan. Smaller, everyday transactions, which constitute a large volume of crypto activity, will remain largely unaffected by these new regulations. This allows a significant portion of crypto users to continue enjoying the benefits of anonymity and ease of transactions they are accustomed to.

However, for larger transactions, the enforcement of the Travel Rule will necessitate more stringent information-sharing and verification processes. While this could potentially slow down transaction times and complicate the process for some users, it will also provide added security and transparency, helping to protect users against fraud and other illicit activities.

For crypto businesses, compliance with the Travel Rule means embracing operational changes. As such, the rule necessitates significant investments in infrastructural upgrades and personnel training to meet the new regulatory standards. This could impose a heavy burden on smaller businesses and startups, which might struggle to keep up with these requirements.

More on the topic: Demystifying the FATF Travel Rule: Your Most Googled Questions Answered — Part 2

How Should VASPs Comply?

With the impending enforcement of the Travel Rule in Japan, VASPs are faced with the challenging task of choosing a Travel Rule Solution.

Although the FATF doesn’t advocate for a specific compliance solution or method, VASPs have a moral duty to choose a solution that seamlessly navigates the intricate compliance requirements while preserving user privacy and experience. Shyft Veriscope ticks all the boxes.

Not only does Shyft Veriscope demonstrate cost-efficiency when compared to other available options in the market, but it is also a one-of-a-kind solution. As we highlighted above, being a Peer-to-Peer Travel Rule Solution, Shyft Veriscope blends efficiency and regulatory compliance without infringing upon user experience or privacy.

Shyft Veriscope’s appeal in the evolving Japanese crypto market is further strengthened by Shyft’s strategic partnership with Sygna, a well-regarded name in the realm of crypto AML compliance services with a robust presence in Japan.

The result is a solution that moves beyond mere compliance, addressing the “sunrise issue” of the Travel Rule and nudging the crypto industry closer towards full interoperability.

More details here: Shyft Veriscope — The Critical Infrastructure Underpinning FATF Travel Rule

Frequently Asked Questions on Japan Enforcing Crypto Travel Rule

Q1: When will the Crypto Travel Rule be enforced in Japan?

The Crypto Travel Rule will be enforced in Japan from June 1st. The Japanese parliament announced the decision on May 23, aligning Japan more effectively with global crypto regulations.

Q2: How will the Crypto Travel Rule impact crypto users in Japan?

The Crypto Travel Rule will inject more trust and transparency into the cryptocurrency market, potentially increasing cryptocurrency usage and investment. However, the rule may also deter some users and investors as it partially lifts the veil of anonymity associated with cryptocurrencies.

Q3: What does implementing the Crypto Travel Rule mean for crypto businesses in Japan?

For crypto businesses, the Crypto Travel Rule means they will need to undergo operational changes to comply with the new regulations. This includes significant investments in infrastructural upgrades and personnel training to meet the new regulatory standards.

Q4: What is the role of the Financial Services Agency (FSA) in implementing the Crypto Travel Rule in Japan?

The FSA is the primary regulatory body overseeing the implementation of the Crypto Travel Rule in Japan. It works in collaboration with self-regulatory entities such as the Japan Virtual Currency Exchange Association (JVCEA) and the Japan Security Token Offering Association (JSTOA) to develop policies and rules that shape the cryptocurrency landscape in the country.

Final Note

The Travel Rule brings a significant shift towards a safer and more transparent crypto market in Japan, but not without impacting some of the very elements that made cryptocurrencies attractive to a certain segment of users and businesses.

As the industry adapts to these regulatory shifts, the long-term implications will largely hinge on the evolution of this new paradigm and the ensuing global trends.

However, amidst this sea of changes, one thing remains crystal clear: Japan, the “Land of the Rising Sun,” is taking a proactive and exemplary stance in tackling regulatory challenges.

Therefore, as countries across the globe grapple with the intricacies of crypto regulation, they can look towards Japan’s progressive strides for valuable insights and guidance.

____________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while ensuring user data is protected.

Visit our website to read more, and follow us on Twitter, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

Japan to Enforce Crypto Travel Rule from June 1st was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

What's the Right Authorization Model for My Application?

Learn which is the right authorization model for your application based on your use case.

Sunday, 28. May 2023

KuppingerCole

Analyst Chat #174: Access Governance - Ensuring Visibility and Control of User Access


Join Matthias Reinwarth, Director of Identity and Access Management, and Nitish Deshpande, Research Analyst, as they delve into one of the most critical challenges faced by organizations today: visibility. Discover why organizations struggle with understanding user access and the potential risks of this lack of visibility.

In this episode, they explore the key capabilities of access governance, such as access review, certification, risk management, request management, and analytics, and how these capabilities enable organizations to gain comprehensive visibility into their assigned accesses. Don't miss this insightful discussion on enhancing control and mitigating risks through effective access governance.

Read Nitish's Leadership Compass here



Friday, 26. May 2023

FindBiometrics

Illinois Lawmakers Are Okay With Face-scanning Drones (Sometimes) – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: AuthID Targets Web 3.0 After ‘Investment and […]

Worldcoin Raises $115M Series C to Fuel Biometric UBI Efforts

Worldcoin has raised $115 million in a Series C funding round led by Blockchain Capital, with contributions from a16z crypto, Bain Capital Crypto, and Distributed Global. The news comes after reports earlier […]

Fission

Fission Fridays: May 26th, 2023

Announcements

Causal Islands🏝️ talks are now LIVE on YouTube and the Causal Islands website. Share your favorite learnings and takeaways by mentioning us on Twitter @causalislands and Mastodon @plnetwork.xyz/@causalislands. The May Distributed Systems Reading Group met this week to discuss A.M.B.R.O.S.I.A: Providing Performant Virtual Resiliency for Distributed Applications. Watch the call recording, join the Fission Discord, and visit the #dsys-reading-group channel to continue the conversation.

New On The Blog

IPFS Thing 2023: Decentralizing Auth, and UCAN Too

Fission co-founder and CTO Brooklyn Zelenka shares how UCANs simplify authorization while keeping it decentralized and preserving user agency.

IPFS Thing 2023: WNFS - Versioned and Encrypted Data on IPFS

Protocol Engineer Philipp Krüger gave a comprehensive yet approachable breakdown of how we constructed WNFS, our versioned and encrypted file system built on top of IPFS.

Thank you, and we'll be back in two weeks with more exciting updates!

-The Fission Team


This week in identity

E29 - Identity Mesh and Identity Fabric / Heliview IAM Conference Review / Cyber + Identity Mashup / People, Process and Technology / IAM Threat Reports

This week Simon and David review the recent Heliview IAM Conference that took place in the Netherlands. The main topic for the day was the rise of the identity fabric (or mesh) and how this can enable the modern organisation with a range of agile IAM components that supports both business and security use cases. Simon presented a keynote on the future of IAM - using some research from The Cyber Hut focusing on where IAM may look like in 2028 and beyond...

They also discussed the need for people, process and technology integration, in order to map the existing IAM landscape to future investment and metrics.

They finish off by discussing the rise in cyber threat reports that have emerged in the past month that all have a very strong reliance on IAM - and why ITDR is a process not a product.

Cyber Threat Reports:

Joint Cyber Advisory: People's Republic of China State-Sponsored Cyber Actor Living off the Land to Evade Detection
CISA Advisory: Hunting Russian Intelligence “Snake” Malware
Permiso Security: Unmasking GUI-Vil - Financially Motivated Cloud Threat Actor




Tokeny Solutions

Network-agnostic tokenization platform for enterprises

The post Network-agnostic tokenization platform for enterprises appeared first on Tokeny.

Product Focus

Network-agnostic tokenization platform for enterprises

Any type of asset can be easily and compliantly represented on-chain using our platform, with the freedom for the issuer to select their preferred blockchain network.

This content is taken from the monthly Product Focus newsletter in May 2023.

We started Tokeny with the belief that everyone should have the opportunity to effortlessly purchase and trade financial products they are eligible for. With our network-agnostic tokenization platform, supporting multi-chains, we enable enterprises to turn this belief into a tangible reality.

Any type of asset can be easily and compliantly represented on-chain using our platform with the freedom for the issuer to select their preferred blockchain network. We cater to institutions seeking to tokenize assets, empowering them to benefit from direct securities distribution, automated subscription processes, and diverse liquidity solutions (DvD transfer, CEX feature, Billboard, etc.), regardless of the underlying blockchain network.

Our T-REX tokenization platform acts as a data aggregator, harmonizing relevant information from blockchains so that data on the platform is always synchronized with the selected chains. With a user-friendly interface, it facilitates efficient smart contract management, streamlines operations, and enables ERC3643 token trading. Moreover, it provides seamless connectivity to both external platforms and the DINO distribution network, covering centralized and decentralized trading platforms on the same blockchain network on which the securities were issued.

In the realm of digital securities, the ERC3643 token standard brings compliance and control by only allowing transfers when both the investor rules (via ONCHAINID, digital identity smart contracts) and offering rules are met. While this standard is open-source, its implementation is considerably more intricate compared to the permissionless token standard ERC20. Building a functional platform from scratch demands a significant amount of time, resources, and expertise.

With an extensive track record spanning 6 years, our team of 20 experienced developers has built an end-to-end platform covering the entire lifecycle of ERC3643 tokens. Our value proposition revolves around expediting time-to-market for institutions to increase their market share with proven solutions.

It is worth noting that our SaaS platform and API are technically capable of supporting all EVM blockchains, which constitute a dominant share of on-chain activities (94% as per DefiLlama, a DeFi data aggregator). Our solutions have been meticulously optimized for three prominent networks: Ethereum, Polygon, and Avalanche. Notably, the presence of key multi-chain components, such as indexer and gas tank smart contracts, allows us to efficiently add new EVM networks to our platform.

Additionally, issuers have the flexibility to move digital securities from one chain to another as well as upgrade their smart contracts via our solutions. This dynamic future-proof solution allows issuers to confidently begin asset tokenization in response to any evolving circumstances. By providing a catalyst for tokenization adoption, we ensure a secure and reliable foundation for issuers to embrace this transformative trend.

Subscribe Newsletter

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Other Product Focus Blogs

Network-agnostic tokenization platform for enterprises (26 May 2023)
Introducing Dynamic NAV Updates For Open-Ended Subscriptions (25 April 2023)
New Feature: Enable CEX Trading for Tokenized Securities (21 March 2023)
Tokeny’s 2023 Product Roadmap: Scaling RWA tokenization (2 February 2023)
Our product highlights from 2022 (30 December 2022)
Upgradable ERC3643 V4 for tokenized securities (28 November 2022)
DINO: the largest distribution network for tokenized securities (31 October 2022)
How we are building an interoperable ecosystem for private markets (26 August 2022)
ERC3643 Onchain Factory (22 July 2022)
The Sound of the ERC3643 Community (21 June 2022)

Tokenize securities with us

Our experts with decades of experience across capital markets will help you to digitize assets on the decentralized infrastructure. 

Contact us



IDnow

The future of crypto in the EU: A revolution is coming.

How on-chain KYC and the Lightning Network will take the industry to the next level. The future of crypto within the European Union (EU) is a complex and ever-evolving topic. The good news is that the EU is finally setting some ground rules for the crypto market. The bad news is that it has taken, […]
How on-chain KYC and the Lightning Network will take the industry to the next level.

The future of crypto within the European Union (EU) is a complex and ever-evolving topic. The good news is that the EU is finally setting some ground rules for the crypto market. The bad news is that it has taken, and is likely to take, a long time to implement. 

In recent years, there has been a growing sense of urgency to create a comprehensive crypto regulatory framework, partly due to the increasing popularity of crypto assets, as well as concerns about their potential to be used for illicit activities.  

Since the beginning of the year, I have been traveling around Europe, attending events and meeting with industry insiders and thought leaders, and one thing’s for sure: the crypto market is certainly moving in the right direction. Of course, there has always been buzz – both positive and negative – surrounding the industry, but with upcoming regulatory frameworks like the Markets in Crypto Assets (MiCA) regulation and the UK’s proposed crypto regime, there appears to be more confidence and excitement from operators, regulators, and consumers.

Two recent developments, in particular, have really captivated the imagination of the industry: on-chain KYC and the Lightning Network. In fact, I recently moderated a fascinating roundtable discussion on how they’re both set to revolutionize the market. Before we delve into these topics, it’s important to know what decentralized finance (DeFi) is, and why I think it is crucial to the development of the industry. 

Centralization is a risk. Why DeFi is the way to go.

There has been a lot of talk about the ‘perils’ of DeFi, with many assuming the recent failure of the likes of FTX could indicate the technology doesn’t really work. But, I think the opposite is true, and that the recent events have just emphasized that we actually need DeFi, and cannot be dependent on certain authorities. Centralization is actually a risk in itself.  

Events such as the failure of FTX spark innovation and improvements, as well as regulation.  

DeFi refers to a set of financial applications and platforms built on blockchain technology that aim to provide open, permissionless, and inclusive financial services. Unlike traditional centralized financial systems such as banks, DeFi eliminates intermediaries and enables direct peer-to-peer transactions and interactions through smart contracts. 

Why is DeFi integral to the future of crypto? 

Decentralization: DeFi operates on blockchain networks, which are decentralized and distributed across multiple computers or nodes. This decentralized nature eliminates the need for intermediaries like banks, enabling users to have more control over their funds and financial activities.

Accessibility: DeFi platforms are generally open to anyone with an internet connection, allowing individuals from around the world, including the unbanked or underbanked, to access financial services. This inclusivity promotes financial empowerment and can provide opportunities for economic growth in underserved regions. By removing barriers like geographical limitations, high fees, and strict regulations, DeFi can empower individuals to take control of their finances, access capital, and participate in global markets.

Transparency: DeFi operates on public blockchains, making all transactions and activities transparent and auditable. This transparency can help build trust among users as they can independently verify and audit the underlying code, smart contracts, and transactions.

Another important reason why DeFi is the future, is programmability. The technology utilizes smart contracts, which are self-executing agreements with predefined rules and conditions. Smart contracts automate financial processes, reducing the need for manual intervention and enabling the creation of innovative financial products like decentralized exchanges, lending and borrowing platforms, prediction markets, and more.
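As a rough illustration of that programmability (real DeFi contracts are typically written in Solidity or a similar on-chain language, and the parties and amounts here are entirely hypothetical), a self-executing agreement boils down to code that moves funds only when its predefined condition is met:

```python
# Toy escrow "contract": funds are released automatically when the
# predefined condition is met, with no manual intermediary.
# Purely illustrative; real smart contracts run on a blockchain VM.

class Escrow:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.balance = {buyer: -amount, seller: 0}  # buyer's deposit is locked

    def confirm_delivery(self):
        self.delivered = True
        return self._settle()

    def _settle(self):
        # Predefined rule: pay the seller if and only if delivery is confirmed.
        if self.delivered:
            self.balance[self.seller] += self.amount
        return self.balance

deal = Escrow("alice", "bob", 100)
print(deal.confirm_delivery())  # {'alice': -100, 'bob': 100}
```

The point is that neither party needs to trust an intermediary to execute the payout: the rule itself is the executor.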

DeFi offers various opportunities for users to earn yields on their crypto assets. Activities like liquidity provision, staking, lending, and yield farming enable individuals to earn passive income on their holdings, potentially outperforming returns from traditional financial instruments.

While DeFi has gained significant attention and adoption, it’s important to note that the space is still evolving, and there are significant risks involved, such as smart contract vulnerabilities, regulatory challenges, and market volatility. However, many believe that with further development, innovation, and user education, DeFi has the potential to revolutionize the financial industry by offering more accessible, efficient, and inclusive financial services. 

Learn more about the role of KYC in a decentralized world in our blog, ‘Can Defi and KYC ever peacefully co-exist?’.

Why is MiCA so important?

Amidst the ongoing development and regulatory considerations surrounding DeFi, it is worth examining recent regulations in the crypto space within the EU. One notable example is the MiCA regulation, which was approved by the European Parliament in April 2023. This ground-breaking regulation marks the world’s first comprehensive framework specifically tailored to crypto assets.

MiCA introduces a range of requirements for crypto businesses to adhere to, including: 

Registration with national authorities
Compliance with anti-money laundering and terrorist financing (AML/CTF) regulations 
Provision of clear information to consumers about the risks of crypto assets 

MiCA provides legal clarity by defining and categorizing different types of crypto assets and activities within the EU. It sets out clear rules and definitions for crypto assets, such as cryptocurrencies, utility tokens, and stablecoins, which helps reduce ambiguity and promotes regulatory certainty for businesses and investors operating within the EU. 

It also introduces a licensing framework for crypto service providers. Businesses such as crypto firms, wallet providers, and custodians must obtain authorization from competent authorities within the EU. This authorization requirement aims to enhance consumer protection and ensure compliance with Anti-Money Laundering (AML) and Counter-Terrorism Financing (CTF) regulations. 

In order to safeguard the interests of investors, MiCA requires issuers of certain crypto assets to provide comprehensive and accurate information. MiCA also aims to promote market integrity and stability by setting standards for the operation of crypto exchanges and trading platforms. It includes provisions for orderly trading, transparency of prices, and measures to prevent market abuse and insider trading. These measures contribute to building trust and confidence in the crypto market. 

While MiCA is undoubtedly an important step toward regulating crypto in the EU, the UK is still debating how to move forward. However, it’s important to remember that MiCA is not a complete solution to the EU’s crypto problem, as it doesn’t address on-chain KYC.  

What is on-chain KYC?  

On-chain KYC refers to the process of verifying the identity of a crypto user by analysing their on-chain activity. This can be done by looking at things like the addresses they use, the transactions they make, and the tokens they hold. 
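As a sketch of how such analysis might work in principle (this is not any specific vendor's method, and the addresses and risk list below are entirely hypothetical), an address can be scored by the counterparties it transacts with:

```python
# Illustrative sketch only: a toy risk heuristic for on-chain analysis.
# All addresses and the high-risk list are hypothetical examples.

HIGH_RISK_ADDRESSES = {"0xmixer01", "0xsanctioned02"}

def risk_score(transactions, subject):
    """Score an address by how often it transacts with high-risk counterparties.

    transactions: list of (sender, receiver, amount) tuples.
    Returns a float in [0, 1]: the share of the subject's transfers that
    touch a high-risk counterparty.
    """
    involved = [(s, r) for (s, r, _) in transactions if subject in (s, r)]
    if not involved:
        return 0.0
    risky = sum(
        1 for (s, r) in involved
        if (r if s == subject else s) in HIGH_RISK_ADDRESSES
    )
    return risky / len(involved)

txs = [
    ("0xalice", "0xmixer01", 5.0),   # risky counterparty
    ("0xalice", "0xbob", 1.2),       # ordinary transfer
    ("0xcarol", "0xalice", 0.4),     # ordinary transfer
]
print(risk_score(txs, "0xalice"))  # 1 of 3 transfers is risky
```

Real implementations combine many such heuristics with clustering and labeled datasets, but the public, auditable ledger is what makes this kind of scoring possible at all.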

It is a valuable tool for law enforcement and regulators, as it can help them to identify and investigate suspicious activity, as well as track down criminals who use crypto to launder money or finance terrorism. 

On-chain KYC is necessary to protect businesses and consumers, while preventing crime. If your crypto platform doesn’t cover on-chain KYC, then how well do you really Know Your Customer?  

On-chain KYC is set to revolutionize DeFi and this cannot be ignored. Now let’s discuss the Lightning Network. 

Why is the Lightning Network important to the future of crypto? 

The Lightning Network is a second-layer payment protocol that runs on top of the Bitcoin blockchain. It allows for faster and cheaper payments and is becoming increasingly popular. 

The protocol is very important as it addresses one of the biggest challenges of Bitcoin: scalability. Although Bitcoin is a very secure and decentralized network, it is also very slow and expensive to use. The Lightning Network solves this problem by allowing for off-chain payments. 

How does the Lightning Network actually work?

The Lightning Network operates through the use of payment channels, where two participants open a payment channel by creating a multi-signature transaction on the underlying blockchain. This transaction serves as the funding transaction for the channel. 

The Lightning Network allows participants to route payments through a network of interconnected payment channels. This routing capability makes it a scalable solution. Participants can, at any point, close a payment channel and settle the final state on the underlying blockchain. The final settlement transaction reflects the net result of all the off-chain transactions that occurred within the channel. By ultimately settling on the blockchain, participants ensure the security and immutability of the transactions.
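The channel lifecycle described above (fund on-chain, update balances off-chain, settle only the net result on-chain) can be sketched in a few lines. The figures are hypothetical, and the model deliberately ignores fees, HTLC-based routing, and dispute handling:

```python
# Minimal sketch of a two-party payment channel, assuming no fees,
# routing, or dispute resolution. Balances are in satoshis.

class PaymentChannel:
    def __init__(self, funding_a, funding_b):
        # The funding transaction locks both parties' coins on-chain.
        self.balances = {"a": funding_a, "b": funding_b}
        self.updates = 0  # off-chain state updates, never broadcast

    def pay(self, sender, receiver, amount):
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.updates += 1  # each payment is just a new co-signed state

    def close(self):
        # Only this final net state is settled on the blockchain.
        return dict(self.balances)

ch = PaymentChannel(funding_a=100_000, funding_b=50_000)
ch.pay("a", "b", 30_000)
ch.pay("b", "a", 10_000)
print(ch.close())  # {'a': 80000, 'b': 70000} after two off-chain updates
```

However many payments occur inside the channel, the blockchain only ever sees two transactions: the funding transaction and the settlement, which is exactly why the scheme scales.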

What is driving the development of the crypto industry? 

Decentralization, security and transparency are driving the evolution of the crypto industry. 

The leading blockchain protocols are decentralized and are not subject to government control or control from financial institutions. This makes them an attractive proposition to people concerned about privacy and censorship. Cryptocurrencies are secured by cryptography, which makes them difficult to hack, and a more secure store of value than traditional fiat currencies. 

Lastly, most blockchains are transparent, meaning all crypto transactions are recorded on a public ledger. This is not the case with traditional financial systems. 

Challenges and opportunities in the DACH region. 

Although they each have their specific challenges, Germany, Austria and Switzerland are great markets for crypto businesses to enter. 

Key challenges include the lack of regulation, as the crypto industry is still in its early stages in these markets. Comparatively high taxes can also make them less attractive to investors. And while there is a growing community of people who are aware of crypto, vast swathes of the general public in these countries don’t fully understand the topic, let alone want to be involved. However, this presents a great opportunity to educate people about the benefits of crypto while attracting new users.

The German government is considering a more innovative regulatory approach to cryptocurrencies, which could create a more favourable environment for crypto companies. A strong talent pool can be helpful in developing new products and services, and the DACH region has an abundance of skilled developers.  

Looking at the future of crypto, I am reassured by the state of play in Europe. Yes, there are a number of challenges, but there are also a number of opportunities. The future of crypto in Europe will likely depend on how the industry addresses these challenges and capitalizes on the opportunities arising from DeFi, on-chain KYC, and the Lightning Network to create a safe, transparent, decentralized and trustworthy environment for customers.

By

Jason Tucker-Feltham
Head of Crypto Sales at IDnow
Connect with Jason on LinkedIn


IdRamp

IdRamp Orchestration Fabric now available in the MS Azure Marketplace

Strategic business leaders aren’t simply living with the convolution created by multi-cloud and the expanding technology footprint. They untangle the web of multi-cloud computing with IdRamp Identity Orchestration

The post IdRamp Orchestration Fabric now available in the MS Azure Marketplace first appeared on Decentralized Identity Orchestration.

Thursday, 25. May 2023

OWI - State of Identity

The Path to Reusable Identity

Onfido CEO Mike Tuchen discusses the challenges of the fragmented digital identity landscape and the need for mess management. He predicts that travel will be a significant driver of change in the reusable identity space and that the winners and losers in the market will emerge within the next three years. Tuchen also sees a future for digital identity in age verification and address verification, emphasizing privacy and security's importance.


KuppingerCole

Why Network Detection & Response (NDR) Is Central to Modern Cyber Architecture

Join security and business experts from KuppingerCole Analysts and cyber technology firm Exeon Analytics as they discuss how these challenges can be met using machine learning supported and log data based Network Detection & Response solutions to improve the overall cyber security and resilience of organizations.

John Tolbert, Lead Analyst at KuppingerCole Analysts will look at reasons for deploying NDR, the various deployment models, and use cases for enterprise IT and OT environments. He will also explain how ML-enhanced detection algorithms increase confidence and reduce false positives, and discuss key requirements for choosing NDR solutions and how NDR fits into the XDR landscape.

Michael Tullius, Sales Director Germany at Exeon Analytics will discuss why NDR is necessary and how it can benefit security leaders, admins, and incident responders. He will also give examples of detectable use cases, provide an overview of Exeon’s NDR solution, and share recommendations for improving cyber resilience.




Fission

IPFS Thing 2023: Decentralizing Auth, and UCAN Too

Fission co-founder and CTO Brooklyn Zelenka shares how UCANs simplify authorization while keeping it decentralized and preserving user agency.

When we think about authorization, an image of a central authority determining what a person is and is not allowed to do comes to mind. In technology, this translates to authorization servers checking a list to see if the person has the correct permissions to access a service.

Even on the decentralized web, this still occurs. For example, if a user is on a machine that isn't running an IPFS node, or the IPFS node they are trying to access is asleep, the only way they can access the namespace that that IPFS node is managing is to log in through some other mechanism. The same goes for many edge apps or dapps. They create a point of centralization, even in a mostly distributed system.

But what if we got rid of the authorization server entirely? What if the user acted as their own auth server? Then we would have decentralized auth. This is where UCAN comes in.

Capabilities

There are two ways to handle authorization - access control lists and capabilities.

Access Control Lists are designed for when users are sharing a terminal. Think of a shared point-of-sale register where each sales associate logs in and out of the machine to tie their transactions to their individual profile.

Properties:

Stateful Auth - There is a user, a gatekeeper, and a service. The gatekeeper determines if the user can access the service according to the list.
Pro - There are three clear stages in the process.
Con - Neither the user nor the service is in control. They have no agency, and all the data is centralized in one list.
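The stateful model can be sketched in a few lines of toy Python (names are illustrative, not from any real library): the gatekeeper consults one central list, and neither the user nor the service carries any proof of its own.

```python
# A toy access-control list: the gatekeeper holds all the state.
acl = {
    "alice": {"read"},
    "bob": {"read", "write"},
}

def gatekeeper_allows(user: str, action: str) -> bool:
    # The user and the service have no say; the central list decides.
    return action in acl.get(user, set())

print(gatekeeper_allows("alice", "read"))   # True
print(gatekeeper_allows("alice", "write"))  # False
```

Every permission change means mutating this one central list, which is exactly the point of centralization the capability model avoids.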

Capabilities are designed for networking.

Properties:

Stateless Auth - There is a user who has an address (pointer) to the service they are trying to access and a cryptographic token that says that they can access the service.
Pro - The user and service are in control.
Pro - All the required info is in the token. We can make copies of the token, share those copies with others, and revoke access to copies if we choose. Access can also be set to expire after a certain timeframe has elapsed.

UCANs vs DIDs - DID you know UCAN have both?

How are UCAN tokens different from DIDs?

DIDs are building blocks inside of UCANs!

DIDs say who you are. They give you a public key, and you can prove you're allowed to sign (AuthN - authentication).
UCANs say what you can do (AuthZ - authorization).

The UCAN Token

The UCAN comprises three main components: The JWT Header, the Payload (which includes the issuer / a DID - and proofs / CIDs), and the Signature (which must match the DID in the issuer).

The proofs demonstrate what capabilities the user has access to. The capabilities include a resource (for example, a path to a document), read or write access, and other extensible fields, like an email address.
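As a rough sketch of that three-part layout (field names are illustrative, loosely following the shape described above rather than reproducing the UCAN spec exactly; the DIDs and CIDs are placeholders), the token is three base64url-encoded segments joined by dots, just like any JWT:

```python
import base64
import json

def b64url(data: dict) -> str:
    # Compact JSON, base64url-encoded with padding stripped (JWT style).
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

header = {"alg": "EdDSA", "typ": "JWT", "ucv": "0.9.0"}
payload = {
    "iss": "did:key:zIssuer",            # who delegates (a DID)
    "aud": "did:key:zAudience",          # who receives the capability
    "att": [{"with": "wnfs://docs/report", "can": "wnfs/OVERWRITE"}],
    "prf": ["bafyProofCid"],             # proofs: CIDs of parent UCANs
    "exp": 1735689600,                   # expiry timestamp
}
# Placeholder only: a real signature is made by the key behind the iss DID.
signature = "<issuer-signature-over-header-and-payload>"

token = ".".join([b64url(header), b64url(payload), signature])
print(token.count("."))  # 2 -- the familiar three-segment JWT layout
```

A verifier decodes the payload, checks the signature against the `iss` DID, and walks the `prf` CIDs to confirm the chain of delegated capabilities.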

The best part: we can delegate authority without sharing keys, making the process much more secure!
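Delegation can be pictured as a chain of tokens, each pointing at its parent by hash - a simplified stand-in for the CIDs and signatures a real UCAN would carry (all names here are hypothetical):

```python
import hashlib
import json

def make_ucan(iss: str, aud: str, can: str, proofs=()):
    # A delegated token references its parent via the parent's hash,
    # so authority flows onward without ever sharing a private key.
    tok = {"iss": iss, "aud": aud, "can": can, "prf": list(proofs)}
    digest = hashlib.sha256(json.dumps(tok, sort_keys=True).encode())
    tok["cid"] = digest.hexdigest()[:16]
    return tok

root = make_ucan("did:key:alice", "did:key:bob", "read")
child = make_ucan("did:key:bob", "did:key:carol", "read", proofs=[root["cid"]])

# Carol's token carries the whole chain of authority back to Alice.
print(child["prf"] == [root["cid"]])  # True
```

Revoking the root token (or letting it expire) invalidates every token downstream of it, since each link in the chain is checked at verification time.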

What UCAN do with UCANs

Authorized Data Retrieval - When you make your files shareable, UCAN ensures no unauthorized users can request them. In other words, unless you have a valid UCAN, you can't download a content address.

Apps - UCANs are used to delegate write access in WNFS, and WNFS, in turn, is used in many different apps like FxFotos and Fission Drive to manage files.

Authorized Channels - If you'd like to set up a group chat or gossip with specific peers, you will want a way for folks to prove that they are allowed to be in that channel so they can be admitted. Giving peers a shared capability provides them with that proof.

Decentralized Compute - We're using UCANs in IPVM to power distributed pipelines and collaborative processes.

IPFS Thing 2023 Presentation

Watch Brooklyn's full presentation below:

Brooklyn's Presentation Slides

If you'd like to learn more, UCAN visit the GitHub working group, check out the UCAN validator on the community site, or even play a fun adventure game to see UCANs in action!


IBM Blockchain

Keep it simple: How to succeed at business transformation using behavioral economics

Business leaders often think it’s impossible to predict the outcome of a transformation effort—whether employees will embrace a new process, for example, or how customers will react to a new service. They’re missing out on a secret of change management, says IBM Global Managing Partner Jesus Mantas: “You really can predict, for the most part, why people do what they do.” The answers, he says, come from ​​behavioral economics.

In his role overseeing Business Transformation Services for​​ IBM Consulting, Mantas guides organizations toward success as they redesign their businesses. Mantas has spent years combing through findings from behavioral economics and incorporating them into his consulting work. The principles of human behavior can seem simple and even obvious, he says, but time and again, companies ignore them, then wonder what went wrong. Here are a few essential—but often overlooked—guidelines for any leader aiming to influence people’s decisions and drive change.

​​Realize it’s less about the data—and more about the presentation

“In a business environment, we tend to think everybody makes rational decisions,” Mantas says. But emotion plays a much larger role than leaders think. Case in point: Take the same facts and present them differently, and you get a different reaction from customers. A pair of headphones selling for 50% off $60 feels more compelling than the same item selling for $30. Ground beef that’s labeled “85% lean” seems more appealing than an identical product labeled as “15% fat.”

As another example of the power of presentation, Mantas cites studies that show a powerful way to encourage behavior in people is to sign them up for something—like a ​​401(k) savings plan—and allow them to opt out. That brings much higher adoption rates than a program requiring people to opt in. According to research from fund manager Vanguard, people who are auto-enrolled in a 401(k) have a 93% participation rate, compared to a 66% rate when people have to opt in.

In both cases, people are given the same choice—to join a 401(k) or not—but the facts are presented differently, using an opposite ​​choice architecture, as behavioral economists call it. Research about the power of auto-enrollment is so persuasive, in fact, that a new U.S. federal spending package requires employers to automatically sign up their employees for 401(k) plans to improve their retirement security.

Mantas believes data and facts make up 20% of a decision, while presentation is the other 80%. Factors like color and design “have disproportionately more impact than baseline statements,” he says. Efforts around transformation should always keep that in mind, and businesses should spend much more time getting the presentation right.

​​​​Stop making things so hard for people

The 401(k) research bolsters another point Mantas mentions frequently: If you want to influence behavior and encourage adoption, create the simplest possible path. What could be easier than joining a 401(k) through auto-enrollment? In other words, make things easy for people.

“‘Why is nobody following our new process?’ OK, well, it has 42 steps.” - Jesus Mantas

As Mantas says, “People will do what’s easy more often than they will do what is correct, right or expected. It’s so simple, so obvious. Nobody has ever disagreed with me when I say that. And yet people barely ever apply it in practice. And then they ask, ‘Why is nobody following our new process?’ OK, well, it has 42 steps.”

When Mantas worked with a company looking to build a network of charging points for electric vehicles, the company’s team was focused on getting the technology to work well and rolling the stations out widely. “That’s great,” Mantas recalls asking them, “but why will someone adopt yours versus any other option that they have?” His own answer: “The charging experience has to be easier than any other one on the market. If you do that, you will have more adoption than anybody else.”

​​​Build strong and sticky habits

Mantas once spoke with a CEO who wondered how employees could adopt his company’s new principles as it underwent a transformation. It wasn’t about principles, Mantas told him, but habits.

The objective is to change what people do every day, which is very different from what they believe in or aspire to. Habits are what we do, who we are and how we think. If using new technology or processes doesn’t become a habit, the effort will ultimately fail.

The first step to developing a habit is, of course, making it easy and starting small; that’s an idea shared by BJ Fogg, a behavior scientist at Stanford University, in his book Tiny Habits. Leaders can establish habit-building cues, or reminders to do something.

IBM Consulting has its own list of habits, one of which is to build client trust. In practice, that means creating processes around transparency, like supplying data and metrics that measure success. “Building client trust is not a principle,” Mantas says. “That’s something you need to do in every interaction. That’s like brushing your teeth.”

The old adage is true: Humans are creatures of habit, and building routines will make your transformation stick. Like Mantas’s other recommendations, it’s a commonsense truth that’s backed up by research. The big picture, he says: ​​“When you study behavioral economics and science, you really find new avenues and tools to accelerate transformation—and unlock a significant amount of value.”

Transform your business with IBM Consulting

The post Keep it simple: How to succeed at business transformation using behavioral economics appeared first on IBM Blog.


FindBiometrics

Worldcoin, Mobile ID, Biometric Privacy, and More – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Worldcoin Raises $115M in Series C Worldcoin […]

SelfKey

The Secret Side of Online Identity: Benefits and Risks

Summary

In the present, the vast majority of individuals are familiar with modern technology, the internet, and the notion of online identity. 

Even though digitization may have not reached every corner of the world yet, most people are no strangers to these concepts. Especially among the younger generation, many have already established a digital presence and have been active online for years.

The integration of the digital realm into people's lives has unlocked immense opportunities for education, communication, and work. Technological advancements continue to push boundaries each year, with innovations like the metaverse and its captivating shared virtual spaces.

However, it’s important for individuals to grasp the concept of online identity, recognizing both its advantages and the potential risks it entails. Among these risks, identity theft stands out as one of the most impactful.

Considering the long-lasting and potentially devastating consequences of identity theft, it’s very important for people to educate themselves on the matter. Additionally, it’s crucial to raise awareness about this issue. Doing so can greatly empower individuals and help them protect their online identities from cyberattacks.

In this article, we will thoroughly discuss the concept of online identity and how it works, with the good and the bad. Additionally, we will discuss how SelfKey may enhance security and privacy for individuals operating in the digital realm.

Highlights

What is Online Identity?
How Online Identity Works
The Benefits of Having an Online Identity
Online Identity Risks and Challenges
How SelfKey May Improve your Online Identity Management
Conclusions

What is Online Identity?

A brief definition

The concept of online identity refers to a persona which people establish for themselves on the internet. Also called internet identity, online personality, or internet persona, it may be used in online communities and websites. It can also be a way of intentionally showing and representing oneself.

Although some people choose to use their real names online, some internet users prefer to be anonymous. They identify themselves by means of pseudonyms, which reveal varying amounts of personally identifiable information. 

Additionally, an online identity may even be determined by a user's relationship to a certain social group they are a part of online. 

Online Identity vs Digital Identity

Even though online identity and digital identity are often found in the same context, there is a notable distinction between them. The difference between online identity and digital identity lies in their scope and focus.

Online identity specifically pertains to the persona and representation of a person in the online realm. It encompasses the social identity an individual establishes in online communities, websites, and social media platforms.

It includes elements such as:

Usernames and profiles.
Personal information.
Digital footprints.

All in all, it is the perception others have of the individual based on their online presence.

Digital identity, on the other hand, is a broader concept that encompasses the information and attributes associated with any entity, not limited to individuals. It extends beyond just personal online presence and encompasses a wider range of entities and their attributes. 

This includes organizational identities, system identities, device identities, and more. It focuses on how entities are identified, authenticated, and represented in the digital domain.

In simple terms, online identity refers to how a person appears and acts online, while digital identity is a broader concept that includes representing any entity in the digital world, such as people, organizations, apps, and devices.

Digital Footprints

While navigating the internet, it's important to remember that our online actions can have lasting effects on specific websites. These effects are referred to as digital footprints.

Digital footprints are the marks or records of our online activities and interactions that remain after using digital devices and platforms. They can include different types of data, like personal information, online behaviors, communications, and contributions.

Some examples include, but are not limited to:

Social Media Posts
Web Browsing History
Online Purchases
Email Communication
Online Profiles and Accounts

Social Media Posts

Any content shared on social media platforms, including photos, videos, comments, and likes, contributes to a person's digital footprint. These posts can provide insights into their interests, opinions, and activities.

Web Browsing History

Every website visited and the search queries made leave a digital trace in the form of browsing history. It can include information about the types of websites visited, articles read, products searched for, and online services used.

Online Purchases

When making online purchases, individuals leave a digital footprint through transaction records. This can include payment information, shipping addresses, and product preferences. This data can be collected and analyzed to understand consumer behavior and tailor personalized advertisements.

Email Communication

Emails sent and received, including attachments, provide a digital trail of personal and professional conversations. The content of these emails can reveal information about relationships, affiliations, and communication patterns.

Online Profiles and Accounts

The creation and maintenance of online profiles on platforms such as LinkedIn, dating websites, or gaming communities contribute to a person's digital footprint. These profiles often contain personal information, employment history, educational background, and connections. And they form a digital representation of an individual's identity.

It's important to note that digital footprints can have implications for privacy, security, and reputation management. The data accumulated over time can be used for various purposes, including targeted advertising, data analysis, and even potential misuse. 

Because of this, it’s vital that individuals take extra caution when browsing the internet, especially when sharing private information. Even websites that appear trustworthy may be at risk of data breaches. 

How Online Identity Works

On a daily basis, individuals rely on their online identities to access various applications. Certain websites play a pivotal role in everyday work, communication, and education. However, there are also online domains that serve as sources of entertainment or offer delightful distractions.

Here are a few examples in which people make use of their online identities: 

Social Media
Online Gaming
Professional Networks
Online Communities and Forums
E-commerce and Online Shopping

Now, let's take a moment to briefly discuss them.

Social Media

On platforms like Facebook, Twitter, or Instagram, online identity is built through the creation of profiles. This may also include sharing personal information, posting content, and engaging with others. For example, a person's online identity on Instagram might include their username, profile picture, bio, and the photos or videos they share. 

Online Gaming

In the realm of online gaming, players often create unique usernames, avatars, and profiles that represent their gaming persona. Their online identity in this context is influenced by their gaming achievements, playstyle, interactions with other players, and reputation within the gaming community. 

Professional Networks

Platforms like LinkedIn focus on professional networking and career development. Online identity on LinkedIn is shaped by an individual's professional experience. This can include work experience, education, skills, recommendations, and professional achievements. It allows professionals to establish their expertise, connect with colleagues, and showcase their qualifications.

Online Communities and Forums

Through online communities and forums centered around specific interests or hobbies, individuals have the opportunity to participate in discussions, ask questions, and share knowledge. Online identity within these communities is built based on the content contributed, expertise demonstrated, and the respect gained from other community members.

E-commerce and Online Shopping

When engaging in online shopping, individuals create accounts and profiles on e-commerce platforms. Online identity in this context involves personal information, purchase history, reviews, and ratings given to products or sellers. It influences the trustworthiness and credibility of an individual as a buyer or seller. 

In each of these contexts, online identity is constructed through the information individuals share, their online activities, and the interactions they have with others. It plays a role in shaping how others perceive them and how they navigate various online spaces.

The Benefits of Having an Online Identity

Because the wide majority of the world has already adopted modern technology, having an online identity has become a necessity. However, there are a multitude of benefits which come with having an online identity, namely:

Expanded Reach and Influence
Professional Opportunities
Networking and Collaboration
Personal Branding
Access to Resources and Information

Expanded Reach and Influence

Having an online identity allows individuals to extend their reach beyond physical boundaries. It provides a platform to share ideas, expertise, and creativity with a global audience. And this may potentially lead to increased influence and impact.

Professional Opportunities

An online identity can open doors to various professional opportunities. It allows individuals to showcase their skills, accomplishments, and qualifications to potential employers, clients, or collaborators. It acts as a digital resume, making it easier for others to discover and connect with them.

Networking and Collaboration

An online identity facilitates networking and collaboration by connecting individuals with like-minded people, professionals in their field, or communities of interest. It provides a platform to engage in meaningful discussions, share knowledge, and collaborate on projects, fostering valuable connections and relationships.

Personal Branding

Online identity enables individuals to shape and manage their personal brand effectively. They can curate their online presence to reflect their values, expertise, and unique qualities. This branding can help individuals stand out, establish credibility, and create a positive perception among peers, potential clients, or collaborators.

Access to Resources and Information

Having an online identity grants individuals access to a vast array of resources, information, and opportunities available on the internet. It allows them to tap into online communities, educational platforms, industry-specific websites, and research databases. This empowers them to stay informed, learn, and grow in their personal and professional pursuits.

Online Identity Risks and Challenges

As with any innovation, having an online identity does not come without risks. It’s important for people to be aware of these negative aspects and to take precautions to avoid becoming victims. Even more, it’s equally important to spread awareness and educate as many people as possible about these matters.

If proper security measures were in place, the following situations could be prevented:

Identity Theft
Phishing and Social Engineering
Online Reputation Damage
Data Breaches and Privacy Breaches

Identity Theft

Online identity theft occurs when someone gains unauthorized access to personal information and uses it for fraudulent purposes. For example, a hacker might gain access to an individual's online banking account and use their identity to make unauthorized transactions or steal funds.

Phishing and Social Engineering

Phishing is a tactic used by cybercriminals to deceive individuals into sharing sensitive information such as passwords or credit card details by disguising themselves as a trustworthy entity. For example, a person may receive an email appearing to be from their bank, asking them to click on a link and provide login credentials. Hackers may use these credentials to gain unauthorized access to the victim's accounts.

Online Reputation Damage

In the digital age, a person's online reputation can have significant consequences. Negative or inappropriate content shared online, whether by oneself or others, can harm one's personal or professional reputation. For instance, a controversial tweet or a compromising photo shared online without consent can damage an individual's reputation and have long-lasting consequences.

Data Breaches and Privacy Breaches

Data breaches occur when sensitive information is accessed, stolen, or exposed due to security vulnerabilities in online systems. This can lead to the compromise of personal information, such as names, addresses, social security numbers, or financial data. An example is when a major retail company experiences a breach, resulting in the theft of customer credit card information from their database.

These risks and challenges highlight the importance of maintaining strong online security practices, being cautious with personal information sharing, and being vigilant about potential threats to online identity.

How SelfKey May Improve your Online Identity Management

As online identities have become an integral part of daily life, it is crucial to prioritize privacy and security. Insufficient security measures or a lack of awareness regarding online safety can have long-lasting consequences for unsuspecting individuals.

Given this, SelfKey aims to contribute to a safer digital future through the development of decentralized solutions. Decentralization has the potential to address numerous issues related to poor security in centralized systems. 

One decentralized solution that SelfKey proposes, and that may prevent data breaches in the future, is SelfKey iD.

SelfKey iD

Designed with privacy and security in mind, SelfKey iD may be the best solution for online identity management. Individuals who operate in the online world may be in charge of their private data, and this may prevent identity theft. 

Because SelfKey iD uses AI-Powered Proof of Individuality, unlawful access to a user’s private account may become considerably challenging. Artificial Intelligence is capable of detecting AI-generated images which bad players may use to try and compromise one’s online account. 

And, with recurring selfie checks, SelfKey iD aims to add an extra layer of security by continuously making sure that the user is who they claim to be. This way, security is enhanced and access to one’s private account is limited only to the user.

The goal is to enable individuals to operate in the online world confidently, so that people may enjoy the benefits of modern technology in a secure and safe manner. 

Conclusions

The advent of modern technology has presented us with abundant opportunities for personal growth and advancement. 

Utilizing online platforms and leveraging AI-powered tools offer numerous benefits in our day-to-day lives. Having an online identity has revolutionized the way we navigate challenges that were once formidable obstacles, such as remote work, distance learning, and long-distance communication.

To continue harnessing the potential of these remarkable opportunities, it is crucial that we take the necessary precautions to safeguard our online identities. The first step is to be well-informed and actively share knowledge and spread awareness.

SelfKey strives to compile as much relevant information regarding digital identities, online identities, as well as privacy and security news in its blog. The more informed you are, the better the chances to prevent online security risks.

By prioritizing security and privacy, SelfKey strives to empower individuals to embrace technology without sacrificing the safety of their digital identity.

Stay up to date with SelfKey on Discord, Telegram, and Subscribe to the official SelfKey Newsletter to receive new information!



1Kosmos BlockID

Bringing Verified Identity and Passwordless to the Masses

In this vlog, our COO, Huzefa Olia, is joined by Kevin Shanley, Principal at Amazon Web Services Identity to discuss bringing verified identity and passwordless to the masses.

Huzefa Olia:
Hello and welcome. Hi Kevin, how are you?

Kevin Shanley:
Hey, Huzefa. I’m doing great, thanks and yourself?

Huzefa Olia:
Wonderful. So you are new to this particular vlog of ours, so I would love for you to introduce yourself first to our audience.

Kevin Shanley:
Absolutely. My name is Kevin Shanley. I lead go-to-market for AWS consumer identity services. And while I’m relatively new at AWS, I’ve actually been in IAM for over 25 years.

Huzefa Olia:
Thanks for the intro. Now, in your world today you see so many different types of identities that are coming in. What are the different challenges that you see with respect to authentication?

Kevin Shanley:
Well, in three words: passwords, friction, and passwordless. If we double-click on passwords, they’re difficult to make both complex and memorable. And this makes them susceptible to being these weak, easy-to-remember passwords that are often reused at different sites. And then that next problem becomes friction. So as we try to add security to passwords, we add friction; we make it harder for those legitimate users to actually authenticate.

And then the solution to that is passwordless, and this is where everyone wants to go - we all want to go passwordless - but that really brings its own set of problems. You may not have hardware that supports passwordless. You may not be a tech-savvy person and know how to use passwordless. And so while passwords may not be very secure, at least everyone knows how passwords work; it’s just a known quantity. So the question becomes: how do we get to passwordless?

And that brings up a couple of other points. Huzefa, can you explain how bringing a user’s true identity into authentication both enhances security and mitigates risk?

Huzefa Olia:
Kevin, very interesting question. When you talk about passwordless, there’s so many different ways to get to passwordless in the market today, and most of them always take into the equation what a user’s device is.

But what is important is also to know who’s the user behind the device. You need to know if the user is authorized to access that particular device and the user is in fact the one that you provisioned or you created the account for. At the end of the day, identity and access management has to work together and identity has to be brought in into any kind of an access or a sign on scenario.

Kevin Shanley:
Thanks, Huzefa, that’s great. And so I guess what specific strategies or technologies can be employed to incorporate a user’s true identity into authentication?

Huzefa Olia:
Often when we look at any kind of access, we’re looking at: how do I register a user’s device and then give them access?

What we advocate is: when you bring on a user - when you onboard an employee, contractor, or customer of yours - do identity proofing or identity verification. There are various different standards, the most prominent one being NIST 800-63, so you have a guideline for how to proof and verify a user as well. But the result of that proofing and verification can be not just a verified user, but also the ability to use the user’s biometrics to authenticate them.

So today, when you look at any kind of passwordless access, you can essentially ask a user to prove who they are by taking their live biometrics - their live selfie - to access various different systems, mainly critical systems as well, where you no longer require an OTP, you no longer require a user just to use a push notification, but they actually validate themselves by looking at the camera, and we identify whether that’s the same user who registered for the service.

So hopefully that gave you some context, but now let me ask you this question, Kevin, right? When you see this flow of an identity based authentication, right? How do you think that would benefit? Which industries would benefit from this kind of an authentication?

Kevin Shanley:
So, at AWS especially, it’s really anything you care about securing - any industry that cares about securing its site: financial services, healthcare, commercial sites. I mean, even social networking, because you’re concerned about data mining.

At this point, bad bot traffic is now the majority of traffic on the internet. And what are these bots doing? They're creating fake accounts and attempting to break into existing user accounts. Why can they even create these fake accounts? Because, for the most part, nothing is actually verifying that there's a real person on the end of the line and that it's a real user identity.

And as we pivot over to identity verification, identity proofing, and stronger, passwordless authentication, that's where we start to really close the loop on that. It means you're going to have real accounts in your databases, you're going to see real traffic and real metrics not obscured by a bunch of bad bot traffic, and you're going to end up saving money because you're not going to have to pay for the bot traffic taking up your monthly user accounts.

And so I guess from your perspective Huzefa, what steps can an organization take to successfully roll out an identity based authentication for the masses?

Huzefa Olia:
The simple answer would be to include identity proofing and verification in your onboarding flow. That flow can vary depending on the persona of the user: for an employee or contractor, when you're creating them as a new user; for customers, when you're registering them, for example during KYC in financial services, or in retail when a sensitive transaction needs to be done. Know who the user is and do the proofing and verification that is required, which will have all the added benefits of reducing fraud, like you mentioned, Kevin. And then don't just do this one time: use the identity template that has been accumulated for that specific user in their authentication as well.

So our companies have worked together to bring the 1Kosmos and AWS solutions together. How do you think this collaboration between 1Kosmos and AWS will contribute to the market?

Kevin Shanley:
Oh, great question. Coming from the AWS side, as I see it, AWS is the leading cloud provider, and AWS identity and access management services were recently given a top customer choice award from Gartner. In fact, AWS IAM handles hundreds of millions of API calls per second, which is just a crazy number. So we go big, we do identity well, but we can't be everywhere and we can't do everything.

And as an agile startup, 1Kosmos has, I think, both the technical depth and the focus to really innovate in identity-based authentication. Together, I think we bring that targeted innovation together with the stability, scale, availability, and cost-efficient computing to drive identity to the masses.

Huzefa Olia:
Thank you. Thank you for that, Kevin. This is very, very insightful, and I'm sure our audience wants to learn a little bit more about this. So you are going to Identiverse; you will be speaking there. Can you tell our audience where you will be?

Kevin Shanley:
Yeah, absolutely. At Identiverse, I'll be speaking with Mike Engle in Las Vegas. The title of our session is Bringing Verified Identity and Passwordless to the Masses, and it's running at 4:30 PM on Thursday, June the 1st.

Huzefa Olia:
Awesome. Looking forward to it. I'll be in the audience as well.

Kevin Shanley:
Great. Looking forward to seeing you there.

The post Bringing Verified Identity and Passwordless to the Masses appeared first on 1Kosmos.


FindBiometrics

Learn How Biometrics Are Fighting AI-Enhanced Fraud with Onfido’s Therese Stowell

On May 17, 2023, FindBiometrics hosted the full-day online event, “Biometrics and Mobile ID on the Innovation Highway,” featuring a fireside chat with renowned AI and identity technology expert Therese […]

Ontology

Meet the Team: Ontology Harbinger, Donny Clutterbuck

What’s your name and where are you from?

Donny Clutterbuck, and I am from upstate New York in the USA.

Tell us a bit about yourself. What did you study? What are your hobbies? What is your superpower?

At university, I studied Philosophy and Astronomy, but ended up working mostly in the restaurant and bar industry for the following two decades. I like to swim, target shoot, box, watch movies, go for walks, and read.

What is your role and when did you join Ontology?

I've been an Ontology enthusiast since 2020, and I joined the Harbinger team in the second half of 2022.

What kind of work do you do on a day-to-day basis?

I'm the beverage director for a restaurant group in New York, and I have worked behind the bar for about 20 years.

In your opinion, what makes Ontology stand out from other blockchains?

From what I’ve been reading, Ontology is tackling just about every problem in the space, while most companies focus on one piece of the larger solutions. It seems to me that Ontology has the benefit of being an all-in-one solution.

What is the most exciting part of the project you’re working on for Ontology?

I love providing insight regarding use-case and development of the space in general, and trying to provide easy-to-understand answers to really complex questions.

What has been the most important moment in your career so far?

Taking some time out from behind the bar to chase down more work with regards to numbers and processes. My quality of life has improved incredibly.

What are you most excited for in your future at Ontology?

I’m excited to see a lot of the technological planning and progress in 2022 come to fruition in 2023 and 2024.

Where do you see Ontology and Web3 going in the next five years?

Ontology may be a DID solution/hub for both individuals and businesses. It’s exciting to think that what’s been built here can potentially facilitate this.

Contact us

- Ontology official website: https://ont.io/

- Email contact: contact@ont.io

- GitHub: https://github.com/ontio/

- Telegram group: https://t.me/OntologyNetwork

Meet the Team: Ontology Harbinger, Donny Clutterbuck was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


UNISOT

UNISOT’s Solutions at the Forefront


PHARMACEUTICAL SKIN CARE, AN EVER EVOLVING INDUSTRY

The pharmaceutical skin care industry is constantly evolving, driven by a growing demand for effective and personalized products. As consumers become more aware of the importance of skincare and the impact it has on their overall well-being, there is an increasing need for innovative solutions that cater to individual needs. In this digital age, UNISOT, a leading technology company, is transforming the skin care industry through its cutting-edge solutions, offering unparalleled benefits to both manufacturers and consumers.

STREAMLINING SUPPLY CHAINS

One of the key challenges in the skin care industry is managing complex supply chains effectively. UNISOT’S public blockchain-based supply chain platform provides a transparent and secure network that enables seamless collaboration and traceability from ingredient sourcing to product distribution. By leveraging a Digital Product Passport, manufacturers can track the origin, quality and authenticity of each ingredient, ensuring transparency and reducing the risk of counterfeit products. This not only enhances consumer trust but also enables companies to maintain high-quality standards.

ENHANCING PRODUCT SAFETY AND EFFICACY

Product safety and efficacy are of utmost importance in the skin care industry. By integrating public blockchain technology into the testing and validation processes, manufacturers can securely record and track product data, including test results and certifications. This not only streamlines compliance procedures but also helps identify any potential issues or recalls promptly, safeguarding consumer health and strengthening brand reputation.

REAL-TIME FEEDBACK AND CUSTOMER ENGAGEMENT

UNISOT’s solutions empower manufacturers to engage with their customers in real-time, fostering a stronger bond between brands and consumers. Through innovative technologies like smart packaging and connected devices, manufacturers can gather valuable data on product usage, customer feedback, and preferences. This data can be utilized to improve existing products, develop new formulations, and adapt marketing strategies to meet evolving customer needs. By actively involving consumers in the product development process, manufacturers can build a loyal customer base and drive brand advocacy.

SUSTAINABILITY AND ECO-FRIENDLY PRACTICES

The skin care industry is increasingly focused on sustainability and eco-friendly practices. Our solutions support companies in adopting greener initiatives throughout the supply chain. By providing end-to-end traceability, manufacturers can ensure responsible sourcing of ingredients and monitor the environmental impact of their operations. Additionally, our solutions enable streamlined inventory management, reducing waste and minimizing the carbon footprint associated with excess production and transportation.

Contact US

The post UNISOT’s Solutions at the Forefront appeared first on UNISOT.


KuppingerCole

Unified Endpoint Management (UEM)


by Richard Hill

This report provides an updated overview of the Unified Endpoint Management (UEM) market and provides a compass to help you find the solution that best meets your needs. We examine the market segment, vendor service functionality, relative market share, and innovative approaches to providing UEM solutions.

PingTalk

Ping Achieves FAPI 2.0 Certification | Ping Identity


I was thrilled to get the news from one of our teammates at Ping that we had achieved FAPI 2.0 certification and were one of the first companies to do so. 

 

I’ve been involved in identity standards for a long time—from the old SAML interop testing days to OAuth to OpenID Connect and now FAPI. It is great to see open standards continuing to grow and thrive.


Ocean Protocol

DF38 Completes and DF39 Launches

Stakers can claim DF38 rewards. DF39 runs May 25 — Jun 1, 2023.

1. Overview

Data Farming Round 39 is here (DF39).

DF39 is the 11th week of DF Main, the final phase of DF. This week, users can earn rewards up to 150K OCEAN. In DF Main, weekly rewards will grow to 1M+ OCEAN.

The article “Ocean Data Farming Main is Here” has the full details of DF Main. In fact, it’s a self-contained description of Ocean Data Farming (DF), including all the details that matter. It is up-to-date with the latest reward function, weekly OCEAN allocation, and estimates of APYs given the current amount of OCEAN staked.

DF is like DeFi liquidity mining or yield farming, but is tuned to drive data consume volume (DCV) in the Ocean ecosystem. It rewards stakers with OCEAN who allocate voting power to curate data assets with high DCV.

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF dapp.

DF38 counting started 12:01am May 18, 2023 and ended 12:01am May 25. You can claim DF38 rewards at the DF dapp Claim Portal.

DF39 is live and will conclude on Jun 01, 2023.

DF Round 39 (DF39) is the 11th week of DF Main. Details of DF Main can be found here.

The rest of this post describes how to claim rewards (section 2) and gives a DF39 overview (section 3).

2. How To Claim Rewards

As a participant, follow these steps to claim rewards:

1. Go to the DF dapp Claim Portal
2. Connect your wallet (passive and active rewards are distributed on Ethereum mainnet)
3. Click “Claim”, sign the tx, and collect your rewards

Rewards accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.

3. DF39 Overview

DF39 is part of DF Main, phase 1. This phase emits 150K OCEAN / week and runs for 52 weeks total. (A detailed DF Main schedule is here.)

Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

Some key parameters:

- Total budget is 150,000 OCEAN.
- 50% of the budget goes to passive rewards (75,000 OCEAN) — rewarding users who hold veOCEAN (locked OCEAN).
- 50% of the budget goes to active rewards (75,000 OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).

Active rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: the highest-DCV asset gets the most OCEAN, etc.
Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.
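As a rough illustration of the bounding logic above (not the official reward function; the function name, parameter names, and figures are all assumptions, and the real computation runs across all assets and holders at once), the per-holder reward for a single asset might be sketched as:

```python
def active_reward(stake: float, total_effective_stake: float,
                  ocean_for_asset: float, asset_dcv: float,
                  is_publisher: bool) -> float:
    """Illustrative sketch of the active-reward bounds described above.

    stake: this holder's veOCEAN allocated to the asset
    total_effective_stake: sum of all holders' effective stakes on the asset
    ocean_for_asset: OCEAN allotted to this asset by its DCV rank
    asset_dcv: the asset's data consume volume, in OCEAN terms
    """
    # If the holder is a publisher, 2x the effective stake
    effective_stake = stake * (2.0 if is_publisher else 1.0)
    # Baseline rewards = (% stake in asset) * (OCEAN for asset)
    baseline = (effective_stake / total_effective_stake) * ocean_for_asset
    # Bound rewards by 125% APY, prorated to one weekly round
    apy_bound = stake * 1.25 / 52.0
    # Bound rewards by 0.1% of the asset's DCV (prevents wash consume)
    dcv_bound = asset_dcv * 0.001
    return min(baseline, apy_bound, dcv_bound)
```

For example, with 1,000 OCEAN staked out of 10,000 effective on an asset allotted 500 OCEAN, the 50-OCEAN baseline would be clipped by the weekly APY bound (about 24 OCEAN); on a low-DCV asset, the 0.1% DCV bound dominates instead.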

For further details, see the “DF Reward Function Details” in DF Main Appendix.

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behavior. Updates are always announced at the beginning of a round, if not sooner.

Conclusion

DF38 has completed. To claim rewards, go to DF dapp Claim Portal.

DF39 begins May 25, 2023 at 12:01am UTC. It ends Jun 01, 2023 at 12:01am UTC.

DF39 is part of DF Main. For this phase of DF Main, the rewards budget is 150K OCEAN / week.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF38 Completes and DF39 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 10. May 2023

Radiant Logic

Using Identity Analytics to Fight Ransomware and other Cyber Threats

Read along with our blog series by Sebastien Faivre, CTO of Brainwave GRC, to learn how major ransomware and cybersecurity concerns can be prevented with Identity Analytics. The post Using Identity Analytics to Fight Ransomware and other Cyber Threats appeared first on Radiant Logic.

FindBiometrics

The Seamless Future of Travel Starts with Passwordless Booking

The following article is a guest submission from OwnID. The travel industry is starting to revive from the massive disruptions of the COVID-19 pandemic, but it has been a bumpy ride, […]

Indicio

Digitalizing Vital Records: Do’s and Don’ts

Vital records — such as your birth, death, marriage, and divorce certificates — are going digital. The only question is how. In this article, we explain why verifiable credential technology offers the best solution.

By Tim Spring

As we look to modernize legacy systems and create digital equivalents of our most important and personal documents, it is important to get this transformation right. As a leading developer of verifiable data solutions using open-source decentralized identity technology, Indicio would like to offer the following recommendations for governments looking into turning vital paper records into vital digital records. 

Do

Build on open-source technology

To put it simply, don’t tie yourself to proprietary technology when technology is evolving at a rapid pace. Plus, there’s the risk of not being able to interoperate with other identity and data ecosystems unless you go down the path of costly and complex direct integrations. With open source you own your solution and can develop on it as you see fit. You also benefit from the innovation generated by a vibrant open-source community. Community-developed code tends to be more resilient code. 

Make use of decentralization technology

Decentralized identity technology allows people to hold their own data and share it in a way that the authenticity and integrity of the data can be verified without having to check in with or create a direct integration with the issuer of the information. This means that a vital record can be immediately trusted and easily shared across different systems. Sharing records through a peer-to-peer encrypted channel further enhances privacy and security. 

Provide verification methods for partners and stakeholders

When it comes to vital records, digital copies will only be as useful as their adoption and verification allow. For a solution to succeed, businesses and organizations must be able to easily recognize and verify a vital digital record as an acceptable document.

More partners and stakeholders also bring their own points of view, priorities, and use cases; they may have ideas for adoption and use cases that you haven’t thought of yet!

Start somewhere

You need to make something. Excitement over an idea can spread easily, but progress can't happen until you have a prototype or proof of concept. Even if it's not perfect, it drives feedback from stakeholders, customers, or partners.

Don’t

Get locked into a proprietary solution

Even if it's a solution from a big name, there's always the risk of loss of support or costly upgrades. Open-source code provides longevity—it's something you can fully own, develop on as you see fit, or draw on new features being added to the codebase by other users. This is why governments are increasingly requiring open-source solutions in RFPs.

Wait for the perfect solution

Several states have already implemented virtual driver’s licenses. The technology is mature and has already been deployed by many governments. As the technology evolves incremental changes are possible to keep your solution up to date.

Build a siloed solution

Interoperability is important to creating a vital record that is widely recognized. Building and testing your solutions against other compatible solutions helps ensure your records will be easily verifiable.

A key takeaway here is that this technology is not difficult to integrate into current systems. All this data is already available in government databases. The addition of verifiable credentials allows people to better manage their own documents and have an extra layer of security and verification.

The effort to move vital records from paper file cabinets to digital wallets or files is both nuanced and exciting to the team here at Indicio. If you found these tips helpful or would like to pick our team’s brain about a specific project you are working on please feel free to reach out to us here.

To learn more about Vital records you can hear about some of the efforts to implement systems at the upcoming Cardea call.

The post Digitalizing Vital Records: Do’s and Don’ts appeared first on Indicio.


Anonym

7 Insights Into Progress on the Path to Decentralized Identity: Dr Paul Ashley


Anonyome Labs’ CTO and Co-CEO Dr Paul Ashley recently gave Australian students studying advanced topics in security 7 insights into how decentralized identity (DI) is rapidly solving the global data privacy crisis

He gave the lecture at The University of Queensland, a leading provider of education in cyber security, cryptography and blockchain technology.

Dr Ashley’s lecture explored how DI is the new approach to identity management on the internet and the biggest privacy breakthrough of the next decade. 

Dr Ashley’s 7 insights were:

1. We need to improve privacy in the surveillance economy. Data breaches have eroded consumer trust in organizations that collect and store their personal data, and users urgently need to regain control.
2. Decentralized identity enables selective disclosure around digital proof of identity, and we’re already seeing rapid uptake of this in government uses such as licenses and everyday applications such as gym memberships.
3. Verifiable credentials (VCs) have quickly become the “killer feature” of DI because they’re making selective disclosure of PII possible and will drive DI’s ubiquity across the internet in the next decade.
4. The rise and rise of VCs has led to the widespread need for the identity wallet, a massive growth area.
5. Blockchain is the anchor of trust in decentralized systems, and therefore DI is one of the most important use cases for blockchain technology.
6. DI standards are rapidly evolving to drive interoperability of DI solutions across the internet.
7. Block’s tbdex decentralized exchange protocol is a useful case study for organizations interested in this space.

Learn more in our whitepaper – Innovating Identity and Access Management with Decentralized Identity.

See how your organization could use our Sudo Platform APIs to power next-generation customer apps with privacy, cyber safety and decentralized identity capabilities. 

Our blog covers DI from many angles. You might like:

- Can Decentralized Identity Give You Greater Control of Your Online Identity?
- Simple Definitions for Complex Terms in Decentralized Identity
- How You Can Use Sudo Platform Digital Identities and Decentralized Identity Capabilities to Rapidly Deliver Customer Privacy Solutions
- What our Chief Architect said about Decentralized Identity to Delay Happy Hour
- 7 Benefits to Enterprises from Proactively Adopting Decentralized Identity
- Our whitepapers

The post 7 Insights Into Progress on the Path to Decentralized Identity: Dr Paul Ashley appeared first on Anonyome Labs.


1Kosmos BlockID

What Is Magic Link Authentication? Benefits & Challenges


Many organizations are turning to passwordless authentication solutions to secure their systems and remove vulnerabilities from identity management.

Discover magic link authentication—a secure, passwordless login method that simplifies user access and enhances security through unique URLs.

What Is Magic Link Authentication?

Magic link authentication is a passwordless authentication method that allows users to log in to a website or application using a unique, one-time-use URL sent to their registered email address.

When users want to sign in, they enter their email address, and the system generates a “magic link,” which is then emailed to them. The user clicks on the link and is automatically logged into the website or application.

Magic links, like one-time passwords, are often time-sensitive, expiring after a certain period or once they have been clicked. This helps enhance security and reduces the chances of unauthorized access. Using magic links simplifies the login process but relies on the user’s email account, which could be a potential point of vulnerability in case of phishing attacks or email service issues.

How Does Magic Link Authentication Work?

Magic link authentication uses a “magic link,” a one-time-use URL sent to a user’s registered email address. This link facilitates single-use, time-limited authentication that doesn’t rely on passwords.

Here’s a step-by-step breakdown of how it works:

1. User Access: Users who want to log in to a website or application enter their email address and request a magic link.
2. Link Generation: The system checks if the provided email address is associated with a registered user. If so, the system generates a unique and time-sensitive link that contains a token specific to that user and session.
3. Delivery: The system sends an email containing the magic link to the user’s registered email address, assuming only the user has access.
4. Token Validation: When the user clicks the link, the website or application receives the token via the user’s web browser and validates it against several criteria, including user credentials, time (has it expired?) and use (has it already been validated?).
5. Authentication: If the token is valid and corresponds to the correct user, the system logs the user in, granting them access to their account without requiring a traditional username and password.
6. Token Expiration: Once the magic link is used for authentication, or once a predetermined time passes, the token is invalidated. Any future attempts to use the link will be rejected.

Some magic link authentication systems will use SMS alongside or in place of email. However, these are rare in widely-adopted consumer platforms (although they might be more realistically implemented in enterprise systems).

Are There Drawbacks to Using Magic Link Authentication?

This process seems relatively straightforward and, ideally, secure. It removes the necessity of using passwords, which, as we’ve discussed in previous articles, can significantly reduce an organization’s attack surface.

While magic link authentication offers several benefits, there are also some potential drawbacks to consider:

- Dependency on Email: Magic link authentication relies on the user’s email account, which can be an issue if the email service is down or the user loses access to their email account. It also requires users to access their email whenever they want to log in, which might not always be convenient. SMS can provide an alternative to circumvent this issue.
- Vulnerability to Phishing Attacks: If users become more comfortable clicking links in emails, they may mistake phishing attacks for authentication emails. It is therefore critical that organizations offer training and education around the technology.
- Context-Specific Security: Magic link authentication might not be ideal in certain situations, such as when a high level of security is required or when compliance and regulations call for a specific approach to authentication. Magic links also do not support identity verification on their own, which can open up issues when such proof is required.
- Delays: Users might experience a delay in the authentication process since they have to wait for the magic link to be delivered to their email inbox. The email containing the magic link may also be misidentified as spam and end up in the spam folder, complicating the login process.
- Token Management: The server-side implementation of magic link authentication requires proper management of tokens, including generating, storing, and verifying them securely. This adds complexity to the system and could introduce vulnerabilities if not done correctly.

While magic link authentication has advantages, these drawbacks should be considered when deciding whether it’s the best choice for a particular website or application.

How Does Magic Link Authentication Support Passwordless Login?

Magic links support passwordless authentication by providing a secure and convenient alternative to traditional username and password combinations. Instead of relying on passwords, users authenticate themselves using unique, one-time-use URLs sent to their registered email addresses. Here’s how magic links enable passwordless authentication:

- Simplicity: Magic links eliminate the need for users to remember, manage, and enter passwords, making the authentication process more straightforward and user-friendly.
- Security: Since magic links are time-sensitive and generated on-demand, they offer a secure authentication method. They expire after being used or after a predetermined period, reducing the risk of unauthorized access. By not requiring users to enter passwords, magic links also help mitigate the risks associated with weak passwords, credential leaks, or password reuse.
- Improved User Experience: Magic link authentication streamlines the login process, making it faster and more efficient. Additionally, if the user has access to their email via a mobile device (or they use SMS-based authentication), they can essentially authenticate themselves wherever they are.
- Reduced Reliance on Passwords: Magic links reduce the dependence on passwords, which can be problematic due to forgotten or reused passwords and the need for regular password updates. Passwordless authentication through magic links can also reduce the burden on users and IT support.

With magic links, passwordless authentication offers a more convenient and potentially more secure method for users to access their accounts without relying on traditional passwords. However, it is essential to remember that this method depends on the user’s email account, which could be a potential point of vulnerability in case of phishing attacks or email service issues.

Make Your Users’ Lives Easier with 1Kosmos Magic Link Authentication

1Kosmos provides several critical features to organizations that want high-level security and identity management for their employees, passwordless authentication being one of these features. The 1Kosmos BlockID system allows you to use magic links alongside strong identity management, compliant identity verification, and other authentication functions.

With 1Kosmos BlockID, you get:

- Identity-Based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.
- Cloud-Native Architecture: Flexible and scalable cloud architecture makes it simple to build applications using our standard API and SDK.
- Identity Proofing: BlockID verifies identity anywhere, anytime and on any device with over 99% accuracy.
- Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.
- Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain and encrypts digital identities, which are only accessible by the user. The distributed properties ensure there are no databases to breach or honeypots for hackers to target.
- Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.
- SIM Binding: The BlockID application uses SMS verification, identity proofing, and SIM card authentication to create solid, robust, and secure device authentication from any employee’s phone.

Sign up for our newsletter to learn more about how BlockID can support real security and help mitigate phishing attacks. Also, make sure to read our whitepaper on how to Go Beyond Passwordless Solutions.

The post What Is Magic Link Authentication? Benefits & Challenges appeared first on 1Kosmos.


Dark Matter Labs

Laudes x Dark Matter Labs: A just transition of Europe’s built environment (Part 4)

#BendNotBreak

This is the final blog in a 4-part series where we have been exploring the emerging work that we are undertaking in partnership with Laudes Foundation. The overall goal of the collaboration is to look with fresh eyes at the challenges and opportunities presented by an equitable decarbonisation of Europe’s land and buildings. We are heartened by the response that this blog series has generated so far and we will incorporate the insights into our future thinking and work. If you have any thoughts or reflections on this final blog (or on blogs one, two or three) then please share them to progress this critical just transition conversation.

Part 4 (of 4): Network building: strategies to support and amplify the network of innovative organisations operating within Europe’s built environment

Boldly inviting a more regenerative and inclusive future to be discovered
“It is exceptionally easy to become adjusted to catastrophe” — Adrienne Buller

In Blog 3 of this series, we highlighted some inspiring examples of actors innovating in the social and ecological spaces of Europe’s transition. These sparks of fierce hope must be celebrated and bolstered, but we must also challenge ourselves to recognise and lean into the strategic gaps between them. If we want to move from lifeboats and beautiful exceptions to enduring social and spatial justice, we cannot allow ourselves to become complacent.

Mapping the density of alternative housing and land models in Europe

The gaps and inadequacies in the more intangible aspects of our political economy have been surfacing with increasing frequency across many aspects of our work. Despite this initiative being focused on the specific context of Europe’s built environment, some institutional and social intelligence gaps are pervasive across all sectors. In our recent conversations we have started to think about these insufficiencies in terms of three strategic themes, summarised below:

(1) A new cultural logic

In our view the current cultural logic for how we share and use assets and spaces in our cities is outdated and unjust. In the future, how can we think about creating a new Use Justice Economy, to unlock fundamentally different ways of coexisting in Europe’s built environment? The melting pot of our physical constraints and social thermodynamics is forging a new political landscape that we are ill equipped to navigate. We must therefore begin constructing the inclusive, everyday politics for a post-abundant yet just society. This will require designing and testing new generative public engagement processes, together with facilitating the R&D that will be needed to support and evolve them.

(2) An interconnected understanding of civic value

The dominant economic theory of value is stifling appropriate adaptation and mitigation responses. Outdated accounting frameworks are no longer capable of capturing and presenting the intangible value of buildings and land (for example, the reduced levels of depression that are associated with co-housing models). They also ignore many of the associated risks and liabilities (such as the negative shared impacts on air quality of privately owned buildings continuing to be heated using fossil fuels). In reality, there are entire classes of assets that are missing from our investment and scenario plans; why for example are intangible public assets like preventative health still considered as a cost rather than an investment in governmental budgets? It is becoming apparent that the accounting, legal and investment frameworks of a just future are missing, whilst the existing versions will become increasingly obsolete as the constraint pressures escalate.

(3) Next generation institutions

As we have been writing these blogs, a constant nagging question has been the “yes, but…”: how will many of the pathways we are suggesting be given the legitimacy and teeth to be impactful at the city, regional or country level? There is a clear need for an entire suite of next generation institutions: agile, multidisciplinary governance and decision-making vehicles, capable of building strategies and responses for the years of transition and beyond. Examples might include city land trusts (to collectively own and regulate real estate prices), public-city banks (to receive and allocate funding on behalf of residents) or carbon and material city guardians (to measure and regulate the use of limited resources).

People who live in glass houses shouldn’t throw stones

It is very easy to sit behind a screen, talking and writing about the opportunities and challenges of a just transition. We might leave our comfort zone at times, sharing our models for feedback and inviting others to critique and improve our thinking, but the consequences of our actions are limited. Yet what would our response be if we were actually held to account? If Dark Matter Labs was somehow granted carte blanche over EU decision making for a day, what policies or interventions would we introduce? Let’s suppose we had a large endowment; how would we allocate those funds and why? Or, if we were asked to play an orchestration role in coordinating the transition landscape, how would we proceed?

In reality, any strategy is restricted by the quality of the inputs that shape it. If we had the required resources and authority, we would therefore seek to rapidly increase the number and diversity of people who are able to interact with the problem space. Many people have an intuition of what could and probably should be happening, but having the time or resources to engage deeply with such ideas is a privilege reserved for the minority. We are well aware that writing blogs such as this one is a well-intentioned yet wholly inadequate contribution towards democratising the response dialogues. In our view, socialising the constraints and implied response strategies, together with actively engaging citizens from across Europe in their design, is an absolute priority. If we had the agency we would co-create a playbook for the future, a kind of “break glass in case of emergency” set of policies, designed by the citizens of Europe for our shared futures.

Photo by iSawRed on Unsplash

The evolving funding landscape: opening a space for collaborative dialogue

From a funding standpoint, our instinct is that any collectively intelligent strategic response could (and in fact should) include varied and co-existing approaches. If we start from a constraints frame and fan out towards an ecosystem of desired outcomes, there is the potential to build coherence across the system without limiting the scope of imaginative possibility. It seems likely that there are areas of interest, geographical points of focus, or thematic investments that can be interwoven to support the overall transition. Or, perhaps by collating narratives, ideas, case studies and conversations we could encourage new connections to form that are beyond our current perception. It feels important that we move beyond conversations about distributed power, to open space for aspects such as the equity and inclusivity of who is included (and / or centred). In our experience, there is a delicate balance between the desire to be regenerative by design and the constrictive arrogance that can be contained within that aspiration. This is an interesting point of tension that we aim to respect without surrendering our agency or accountability.

Encouragingly, we have recently observed an increasing appetite amongst funders to discuss and coordinate their strategies. This is an exciting and emerging area of our wider work and is explored in depth in our BeyondTheRules publications (for example in this grantmaking provocation). At this point however, it is important to acknowledge that all the funders we interact with, already have deeply thoughtful and considered strategies in place. The following four groupings are therefore solely offered as prompts for discussion and reflection:

(1) Horizontal knowledge networks

How can we think about building the deep expertise needed to reimagine core components of the built environment? An example would be Community Land Trusts which have the potential to fundamentally change the unjust economic logic of land speculation. Encouragingly, the European CLT Network will launch in June 2023 (with support from Laudes Foundation and World Habitat) with the aim of scaling and deepening the impact of the CLT model across Europe. Despite this potential many challenges remain; for example, what kind of legislative changes will be needed across different geographies to support their wider uptake? How can we ensure they are not co-opted or exploited in the future? Will we need new accounting methodologies to represent their holistic value?

In 2022, the Young Foundation partnered with the Community Land Trust Network to bring together a group of researchers, to highlight barriers and opportunities in relation to diversity and inclusion in the CLT movement¹. A similar idea would be to analyse international best practice to identify existing and potential future pathway gaps. For example, a strong collaboration might be between the Grounded Solutions Network in the US, Community Land Scotland and a university research team.

(2) Moving from new economic demonstrators to full system demonstrators

System demonstrators intersect at multiple levels of the system; from mapping available land to designing new forms of finance or advocacy, thus shifting the boundaries of political possibility. If we think about eliminating vacant space in cities, what kind of actors would be needed to create viable operational models? How could they be replicated (or adapted) for varied geographies and contexts? How can we ensure robust knowledge sharing is built into their blueprint?

We conceive a system demonstrator to represent four intersecting elements:

The physical, tangible demonstration of a new way of organising. For example, innovative multipurpose space use in a central district;

The system innovations that service it, such as holistic (multi-value) business plans and care agreements (as opposed to transactional contracts);

The market-making infrastructure that will enable scaling (e.g. public-commons institutions or inclusive planning laws);

The cultural conversations that are necessary to expand the Overton window and shift market dynamics.

The coordination of these different threads is nuanced and complex. The space for creative divergence needs to be balanced with supportive, committed orchestration to drive the overall mission forwards. This is a critical area of collective capability which we believe will only be realised via deeply rooted place-based experimentation, combined with the development of technical tools and institutional reforms. Laudes currently supports some specific demonstrator projects, for example the EU Cinco collaboration which is showcasing projects for bio based and circular construction in Madrid and Milan. In the future, could we think about connecting demonstrators relating to different scales and contexts, to systemically amplify alternative visions and cultural attractors?

(3) Coordination / orchestration hubs

There is a vital requirement for actors who can play system facilitation roles in the built environment. Developing system capabilities is deep, complex work (with infinite unfolding directions and feedback loops) and thus requires reimagining our networks of governance. How can we begin designing and testing new types of highly agile institutions and equitable decision-making processes for this sector? For example, could cities with strong administrative innovation capacity (e.g. Amsterdam’s Innovation Unit or Chemnitz in Germany) be facilitated to support hubs in emerging locations (e.g. EKUL in Estonia or Restart Ukraine)? Similar existing examples are the EU URBACT programme, which supports cities to coordinate and share knowledge, and the ALT/BAU Transfer Network, which connects cities to activate unused and decaying housing stock.

(4) Distributed communities of practice and geographical hubs

Is there an opportunity to focus on core organisations who are centred on the inclusivity of ownership (e.g., CLTs, cooperative housing, other shared equity models) and support them to make the pathways for their operations (e.g., materials, governance, legals) and scaling (e.g., publicity, finance, policy) both more inclusive and regenerative? An example could be to look at the inclusivity of materials used in retrofitting and whether nature-based solutions could (a) improve outcomes for low-income groups in hot countries by reducing the costs of cooling and (b) make buildings more sustainable through carbon capture. Or, a cooperative housing network such as MOBA could potentially collaborate with an organisation in the same region (such as Brda in Poland) to improve the equitable and regenerative elements of their material procurement processes. In some cases this network could be enabled across contexts as a type of distributed community of practice. In others, a geographical capability hub might be more effective. In both cases the goal would be to strengthen or create the support pathways needed for inclusive and regenerative scaling of the anchor model.

Alternatively, organisations who are creating these pathways could be supported individually, on the assumption that they will be needed for the future development and scaling of any collaborative housing models (including those that do not exist yet). These foundational pathway organisations could be conceptually clustered by theme; for example to address specific constraints (e.g. dematerialisation), or by leverage points in the system such as inclusive financing models.

Tracing potential networks of coherent collaboration

The end comes before the beginning and the beginning comes after the end

At times during this project, looking for organisations who are working at the intersection of economic, ownership and governance issues, whilst concurrently striving for social inclusiveness and a regenerative future for Europe’s built environment, has felt like looking for a needle in a haystack. This is strangely reassuring! If the responses needed were clear and neatly packaged, we would most likely have missed the point. We are living through a complex, epoch-making moment that demands a planetary level of collective intelligence. In many respects, our understanding of the scale of Europe’s just transition challenge is only beginning to crystallise as we conclude this 4-part blog series.

We are grateful and honoured to be part of this community and look forward to continuing the conversation. Some immediate steps to move this work forwards are as follows:

We will write a follow-up blog to respond to the challenges, thoughts and suggestions that we have received in response to this initial blog series;

If you have an innovative model or idea that requires funding to scale or develop, Laudes Foundation invites you to get in touch via this contact page and address the enquiry to Alice Haugh in the Built Environment team;

Laudes is supporting a group of organisations to develop a collective theory of change to scale up the principles of collaborative housing across Europe: Sostre Civic, MOBA, UrbaMonde and NETCO. This initiative is led by ODS and they will be attending the International Social Housing Festival in Barcelona (7–9th June) and the Urban Future Festival in Stuttgart (21–23rd of June). If this area of innovation appeals to you and is geographically practical, it could be a good opportunity to connect with this new network.

The Dark Matter project team who co-authored this blog are Emily Harris (emily@darkmatterlabs), Aleksander Nowak (Aleks@darkmatterlabs), Vlad Afanasiev (vlad@darkmatterlabs) and Indy Johar (indy@darkmatterlabs).

References:

(1) https://www.youngfoundation.org/peer-research-network/projects/diversity-and-inclusion-in-community-land-trusts/

Laudes x Dark Matter Labs: A just transition of Europe’s built environment (Part 4) was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identosphere Identity Highlights

Identosphere 135: EIC Highlights • IIW Book of Proceedings • Working Draft: BBS Cryptosuite v2023 • TBD Launches New Web5 Toolkit

We curate the latest in decentralized identity, with upcoming events, use-cases and developments in open standards and the public sector. Thanks for supporting our efforts by Paypal or Patreon
Identosphere’s Weekly Highlights:
We Gather, You Read! We’ll keep aggregating industry info. Show support by PayPal, or Patreon!

Upcoming

Recruiting participants for the Digital Credentials & Digital Trust Services Initiative Standards Council of Canada - Participation requested 5/25 

[Virtual Option] London Event: MEF CONNECTS Personal Data & Identity 5/25 Identity Praxis, Inc. ← Rebekah, CEO of Numeracle, presenting

REDeFiNE Tomorrow featuring: Wayne Chang, CEO & Co-founder, Spruce Systems  5/25-26

How MyData conference became the most impactful event for personal data sharing 5/31-6/1

did:hack - decentralized identity hackathon for people to learn, collaborate, and build a project centered around decentralized identity. 6/5-8

Digital Identity unConference Europe 6/7-9 Zurich ← Registration open

[NYC] Velocity Network Foundation® 2023 General Assembly 6/19-20 

DICE: bringing the IIW format to Europe IdentityWoman.net 6/7-9

I’m really excited about an event coming up in a few weeks: the Digital Identity unConference Europe (DICE). Heidi, the primary IIW producer, and I, one of the three IIW founders, are working with Procivis, Digital Trust and DIDAS to put the event on. It is happening June 7th–9th in Zurich.

Report

Digital Identity and Verifiable Credentials in Centralised, Decentralised and Hybrid Systems DDX Digital ID Working Group, 2022

Includes a Big Overview of government adoption of VCs for Digital ID Across the globe

The international scan identified that DIWG member countries have varying levels of adoption and employment of verifiable credentials with similar variance in levels of governance models. Most established systems are operating within existing governing structures, with those nations within the European Union soon to enjoy the benefits of the European Digital Identity Framework. 

[Report, InternetNZ] DIGITAL IDENTITY IN AOTEAROA: Identity and Trust in an Increasingly Digital New Zealand DigitalIdentity.nz

This report gathers and shares people’s perspectives on digital identity in Aotearoa. While the focus is on business and technology perspectives, there is a clear message about the crucial role of building trust to enable beneficial innovation. People who have doubts about a system will not trust it, and will not use it.

Explainers

[KYC] Breaking Down Decentralized Identity and Know your Customer Entrust - by decentralizing the KYC process, we can create a more secure, transparent, and efficient online ecosystem

[Podcast] Micro-credentials in a Minute Episode 15: What Makes Verifiable Credentials so Powerful? Microcredentials Universe - while digital certificates or credentials are easy to forge and alter, VCs provide a means for anyone to quickly verify the accuracy of the data presented without compromising security and privacy.

Streaming Trust Phil Windley

Federated identity stores are like music CDs: large troves of data to be consumed from a single source. Self-sovereign identity is like streaming: share just what you want, just when it's needed.

What is “digital identity”? SSI Ambassador

[Explainer] DID - Putting Control Back Into The Hands of Users Avant Blockchain Capital

The workflow would look something like this;

  ● The DID subject decides to create a DID to be shared with others; this would contain the document itself;
  ● A timestamp is created;
  ● Metadata related to delegation and authorization;
  ● Cryptographic proof of the validity with public keys;
  ● List of services where the DID could be used;
  ● A JSON-LD signature to verify the integrity of the document (Off chain attestations – JSON Files or On chain attestations held in smart contracts)
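The elements listed above can be pictured as fields of a DID document. The following Python sketch assembles a minimal, hypothetical document loosely following the W3C DID Core data model; all identifier and endpoint values are made up for illustration:

```python
import json

# A minimal DID document sketch (all values hypothetical).
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "created": "2023-05-22T10:00:00Z",           # timestamp
    "controller": "did:example:123456789abcdefghi",  # delegation/authorization metadata
    "verificationMethod": [{                     # cryptographic proof material (public keys)
        "id": "did:example:123456789abcdefghi#key-1",
        "type": "JsonWebKey2020",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyJwk": {"kty": "OKP", "crv": "Ed25519", "x": "..."},
    }],
    "service": [{                                # list of services where the DID could be used
        "id": "did:example:123456789abcdefghi#messaging",
        "type": "DIDCommMessaging",
        "serviceEndpoint": "https://agent.example.com/endpoint",
    }],
}

# The document is plain JSON, so it serializes directly for exchange or attestation.
serialized = json.dumps(did_document, indent=2)
```

Whether such a document is anchored off-chain as a JSON file or attested on-chain in a smart contract is a deployment choice, as the workflow above notes.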

Standards

FIRST PUBLIC WORKING DRAFT: BBS CRYPTOSUITE V2023 W3C

The Verifiable Credentials Working Group has published a First Public Working Draft of BBS Cryptosuite v2023. This specification describes the BBS+ Signature Suite created in 2023 for the Data Integrity specification. The Signature Suite utilizes BBS+ signatures to provide the capability of zero knowledge proof disclosures.
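BBS+ itself requires pairing-based cryptography and a dedicated library, but the underlying idea of disclosing individual claims without revealing the rest of a credential can be illustrated with a much simpler salted-hash commitment scheme (in the spirit of SD-JWT-style selective disclosure; note that unlike BBS+ this is not zero-knowledge, since the revealed claim's value appears in the clear). A hypothetical Python sketch:

```python
import hashlib
import hmac
import json
import secrets

def commit_claims(claims: dict) -> tuple[dict, dict]:
    """Issuer side: derive a salted-hash commitment per claim. A signed
    credential would carry only the commitments; the holder keeps the
    (salt, value) pairs needed to reveal individual claims later."""
    commitments, disclosures = {}, {}
    for name, value in claims.items():
        salt = secrets.token_hex(16)
        digest = hashlib.sha256(f"{salt}|{json.dumps(value)}".encode()).hexdigest()
        commitments[name] = digest
        disclosures[name] = (salt, value)
    return commitments, disclosures

def verify_disclosure(commitments: dict, name: str, salt: str, value) -> bool:
    """Verifier side: check one revealed claim against the commitments,
    learning nothing about the claims that remain undisclosed."""
    if name not in commitments:
        return False
    digest = hashlib.sha256(f"{salt}|{json.dumps(value)}".encode()).hexdigest()
    return hmac.compare_digest(commitments[name], digest)
```

The per-claim random salt prevents a verifier from brute-forcing undisclosed claims by hashing guessed values.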

Curated List

Best of Digital Identity - Decentralized Identity

Including: “Identity Wallets, Self-sovereign identity and decentralized identity projects”

This curated list contains 110 awesome open-source projects with a total of 560K stars grouped into 9 categories. All projects are ranked by a project-quality score, which is calculated based on various metrics automatically collected from GitHub and different package managers.

Use Case

[Real World] Decentralized Identity in the Real World TBD

LinkedIn recently announced a new free verified badge feature that will be used to prove the validity of your employment, education, and other achievements. 

[video] Use Cases for Decentralised Credentials: Status Quo and Future Applications of Credentials DiscoXYZ CEO Evan McMullen on New Forum

Credentials vs SBTs, NFTs, data primitives, public immutability & transferability, Proof of Humanity on ETH, contribution tracking, social capital, value identity, user experience and web-based reputation

Kaliya blogs at IdentityWoman.net

Podcast with NEWFORUM comparing Web3 and Decentralized Identity IdentityWoman

including open standards for digital identity, the benefits and drawbacks of using standards for digital identity, the confluence between web3 and decentralized identity, and many more.

My Government Funded Research / Reports IdentityWoman

Gaps in Government Funded Identity Research [...] The stories are categorized according to Stakeholder Needs and placement in the Digital Identity Model. The results were compared against the funded portfolio projects to identify gaps in the funded portfolio

Decentralized Identity: Keynote Panel at Hyperledger Global Forum IdentityWoman

regarding decentralized identity, the level of adoption among companies and customers, and the factors that will ultimately lead to ecosystem acceptance.

Recap

At IIW, Chance McGee brought a unique look at humor, visual representation and storytelling to the SSI space.

IIW Book of Proceedings

KuppingerCole EIC Highlights

[Video] Judith Fleenor - Decentralized Identity - Why is it all the Rage? Judith Fleenor, KuppingerCole

[Video] Verifiable Credentials for the Modern Identity Practitioner Vittorio Bertocci, Kuppinger Cole

[Linkedin, Entra Verified ID] Microsoft Putting Decentralized Identities Into Practice Kuppinger Cole

It builds on standards for interoperability.

It is supported by a growing partner ecosystem.

It neatly integrates with Microsoft Entra (including Azure Active Directory), LinkedIn, etc., and can be integrated with further solutions.

It provides concrete value to the users.

It helps in creating a critical mass of users.

Company News

Microsoft Entra – Digital Identity for a Multi-Cloud World DigitalID Service

A flagship example of the service in action is how it enables verification of organizations on Linkedin, with members able to display this verification on their profile. With a few taps on their phone, members can get their digital employee ID from their organization and choose to share it on LinkedIn.

Introducing the Universal Namespace (UNS) GlobalID

This will allow users to own and control a unique name associated with their digital identity through an independent third-party namespace rather than one managed by GlobaliD.

Universal Namespace (UNS) UNS

The Universal Namespace allows users to create names for themselves that transcend the individual platforms where accounts are held. The same name can be used across platforms, simplifying the process of identifying accounts, while still allowing users to use multiple names if they want to hide their cross-platform presence from others.

New Dock Wallet Feature: Sorting and Filtering Credentials Dock

Trusted Digital Transactions: Send and Receive Verifiable PDFs with Lissi Lissi

This technology is a game-changer for organisations that deal with sensitive information and want to secure their customer relationships. In this article, we will focus on the Lissi Agent, which organisations use to interact with ID-Wallets such as the Lissi Wallet.

[Disco] dApp Dive: Disco & Decentralized Identity Zerion

Think of your Disco account as your “data backpack” with all your identifiers. You can connect your Ethereum address, your Twitter, Discord, and more.

Organization

Four things everyone should know about the fair data economy MyData

It is therefore crucial for everyone to understand its basics. In the same way as we all learn about the basics of government, biology and history at school as subjects of universal relevance, so it is becoming more and more urgent for more and more people – citizens, consumers, professionals and individuals generally – to learn about the data economy.

DIF Newsletter #33 DIF

DIF is delighted to announce the appointment of Andor Kesselman as Technical Steering Committee Chair. Andor is the co-founder and CTO of Benri, a company that specializes in equipping developers with tools and infrastructure for onboarding onto the decentralized web. 

We’re extremely excited to announce the formation of the new DIF Korea Special Interest Group (SIG). Post by the chair of this new group, Park Kyoung-Chul.

Social Web

Growing the Open Social Web: Why it's good that social media is suddenly confusing again Slifka

What most people understand as “social media” is actually the closed social web (CSW). We have an opportunity to begin an era of the open social web (OSW) - if we move fast enough.

Bluesky

BlueSky FAQ BlueSky

This is a user guide that answers some common questions.

For general questions about the Bluesky company, please visit our FAQ here. If you’re interested in learning more about the protocol Bluesky is built on (the AT Protocol), please refer to our protocol documentation or our protocol FAQ.

Bluesky Social just took a big open-source step forward ZDNet

Paul Frazee, Bluesky's product developer and protocol engineer, said, "I want to give everybody a shout-out for how incredibly you've already engaged on the app codebase. You've all been extremely kind and helpful. You've honored my requests about the kinds of contributions. And, there have already been multiple awesome PRs merged. yall. rule."

TBD

Jack Dorsey-backed TBD Launches New Web5 Toolkit to Decentralize the Internet Coindesk

The full Web5 platform is expected to launch later in 2023, but with this initial release, developers can start building decentralized applications on TBD’s developer platform.

WEB5: A DECENTRALIZED WEB PLATFORM: Putting you in control of your data and identity TBD

Web5 brings decentralized identity and data storage to your applications. It lets devs focus on creating delightful user experiences, while returning ownership of data and identity to individuals.

Web3

User Guide: Get started with Polygon ID Zero-Knowledge Credentials in the Verida Wallet Verida

Last week, Verida launched a new integration for the Verida Wallet, the first mobile crypto wallet supporting zero knowledge credentials issued through Polygon ID.

Biometrics

EDPB adopts final version of Guidelines on facial recognition technology in the area of law enforcement EDPB Europa.eu

Among others, the guidelines stress that facial recognition tools should only be used in strict compliance with the Law Enforcement Directive (LED). Moreover, such tools should only be used if necessary and proportionate, as laid down in the Charter of Fundamental Rights.

FTC to crack down on biometric tech, health app data privacy violations SCMagazine

The FTC vote followed a second enforcement action taken under the HBNR against the makers of Premom on May 17 to resolve a host of privacy allegations, including that the fertility app and its parent company, Easy Healthcare, deceived users by sharing their personal and health data with third parties.

AI Watch

How Google is improving Search with Generative AI  Google - taking more of the work out of searching, so you’ll be able to understand a topic faster, uncover new viewpoints and insights, and get things done more easily.

TikTok creators use AI to rewrite history: A viral trend imagines alternate timelines in which Western imperial nations never came to power.

India’s religious AI chatbots are speaking in the voice of god — and condoning violence RestofWorld

Really intense, from Rest of World: Desperate migrants are ordering Uber Eats through the U.S.-Mexico border wall - One woman paid $100 for a whole chicken because the driver did not carry change, let alone in U.S. dollars.

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net
Please support our efforts by Patreon or Paypal
Contact \ Submission: newsletter [at] identosphere [dot] net


auth0

Introducing Auth0 Templates for .NET

Create your .NET applications secured with Auth0 in less than a minute with Auth0 Templates for .NET.

Forgerock Blog

Introducing the ForgeRock Experience Center

Have you been involved with acquiring a software product for your company? If you have, you know that finding and purchasing a solution that meets your specific enterprise needs can be a challenging process. We understand the difficulty that buyers face when trying to match vendor capabilities with business requirements. After reviewing multiple vendor solutions, product web pages, and data sheets, they all start to sound the same. And videos often focus more on the muzak and narration than on educating the viewer on a specific point. Free trial accounts provide rudimentary access to the product that does not give the user the ability to explore more complex use cases.

Better Understanding Leads to Better Decisions

Today we introduce the ForgeRock Experience Center. The Experience Center provides an interactive environment that lets users and administrators get a more immersive experience with the various aspects of our Platform, right from the website. Each experience leads the user through an educational journey that combines hands-on interaction with the ForgeRock user interface, including contextual explanations using text and video.

The ForgeRock Experience Center debuts with a library of experiences to illustrate important capabilities of the ForgeRock platform:

- Passwordless login for consumers and workforce using passkeys
- Passwordless login for enterprise workstations and infrastructure
- User registration with progressive profiling to maximize registration
- Multi-factor authentication with a choice of factors
- Self-service password reset to reduce user friction and helpdesk costs

Over the coming months we will continue to grow this library with additional experiences, including industry specific examples for use in financial services, healthcare, insurance, manufacturing, retail and others.

See and try the ForgeRock Experience Center here.


Ontology

Ontology Weekly Report (May 16–22, 2023)

Highlights

Ontology has updated the partners page on the official website. More than 70 partners are now listed. We love partnerships and the collaborative spirit that fuels the Web3 movement.

Latest Developments

Development Progress

● We are 85% done with the high ledger memory usage optimization.

● We are 85% done with the EVM bloom bit index optimization.

● We are 75% done with the optimization of ONT liquid staking.

Product Development

● ONTO announced the partnership with Manta.

● ONTO made an AMA event together with Dreamix.

On-Chain Activity

● 164 total dApps on MainNet as of May 22nd, 2023.

● 7,535,950 total dApp-related transactions on MainNet, an increase of 10,384 from last week.

● 18,683,191 total transactions on MainNet, an increase of 18,758 from last week.

Community Growth

● We started our Weekly Community Call Series. We talked about “How Web3 intersects people’s daily life”. Users expressed their insights around this and actively engaged.

● We held our Telegram weekly Community Discussion led by Ontology Loyal Members, discussing Meme coins. Participants got the chance to win Loyal Member NFTs.

● As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

● We made a detailed guide on using iZUMI to swap ONG into ONT.

● We held a TwitterSpace chat with Gameta to discover the future of GameFi and the power of DID in onboarding millions to Web3.

● We held a TwitterSpace to delve into iZUMI’s iZiSwap support, exploring the expanding possibilities of DeFi on Ontology.

● Continuing our ‘Meet the Team’ series, we’re very pleased to ask Ontology’s Head of Social Media, Christina, a few questions.

● We accepted an invitation from NOWNodes to discuss Ontology. Donny, an Ontology Harbinger, attended the TwitterSpace on behalf of Ontology.

● It’s time for our latest OWN101, as part of our OWNInsights series. This week, we bring you “Immutable”.

Contact us

- Ontology official website: https://ont.io/

- Email contact: contact@ont.io

- GitHub: https://github.com/ontio/

- Telegram group: https://t.me/OntologyNetwork

Ontology Weekly Report (May 16–22, 2023) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

The 3 pillars of responsible gambling.

We sit down with Martin Lycka, SVP for American Regulatory Affairs and Responsible Gambling at Entain to discuss Brazil’s upcoming gaming regulations, advice for operators, and the importance of integrity in the emerging LATAM market.  There is a great disparity in global gambling regulations, in terms of scope and severity, how do you compare the European regulatory landscape with emerging markets like LATAM? 

Europe, at the risk of stating the obvious, is way ahead of Latin America (LATAM) in terms of gambling regulation. However, despite a few false dawns, Brazil is firmly on the path to regulating sports betting in the near future.

Colombia is a great example of a very successful LATAM regulatory regime, which has drawn a lot of inspiration from gaming regulation in Spain. I think this is partly because of similarities in language and culture. It’s also worth bearing in mind that although individual jurisdictions are different and follow different kinds of sports and markets, we are talking about sports betting, so basic product mechanics will always be the same to reflect requirements in regulation. 

Which other LATAM countries are currently in the process of regulating the industry? 

Most Latin American countries have already regulated the land-based sector in one way or another, so it really comes down to regulating their online space. Some Argentinian provinces have regulated online gambling, and, like Canada, appear to be taking it province by province. Peru and Chile are also looking into potentially regulating online gambling and I’m hopeful it will happen in the foreseeable future, but as they’re both facing a political crisis, this is unlikely to happen any time soon. 

Although Peru passed legislation that would permit the regulation of the online gambling sector last year, recent riots have again stalled any developments. In Chile, President Gabriel Boric has stated that he’s keen on revamping the Chilean constitution, but that process failed, and as a result, he’s in the process of picking up the pieces and pondering what to do next.

Gambling regulations 101: Europe and the UK. Discover how gaming operators can keep pace with the ever-growing multi-jurisdictional landscape in Europe and beyond. Get your free copy.

Brazil is on the cusp of regulation – when is Brazil’s gaming regulation likely to be passed?

The regulation will go to the Congress as a legislative bill, with constitutional urgency. This means that the Congress will have to debate and vote on it in the next 3-6 months. So, if the bill is approved, it’s likely the regulated market will launch in early 2024. As the bill is at an extremely high level, it authorizes the authorities to kick off the licensing process. There will be an uncapped number of licenses, an open licensing system, which provides for additional taxation measures. Regulation is likely to cover the process of instituting licensing, payment processing, and the requirement for internet service providers to block the internet connection for operators who choose not to become regulated. 

What is your advice to operators based in, or targeting the Brazilian market? How should they prepare for upcoming regulation?

It should by no means come as a surprise. Brazilian sports betting regulations have been debated since 2018, when the government was authorized by Congress to regulate sports betting. My advice to operators is to keep a close eye on proceedings and start getting themselves ready. In the absence of the final iterations of regulations, however, it’s still advisable to exercise caution until regulations are passed. 

It’s very likely that the regulation will require operators to incorporate an entity in Brazil. I would not expect an obligation to locate the actual premises on the ground, as that could effectively kill the industry, but operators should be prepared to provide the Brazilian authorities with access to the data of Brazilian customers in one way or another. This could be remote, or by means of having a data server on the ground. Regarding customer verification, I believe this will be linked to their CPF identification number (a tax number attributed by the Brazilian Federal Revenue to Brazilians and foreigners who pay taxes in Brazil). There is also likely to be a focus on sports integrity and on responsible gambling, not least because of recent match-fixing scandals that have marred Brazilian football.

The government and regulator have reviewed existing regulations in other countries, so it’s also likely to take inspiration from those. 

There’s a lot of talk about Responsible Gambling in the UK government’s gambling white paper, and it’s likely to feature heavily in Brazilian regulations. What does Responsible Gambling mean to you? How do you interpret it? 

Responsible gambling is critical to the long-term sustainability of this industry. I’ve been personally working very closely with the regulator on responsible gambling standards and fully intend to continue to do so. When the Brazilian market is finally regulated, that will provide both the operators and authorities with tools to more efficiently address the kinds of match-fixing scandals that we have seen recently.  

The onus is on us as an industry to protect our customers, and, of course, help our customers to help themselves. That still gets sort of underestimated. If customers are beyond the point of being able to do that, that’s where we need to step up even more. The more individual care we can provide problem gamblers the better. Because what works for one person may not work for the other. Just like on the consumer side or promotional side of things: what may not be of interest to you, may be of interest to someone else, and vice-versa. This underscores the need to individualize and personalize care. That’s what I would love to see in the years to come throughout Latin America. 

In a tangible sense, what would these kinds of measures look like?

In my view, it’s a combination of three critical pillars.  

Education. We need to find even more efficient ways of educating everyone involved, which goes well beyond our customers. It’s the public at large, including key decision makers, which is why Entain has launched programs here in the States (and in other countries), with the likes of the NFL, MLS, and most recently with the NHL alumni. I’ve also worked with retired athletes such as Charles Oakley and Jayson Williams to serve as responsible gambling ambassadors to convey even more efficient messages about responsible gambling. 

Prevention. We need to provide our customers with tools to help themselves, and if they cannot do that anymore, we need to step in, take over control of the gambling platform – we call them interceptors. Or even just by giving customers a ring. 

The third pillar is to put customers on a firm path toward healing; a firm treatment path. Again, the onus is on us, the industry. 

Why Ontario’s iGaming model will be an industry game changer. Discover how IDnow can help you tackle Ontario’s complex iGaming market. Get your free copy.

Regarding the Canadian gaming sector, do you think it’s only a matter of time before other provinces follow Ontario and open the market to private operators?

Yes is the short answer to that. I expect Alberta to follow suit very soon, and then British Columbia, but I’m not going to speculate as to the timetable. I think if other provinces choose to go down the Ontario route, it would be the most sensible thing to do.  

Why do you think some countries, for example Canada and Argentina, favor such a province-by-province approach to implementing nationwide regulations? 

It’s a constitutional and legacy issue in both instances. In Canada, for example, back in ’85 or ’86, the federal government devolved competencies to regulate all forms of gambling, not just online gambling, to individual provinces. That is the setup, which is like the US. New Jersey, where I am based, recently won a very hard-fought victory to regulate sports betting and online gaming, and it’s the same in Canada.

If you’re interested in more insights from industry insiders and thought leaders, check out one of our interviews from our Fintech Spotlight Interview series below.

- Roger Redfearn-Tyrzyk, Director of Global Gambling & Sales at IDnow
- Viky Manaila, Trust Services Director at Intesi Group
- Brandi Reynolds, CAMS-Audit, CCI, CCCE at Bates Group
- David Gyori, CEO of Banking Reports
- David Birch, global advisor and investor in digital financial services

By

Jody Houton
Content Manager at IDnow
Connect with Jody on LinkedIn

Our highly configurable platform for identity proofing in the gaming market allows operators to keep abreast of constantly changing regulations, and new fraud schemes and scams, while ensuring a safe and secure gaming experience.


UbiSecure

Facilitating the Future of Finance: Open Banking & Open Finance with Michelle Beyo, FINAVATOR – Podcast Episode 91

The post Facilitating the Future of Finance: Open Banking & Open Finance with Michelle Beyo, FINAVATOR – Podcast Episode 91 appeared first on Ubisecure Customer Identity Management.
Let’s talk about digital identity with Michelle Beyo, CEO and Founder of FINAVATOR.

In episode 91, Oscar is joined by Michelle Beyo, CEO and Founder of FINAVATOR. They discuss how Open Banking and Open Finance are facilitating the future of finance and the role digital identity plays within this. Join Michelle and Oscar as they explore what open banking and open finance are, their benefits, and potential privacy issues, alongside success stories from around the world and what we can expect to see in the future.

[Transcript below]

“Open finance layered in with a digital identity can truly help us plan better, execute, have better offerings, save money, and be able to plan better for our future.”

Michelle Beyo is the CEO & founder of FINAVATOR, an award-winning Payments and Future of Finance Consultancy. She is also a strategic advisor to FinTechs, a Money 20/20 Rise Up alumni, a Global Council Member of Women in Payments, the Membership Chair at Canadian Prepaid Providers Organization, a Payment Advisor at National Crowdfunding and FinTech Association of Canada, and a Board Member at Open Banking Initiative Canada.

Michelle started FINAVATOR as she is passionate about payments and financial inclusion. She has 20 years of extensive industry experience driving innovation across the retail and payments industry. Michelle Beyo was named the “Top 30 Best CEOs of 2021” by The Silicon Valley Review and FINAVATOR was awarded “Most Influential Leader in FinTech Consulting – Canada” in 2020.

Find out more about FINAVATOR at www.finavator.com or Michelle Beyo at www.michellebeyo.com.

Connect with Michelle and FINAVATOR on LinkedIn.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!

Go to our YouTube to watch the video transcript for this episode.

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello and thank you for joining a new episode of Let’s Talk About Digital Identity. And today, we’ll hear some new ideas about open finance, open banking and definitely a bit more.

For that, we have our special guest today, who is Michelle Beyo. She is the CEO and Founder of FINAVATOR, an award-winning payments and future of finance consultancy. She’s also a strategic adviser to FinTechs, a Money 20/20 Rise Up alumni, a Global Council Member of Women in Payments, the Membership Chair at the Canadian Prepaid Providers Organisation, a Payment Advisor at the National Crowdfunding and FinTech Association of Canada and a Board Member at Open Banking Initiative Canada.

Michelle started FINAVATOR as she is passionate about payments and financial inclusion. She has 20 years of extensive industry experience, driving innovation across the retail and payments industry. Hello, Michelle.

Michelle Beyo: Hi, Oscar. How are you?

Oscar: Very good. I’m really happy to have you here in the show.

Michelle: Happy to be here as well.

Oscar: Excellent. So, Michelle, let’s talk about digital identity. I want to start hearing a bit about yourself and your journey to the world of identity.

Michelle: Yeah, I’m happy to share a little bit. I actually spent 20 years in the corporate space. Six years in telco and eight years in online shopping affiliate marketing. Ran Alaska, Lufthansa, Delta, United online shopping mall platforms. I really got to understand the relationship between customer and loyalty infrastructure.

And then I moved into the payment space, working for the largest prepaid company globally, called InComm, out of their international office for 30 countries. I was running sales and marketing, launched their B2B division, and got to see what was happening in innovation across these 30 other countries, including Singapore, Australia and the UK. I helped launch WeChat in North America at 7-Eleven through the gift card rail, a QR payment system. And I truly realised – a little fearfully – that my kids were going to end up with Asian banking, due to the advancements and how far beyond North America Asia basically was, from a banking infrastructure standpoint, in 2017.

And I took a leap into the startup world focused on blockchain digital identity at a startup as a Chief Client Officer in 2018. And after a year with them and helping with Bahama digital ID infrastructure and helping consent on blockchain. I actually won Money 20/20 Rise Up. Where they picked out of 500 women, 30 women to come into the Vegas largest payments conference in the world and have a separate accelerated track.

And as soon as I found out that I had won one of these coveted 30 spots, I quit my job at the startup and started FINAVATOR, which is actually now four years ago in July. Starting this consultancy, I did not have any consultancy experience, but I did have all of my background, which I felt was touching the future of finance: from telco infrastructure to affiliate marketing and online shopping, the move to digital prepaid payment infrastructures and how they were backing all new challenger bank and BaaS infrastructure, and then digital ID.

So FINAVATOR truly became my ability to try and help banks, credit unions, FinTechs and corporations move to the future of finance. And really have enjoyed my journey out on my own.

Oscar: Yeah, definitely quite interesting, because you have been involved in several industries that are pretty different in themselves. Many are oriented to interacting with the customer – understanding what the customer needs, etc. And then in the last years, you came to identity. So now you have this amazing experience and you’re doing your own consultancy.

As you mentioned, you have been working on a lot of payments and that is leading you to the future of finance. So, the topics we’d like to start addressing today are open banking and open finance. So, if you can give us what are these two terms in a nutshell, what would you say?

Michelle: Yeah. I think, at its simplest point, open banking, which started in the UK in 2017, is a safe and secure way to share data in an ecosystem. So, thinking back to my telco days, when I started, you would sign up to one provider for three years, and you couldn’t leave. If you left, there was a penalty, and your number was owned by that telco. So, if you went to a different telco because they had a better service, you’d basically lose your identity, which was your phone number, and have to send an email to all of your friends with your new number. And they would have to reprogram your phone number in their phones.

There was something called Open Telco, or at least number portability, that was mandated in Canada and many other countries around 2015. And this empowered the consumer to officially own their phone number. So, if I left one telco to go to another, I didn’t have to lose my identity, which I had built for, let’s say, 10 years, as this phone number represents myself. So, I was able to port it to a competitor to get better service.

So, to me, open banking is that same concept of having a safe and secure way to port my data. From one bank to another bank, from one bank to a FinTech, from a bank to a wealth advisor. So really just giving me the freedom that – the information that is mine, that defines me, can be utilised to help me get a better loan. Help me get a better rate, help me get the service that is customised to myself. Based on the data that happens to live with my current bank.

So open banking was a regulated movement that started in the UK to force the CMA 9, which is the nine biggest banks in the UK, to create an API that was standardised. To allow for safe and secure data sharing, that was all based on consumer consent. As well as create competition, by allowing FinTechs or third-party providers to hit a certain bar of the certification to be allowed in the system.

So, let’s say Revolut. If you were a Lloyds customer, and you wanted to go to Revolut. And you wanted Revolut to have these five pieces of data to offer you a different product that maybe had better pricing. You were able to do that through consent through the Revolut app. And that data was then able to safely port from Lloyds to Revolut.

And the biggest point, I think on all of this is – in open banking there is a right to delete your data. So that data can then be deleted and to me, this is creating less data in the world. And having more control over it as a consumer. As well as empowering new services, new offerings, new companies to help serve the underserved and help serve the current market in a better, more efficient way.

Oscar: Yes. And I like your analogy. You started talking analogy also in telecommunication in the mobile, consumer mobile networks. The mobile number portability, which is something I think at this point, I’m not sure it’s everywhere in the world. But I think it’s by large in many countries, it’s available and it’s something that today we take for granted. But it was very painful, not long ago, it was very painful as you have described.

So just the idea of having a similar easiness translated to the banks sounds like a dream for the ones who still have not experienced it. I have not experienced something like that yet. Yeah, so definitely it sounds like a great thing to keep spreading. And you have summarised that open banking, in a nutshell, is securely sharing data of the consumers, so one consumer can move from one bank to another, or even to a FinTech as you mentioned.

Michelle: Yeah, and I think the evolution of that is open finance. Which I would say is a hot topic in today’s market. The UK is moving to PSD 3, which is bringing them to open finance. Australia started with open data as a concept through a Consumer Data Right for all citizens across five industries. Which I think is the most concise vision across all countries. So, they started with open banking, moved to open finance, open telco, open energy, and then they’re going to land in open data. And it’s all centred around a Consumer Data Right across all data.

Very empowering vision coming out of Australia, that many countries are just starting with open finance. Turkey, Nigeria, Saudi Arabia, Brazil, just moved to open finance. So just to describe it – it really is, instead of just being banking, FinTech, third-party payments data or bank account data. It’s broadening the spectrum to the insurance, wealth, mortgages. Kind of more of a holistic view of anything that touches your finances. So, it’s really expanding to allow you to port your data from multiple different aspects of finance.

Oscar: OK. So, the key here in open finance is that you do similar – let’s say portability. We use it, we use the same word between different services. Not necessarily financial services, but as you said, that touch some financial data, correct?

Michelle: Yes. So if you want to use some data from your Lloyds account to help you get a faster, cheaper, better mortgage that’s more customised to you. Maybe that mortgage provider is not a bank, but they’re a licensed mortgage provider that has certified in the system. Then you’d be able to facilitate that data sharing, same example to a wealth provider or an insurance provider.

Oscar: Alright. And besides that, benefits of the portability that we can, I can even visualise on my mind. What are the other benefits that there are for both the consumers and for businesses?

Michelle: Yeah, I would say one of the biggest ones is – when you think of FinTechs trying to get certain aspects of data. And not having to get data they don’t need – so only getting the five pieces of data, with clear consent from the customer. And the customer not having to screen scrape this data out of their account without their knowledge.

So, there were a lot of screen scraping issues when open banking first came to fruition in the UK, largely because 1 million UK citizens were screen scraping. Screen scraping is a service where it looks like you’re logging into your bank: you’re putting in your passcode, and then it’s giving that FinTech access to look at your overarching account and scrape the whole data, only to grab the five pieces they need to push into the system.

So, what this does is [A] it’s unsecure. [B] the customer has no idea they’re breaching their bank agreement by using the service. And then the FinTech ends up with all this data that they don’t need, or want. Have to store it safely and securely, when they only needed the five pieces.

So, when you get to an open banking system, they request the five pieces, they get the five pieces in a safe, secure type of API. And then, therefore, they’re able to delete those five pieces of data, because the way that it was coded into the system, if so requested by the customer. So, it’s a data management system, all based on consent.
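
The consent-scoped flow Michelle describes can be sketched in code. This is a minimal, hypothetical model – the class names and fields are illustrative and do not correspond to any real open banking API – but it shows the core idea: the third party receives only the consented fields, never the whole record, and the customer can revoke consent so sharing stops.

```python
from dataclasses import dataclass


@dataclass
class Consent:
    """A customer's consent to share specific fields with one third party."""
    customer_id: str
    third_party: str
    allowed_fields: set
    revoked: bool = False


class BankDataAPI:
    """Toy model of a consent-scoped data-sharing endpoint."""

    def __init__(self, records):
        self._records = records  # customer_id -> full account record

    def fetch(self, consent: Consent) -> dict:
        if consent.revoked:
            raise PermissionError("consent has been revoked")
        record = self._records[consent.customer_id]
        # Return only the consented fields -- never the whole record.
        return {k: v for k, v in record.items() if k in consent.allowed_fields}

    def revoke(self, consent: Consent) -> None:
        """Model the customer's right to withdraw: sharing stops immediately."""
        consent.revoked = True


# Usage: a FinTech requests exactly five fields, nothing more.
bank = BankDataAPI({
    "alice": {
        "name": "Alice", "iban": "GB00EXMP1234", "balance": 1200,
        "address": "1 High St", "transactions": [], "phone": "+44...",
    }
})
consent = Consent("alice", "example-fintech",
                  {"name", "iban", "balance", "transactions", "phone"})
shared = bank.fetch(consent)
assert "address" not in shared  # the unconsented field never leaves the bank
```

The key design choice mirrors the transcript: scoping happens at the bank's side of the API, so the third party never has to receive, store and then delete data it did not need.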

Oscar: Yeah, it sounds pretty good absolutely. Because imagine that all my data that is on my bank is passed to the – let’s say insurance. And then the insurance has the duty to delete whatever they don’t need as well, sounds terrible. Because you know, the less data that is transferred, the less data that is stored somewhere, the lower the risk of so many data breaches that are happening nowadays.

Michelle: Yeah, on the data breach point, I always like to bring up unfortunately Marriott because they had 7.1 million data breach occurrences at one time, and it was an internal issue. They were like layering in some accounting, or loyalty system and it was an internal data breach. And this was back I think in 2018. They didn’t compensate any of the users. But think about anytime you check into a hotel. At this point, they asked for your driver’s license or your passport, plus your credit card. The amount of data a hotel has on you is pretty concerning, considering they don’t have the data security standards that you would have at a bank.

So, if we can get to a world – getting to your digital identity questions. Where a QR check in doesn’t actually have them store any of my data, but just validate I am who I say I am. So that they don’t need to actually hold my actual passport image with all of my sensitive data. In a non-secure, I don’t want to say non-secure, but not highly secure infrastructure.
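
The hotel check-in idea – verify without storing – is essentially a signed-claim presentation. Here is a toy sketch, using an HMAC as a stand-in for a real digital signature (a production system would use public-key cryptography, so the verifier would hold only the issuer's public key, not a shared secret, and the claim format would follow a standard such as verifiable credentials):

```python
import hashlib
import hmac
import json

# Hypothetical issuer key -- purely for illustration.
ISSUER_KEY = b"demo-issuer-key"


def issue_claim(attributes: dict) -> dict:
    """Issuer signs a minimal set of attributes (data minimisation)."""
    payload = json.dumps(attributes, sort_keys=True)
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}


def verify_claim(claim: dict) -> bool:
    """The hotel checks the signature and keeps nothing afterwards."""
    expected = hmac.new(ISSUER_KEY, claim["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["sig"])


# Check-in presents only what is needed -- no passport scan on file.
claim = issue_claim({"name": "M. Beyo", "age_over_18": True})
assert verify_claim(claim)  # valid, untampered claim
```

The point is that verification is a pure function of the presented claim: the hotel learns "this person is who they say they are" without ever holding a copy of the underlying document.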

Oscar: Yeah, exactly. Another good example, obviously, the hotels. They will benefit, both the businesses and the consumers would benefit with open finance. And yes, I start – while you explain this idea, I was, OK, some of the data passes from one, let’s say from the bank to the insurance company. But just a minimum should be passing, so that – also thinking from the identity point of view. I’m imagining the federation, right? So, at this point, what is on your view the role of identity on this paradigm that you just described?

Michelle: Yeah, I think it’s quite paramount as a base layer to most systems. Because if you can authenticate you are who you say you are, that’s the most important part of any one transaction. Especially a transaction that has to do with your data or has to do with your finances. So, I think it’s quite crucial that we find a way that authenticates ourselves. Especially with AI, and all of this machine learning infrastructure, cybersecurity challenges.

How do we ensure that we are the only entity that is Michelle Beyo and that I can then surely authenticate myself? Before I do a data share from one bank to the other, or before I do a financial transaction. And we’re going to have to layer up from our six-digit code being sent to a phone text to authenticate yourself. As we move forward in the future of finance. So, I think digital identity is crucial. And has to be put into a system, in a way that ensures that there’s only one identity for any one person.

Oscar: Yeah, indeed. There has to be some level of strong authentication, that that is a must. And as you have mentioned a bit earlier also, always with a consent, inevitably, data sharing transactions.

Now, moving into what are the standards to also understand – without going into too much detail. You mentioned that this started in UK and in UK, there more implementations. This is really happening in real, but what are the main standards that are making this possible? Or are going to make this even more possible if we think of open finance?

Michelle: Yeah. So you know, what’s interesting is – as you look at the world at the moment, and you look at open banking, open finance. Not all countries have a digital identity infrastructure. So, what that does is makes the open banking infrastructure more complex, harder to authenticate. And I think even more than open banking –  real-time rail infrastructure needs the authentication. Digital identity for any type of fraud reduction of authenticating you are who you say you are, and it’s going to an entity who is authenticated. So that we can remove the scams out of the system.

I’d say the best digital identity infrastructure is probably the Indian-based UPI. It was government issued; it was a mass amount of people. And it was done very early on, on a global scale. It’s not the exact model that probably should be utilised for other countries. But they have definitely – through their digital identity framework, have been able to even. There’s homeless people in India with QR codes and a bank account due to their digital identity infrastructure. And when you pass them in the streets or you pass a tiny shop selling something, they have QR-based payment infrastructure that is largely attached to their digital identity. Which creates a more financial inclusive infrastructure.

In Australia, they have a digital identity framework but it’s not as widespread to the same degree as India. The UK is still working on their digital identity infrastructure. So not every country has lined up, open banking, digital identity and real-time rail. But these are three very crucial aspects to the future of finance because the authentication from digital ID is a safety point. The real-time rail is the fast and secure movement of the funds. And the open banking is the safe, consent-driven data sharing aspect. So, once you have all three of them, you’re really setting yourself up to be facilitating the future of finance.

Oscar: You mentioned one term that maybe is not so familiar, at least for me, you mentioned real, real-time rail. What is that exactly?

Michelle: Yeah, the real-time rail is an instant payment system – it’s sometimes called that. The first one ever created was in Switzerland, actually, in 1989, and 66 countries now have faster payment systems. The UK launched theirs quite a long time ago, but the US just launched theirs, called FedNow, their real-time instant payment rail, this year. And Canada hasn’t launched theirs just yet. So, there are many countries who have this payment infrastructure. When you look at the US last year, or Canada still, it takes three to five days for bank payments to clear, and that’s just the older infrastructure of payment settlement.

Oscar: OK, OK. Perfect. Yes, indeed, you have emphasised all these components that are needed, and of course the national digital identity is a key point. You are correct, not many countries in the world have something, I will say, suitable enough for doing this open finance. In terms of authentication, that reminded me that FinTechs have been around for a while, and not long ago the authentication was just username and password, nothing else. Of course, now most of the FinTechs have something better than that. But yeah, I can see it takes time. All these components take time to come together to make some of these use cases possible.

So, could you tell us some of these success stories, now seen from the perspective of use cases? Success stories, if you can, from different parts of the world, to illustrate it better.

Michelle: Yeah, so if we’re talking digital identity, I think Scandinavia has done probably one of the best jobs. I think Estonia was one of the first. The other really crucial part of digital identity is you can’t have CBDC, or digital currency in a very safe and secure way without a digital identity framework. So, I think there’s some great examples down that front.

When we’re talking open finance, open banking, the country I’m most impressed by, obviously, is Australia. They are a country that has five major banks. They are kind of an oligopoly in the sense that those five banks hold quite a bit of the customer base. But they took an initiative past open banking, past open finance, to embed consumer data rights for every citizen across five different industries, with a roadmap to start with open banking, move to open finance, open telco, open energy, and land with open data – which is really future-proofing their country for the future of the ecosystem. A digital ecosystem, in which every business is now turning into a digital business.

So, they’re going to have a really great base layer of understanding that the customer owns the data, the customer is able to port the data, and the customer is able to delete the data. So, by creating a data right infrastructure, and then porting it across multiple industries. I think they’re going to have incredible innovation and eyes are definitely on them as they’re enabling this ecosystem. That really is kind of the future of any one country’s vision of how do you enable digitisation of an economy.

The other country that I’m pretty impressed by is Brazil. In the sense that in the middle of the pandemic, they made their first move to open banking. They made a 12-month mandate that they were going to hit an open banking live ecosystem within 12 months. And open access to their Central Bank of Brazil. And therefore, by opening the access to registered TTPs, which are Third-Party Providers. Companies, like Pix, were able to create a FinTech that reduced the cost of sending money and took the underbanked, underserved in Brazil, and gave them a digital bank with faster, more affordable payments. And I don’t have the exact number. But I believe they’re past 7 million customers and doing billions of transactions on a daily basis. And I believe they reduced the cost something like by 40%, by being able to have direct access to the central bank and fall directly in line with the open banking system.

And after 12 months of being enabled to an open banking system, they immediately started working on an open finance system, and are launching that within 12 months. So, I think the alignment, the passion, and the execution out of the Brazilian market is pretty impressive. And just the pure enablement of new FinTechs that are more affordable services. And finding ways to serve the underbanked, underserved, they’ve done a phenomenal job.

Oscar: Yeah, it sounds like that – it sounds definitely amazing. Among all these, well, existing use cases and what comes in the future for open banking and open finance, what are some potential privacy issues that you could tell us?

Michelle: Yeah, I think every system has to be truly based in a liability model. This liability model has to be extremely clear to everybody within the system. There has to be protection on that liability model. And I think it’s just ensuring that the certification system that allows for third parties to come into the system is robust and is reviewed, and that the parties that have been certified, inclusive of banks, are always looked at to ensure that they continue to be certified to have access to the system.

But I do foresee in the future that customers are going to choose to have some type of insurance on their data, if they so choose – just like you have insurance on your travel, or insurance on your health – like actual data privacy insurance.

Because – think of the Marriott issue, or gosh, there’s data breaches every day of the week, and none of the data breaches have to do with open banking or open finance. They’re internal data breaches or external data breaches, or hacks. And there’s no real repercussion to the customer, or to the actual party who has had this data breach. There might be a fine, but there’s no settlement to the actual end user whose data has been potentially put on the dark web or given to different parties.

We’ve got to get to the point where, if we have a safe and secure enough data sharing infrastructure, we’re able to insure our data and, if it’s breached, have some type of offset. But we have to get to a much safer, more secure infrastructure for how data is shared in the first place.

So, I just truly see that open banking, open finance is creating the pipes for the water to go through and be able to turn them on and turn them off. And we just don’t have those pipes today in every country. And I think it’s just super important for the next layer of the future of finance.

Oscar: Yeah, indeed. Now, if we look at the future, what kind of use cases or what open finance can do in the future? Something that we are not seeing today.

Michelle: Yeah. So, in some countries, they’ve started to enable dashboards – holistic dashboards of your financial health. So, in these dashboards, due to open banking, you would be able to see what your mortgage is. And then it would be able to use AI to predict what other offerings you should potentially layer in, to add to this product, or tell you that your current mortgage is not serving you and that there are three or four other offerings that would be a better mortgage based on your current finances, or the market. And then it would be able to offer you three different companies that you might want to look into.

So, what this service can then do is you can actually put in your loans – this dashboard would be personalised, just for you to see kind of your financial health. It would help people have an ability to plan better, understand their finances a little bit better. From that perspective, I think it’s going to also create a whole bunch of things we haven’t even thought of, new services, new opportunities, and new ways to ensure you’re saving for your retirement.

Just kind of like round-ups did, in the sense of – you know, if you’re paying for coffee and it’s $1.40, rounding it up to $1.50 and then putting that difference in your pension plan or into a robo-advisor. So that you’re earning money by saving without knowing it, or without feeling it, kind of perspective. So, I think, just like the Internet has changed so many ways of what we are doing and made our lives easier in many ways, I do think that open finance, layered in with a digital identity, can truly help us plan better, execute, have better offerings, save money, and really just be able to plan better for our future.
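The round-up mechanic described here is simple arithmetic. A minimal sketch in Python – the $0.50 increment and the function name are illustrative assumptions taken from the coffee example; real round-up products often round to the next whole dollar:

```python
from decimal import Decimal, ROUND_UP

def round_up_savings(price: Decimal, increment: Decimal = Decimal("0.50")) -> Decimal:
    """Difference between a purchase price and the next multiple of
    `increment` – the amount swept into savings on each purchase.
    (Illustrative sketch; increment choice varies by product.)"""
    multiples = (price / increment).to_integral_value(rounding=ROUND_UP)
    return multiples * increment - price

# A $1.40 coffee rounds up to $1.50, so $0.10 goes to the pension
# plan or robo-advisor; a price already on the increment sweeps nothing.
print(round_up_savings(Decimal("1.40")))  # 0.10
print(round_up_savings(Decimal("1.50")))  # 0.00
```

Using `Decimal` rather than floats avoids binary rounding artifacts when accumulating many small sweeps.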

Oscar: Yeah, sounds definitely a lot to expect for the future what open finance will bring us. So, Michelle, last question for you, for all business leaders that are listening to us now, what is the one actionable idea that they should write on their agendas today?

Michelle: Yeah, I think what they should write is that innovation is driven by ideas. And that there’s an opportunity, especially now that the world has gone digital, to listen in to panels on topics that interest you but you don’t have all the details on, similar to this podcast.

There’s panels happening in Australia on open finance, or Brazil, that you could listen into. You don’t actually have to travel to these conferences. But you can truly grasp the innovation that’s happening in other countries. And then think about how you can create something for your citizens, for your company. To pivot and start moving towards the future of finance, by learning from other countries who are already there.

Oscar: Yeah, definitely, I couldn’t agree more. And that’s really it – learning more about these interesting topics that are going to impact us, mostly positively, today and in the future is also one reason why we invited you. So, thank you. Thanks a lot for being with us – this was a really fascinating conversation, Michelle. If people would like to follow the conversation with you, or know more about what you’re doing, what are the best ways to do that?

Michelle: Yeah, definitely to follow me on LinkedIn. Simply find Michelle Beyo, follow FINAVATOR on LinkedIn and Michelle Beyo as well as reaching out to us on our website at finavator.com.

Oscar: OK, excellent. Many ways to do it. So again, Michelle, it was a pleasure talking with you, and all the best.

Michelle: Thank you so much, Oscar. It was a pleasure being here. Have a wonderful day.

Thanks for listening to this episode of Let’s Talk About Digital Identity, produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

The post Facilitating the Future of Finance: Open Banking & Open Finance with Michelle Beyo, FINAVATOR – Podcast Episode 91 appeared first on Ubisecure Customer Identity Management.


IDnow

IDnow VideoIdent obtains the first LSTI and CLR Labs ISO/IEC 30107 certificate


Saint Malo/La Ciotat/Munich, May 24, 2023 – CLR Labs, a European laboratory dedicated to the evaluation of biometric and security technologies, and LSTI, a conformity assessment body (CAB) specializing in cybersecurity and data protection, are pleased to officially launch their ISO/IEC 30107 conformity assessment scheme, which enables them to evaluate any product or service using biometric technologies. IDnow, one of the leading European identity proofing platform providers, obtains the first LSTI ISO/IEC 30107 certificate for its VideoIdent solution.

The first European certification scheme for biometric technologies

Published in 2017, the ISO/IEC 30107 standard defines safety measures and presentation attack tests on remote identity proofing systems, in order to protect them and prevent these attacks. The conformity certificate is issued by LSTI, based on CLR Labs’ evaluation work, which involves both presentation attacks and biometric data injection tests, such as facial recognition attacks based on deep-fake technology. Until now, the only certification scheme available was from the United States. In Europe, only compliance projects have been carried out so far.

Therefore, this collaborative initiative is unique in Europe, as it defines a guaranteed level of performance assessment of presentation attack detection given by biometric technologies, acknowledged by the cyber industry.

A partnership between two major names from the cyber assessment and evaluation industry

Together, LSTI and CLR Labs designed a common assessment scheme which can be applied to multiple products and services, such as enrolment stations or kits, border automatic doors, biometric readers, entry-exit systems, digital wallets, Trust Service Providers, and any other biometric tech products.

This conformity assessment scheme allows the European cyber ecosystem to acquire the necessary capabilities to evaluate and certify biometric products including presentation attack detection, with a made in Europe label, thus ensuring the promotion of European excellence in cybersecurity.

The first product certified ISO/IEC 30107 according to this scheme

In this European framework, it is the leading identity proofing platform provider IDnow that becomes the first company to obtain an ISO/IEC 30107 certificate for its VideoIdent product. IDnow VideoIdent is an expert-assisted video identity verification product, facilitated by trained agents in conjunction with backend AI controls.

The VideoIdent process validates the user’s face biometrics and performs liveness checks to detect and prevent presentation attacks. The ISO/IEC 30107 evaluation certifies that IDnow VideoIdent meets these strict requirements on biometric identification. LSTI has assessed VideoIdent’s level as “substantial”.

Statements from the companies’ representatives

“This certificate is the result of the successful partnership between CLR Labs and LSTI, but also of the amazing conformity upgrade work done by IDnow on their product VideoIdent”, says Armelle Trotin, CEO of LSTI group.

“We were thrilled when we saw the first certificate emitted by LSTI, as building an ISO/IEC 30107 conformity certification scheme in Europe was one of our goals since the creation of our lab”, declares Stéfane Mouille, CLR Labs Director.

“We are excited to have received the certification for the international ISO/IEC 30107 standard. With the support of LSTI and CLR, we were able to successfully test the strength of our VideoIdent presentation attack detection during the biometric capture step of the remote identity verification process. This certification raises the bar for our fraud prevention measures even further and marks an important step for our company in becoming the leading digital identity provider in Europe,” says Armin Bauer, Co-Founder and Managing Director Technology at IDnow.

About CLR Labs, Cabinet Louis Reynaud group

CLR Labs is the European laboratory dedicated to the evaluation of biometric and security technologies founded by multidisciplinary industry experts with a century of experience in biometrics and security based at La Ciotat (France). Many manufacturers, implementers of complex systems and French and European Trust Service Providers trust them to assess their products and solutions using biometric technologies in the context of border crossing, secure payment, physical access control for companies, online electronic authentication and more generally in the field of digital identity management and verification. CLR Labs is supported by TEAM @ Mines Saint-Étienne, the technological maturator of the Ecole des Mines of Saint-Étienne, France.

Press contact CLR Labs:

Stéfane Mouille, Lab Director

stefane.mouille@cabinet-louis-reynaud.fr

+33 (0)6 81 82 35 92

About LSTI

LSTI is a conformity assessment body (CAB) specializing in cybersecurity and data protection. Created in 2004, LSTI has developed real expertise in information security assessment, and is recognized as one of the major CABs in Europe for the assessment of Trust Service Providers regarding the eIDAS regulation, the ISO/IEC 27001 standard and the French ANSSI standards. Assessing remote ID proofing service providers is rightfully part of their activities, particularly as part of their eIDAS assessment offer. LSTI has been a member of Apave Digital since September 2021, and has a European franchise – LSTI Worldwide – based in Ireland.

Press contact LSTI:

Manon Mix, Communications Officer

manon.mix@lsti.fr

+33 (0)6 66 96 35 5

Tuesday, 23. May 2023

KuppingerCole

Speeding Up Zero Trust Delivery Using Managed Services


Join security experts from KuppingerCole Analysts and iC Consult as they discuss how to combine a Zero Trust security model with IAM Managed Services to achieve an optimal cybersecurity posture for ensuring that all IT networks and information systems are protected from cyber-attacks.

Martin Kuppinger, Principal Analyst at KuppingerCole Analysts, will talk about where, why, and how managed services can be utilized to speed up the Zero Trust journey. He will also highlight important requirements, including a high degree of standardization of services and the enforcement of Zero Trust principles such as "always verify".

Heiko Hütter, CEO of Service Layers, will share his perspectives on the benefits and challenges of Zero Trust and IAM Managed Services, give some real-world examples to show the benefits of a combined strategy, discuss how best to implement this strategy to reduce the risk of data breaches, and give an overview of iC Consult’s related expertise and services.




Shyft Network

MiCA Paving the Way: From Liability to Transparency

The EU’s new MiCA regulation holds cryptocurrency exchanges directly accountable for customer losses, representing a major shift towards increased protection and transparency in the sector.

The MiCA regulation introduces significant compliance challenges for crypto exchanges while promising enhanced user protection, potentially boosting confidence and cryptocurrency adoption.

The MiCA regulation could set a global precedent for cryptocurrency regulation, emphasizing the delicate balance between protecting investor interests and fostering innovation in the rapidly advancing field of cryptocurrencies.

Europe’s ascent as the “World’s biggest cryptocurrency economy” is underscored by several telling statistics. In 2020 alone, the continent received over 870 billion Euros in crypto. Moreover, the Central, Northern, and Western Europe regions accounted for a quarter of the global cryptocurrency activity.

Significantly, between July 2020 and June 2021, transfers from large institutional investors mushroomed from 1.2 billion Euros to 39.6 billion Euros, highlighting the escalating crypto activities in the region and setting the stage for imminent regulatory interventions.

In response to this burgeoning crypto economy and a series of high-profile incidents leading to significant customer losses, the European Union has enacted a sweeping set of regulations. The newly introduced Markets in Crypto Assets (MiCA) regulation holds cryptocurrency exchanges directly accountable for customer losses.

This fundamental shift in the handling of digital assets is aimed at providing a protective shield for investors. By imposing higher standards of transparency, data security, and operational integrity on exchanges, the rules promise to enhance protection and reliability for customers.

Beyond Europe, these pioneering regulations could act as a blueprint for the global crypto industry, stimulating broader acceptance of such rules while presenting significant compliance challenges for crypto exchanges.

As the EU navigates this new terrain, the balancing act between protecting investor interests and fostering innovation could shape the future of the crypto industry worldwide.

EU’s Landmark Crypto Rules

On 16th May 2023, the European Union adopted a regulation on Markets in Crypto Assets (MiCA), initiating a continent-level legal framework for the crypto industry for the first time. The regulatory framework would apply to crypto assets, asset issuers, and crypto-asset service providers.

The provisions on assets issuers and e-money tokens will apply from 12 months after entry into force, expected to be spring 2024. Other provisions of MiCA will apply from 18 months after entry into force (i.e., in the second half of 2024).

Significance of the Rules

The rules will hold providers liable for losing investors’ crypto assets. All 27 EU member states need to comply with these rules from 2024 onwards.

Under these new rules, providers mean utility token issuers, issuers of asset-referenced tokens and stablecoins, providers of trading venues, and wallet services.

The aim behind introducing the regulations is to safeguard investor assets, maintain stability in the system, and allow the sector to become more attractive for its trustworthiness and innovation.

A Brief History of the EU Cryptocurrency Regulations

The attempts to regulate cryptocurrencies do not start with the introduction of MiCA. While the EU presented the MiCA proposal in September 2020, regulatory attempts toward crypto have been prevalent since 2013.

2013: The first official statements and analysis on virtual currencies came from the European Banking Authority.

2014: The EBA identified more than seventy risks arising from virtual currencies for all possible sorts of market participants. The EBA also suggested measures to deal with governance, capital requirements, and segregation of client accounts.

2016: The European Central Bank analyzed virtual currency schemes and acknowledged the potential advantages of virtual currencies despite an overall negative assessment.

2017: The European Securities and Markets Authority (ESMA) issued a report on Distributed Ledger Technology and its application to securities markets.

Early 2018: The European Parliament commissioned two reports: one on virtual currencies and central banks’ monetary policy, the other on the fight against the illicit use of cryptocurrencies.

Late 2018: The Financial Stability Board released a paper on the crypto assets market and its financial stability implications.

2019: The International Monetary Fund singled out supervisory and regulatory elements to assist policymakers.

2020: On 24th September, the European Commission introduced the Markets in Crypto Assets (MiCA) proposal as part of a larger digital finance package.

2021: On 24th November, the Council adopted its negotiating mandate on MiCA.

2022: On 31st March, the tripartite co-legislative discussions started, ending with a provisional agreement struck on 30th June 2022.

2024: By July 2024, stablecoin-related provisions will come into effect.

2025: By this year, the remaining provisions of MiCA, other than those related to the issuance of asset-referenced tokens and e-money tokens, are anticipated to come into effect.

The rules discussed in this article represent the formal adoption of the regulation.

What Led to These New Regulations?

The need for crypto regulation, chronicled in the timeline above, has always been there in the European market. Yet, a few incidents in the global crypto market made it all the more necessary and relevant.

One of the most prominent of these incidents was the collapse of FTX, one of the world’s largest cryptocurrency exchanges. Not only did it impact the European crypto market, but it also sent shockwaves at a global level as the founder Sam Bankman-Fried’s fortune depleted from nearly US$16 bn to zero within days.

Another incident that made people aware of the potential risks of volatility in the crypto market was the collapse of the Terra ecosystem. Within a week in 2022, the stablecoin TerraUSD and its sister token Luna collapsed, wiping out nearly half a trillion US dollars from the crypto markets.

The New Regulations Details

One has to remember that nearly all of these funds lost in the incidents cited above were customer assets, either in the form of investments or deposits. In a bid to prevent such losses, the new rules hold crypto-asset service providers liable for customer losses.

Crypto Exchange Liable for Customer Losses

The rules want the issuers of asset-referenced tokens to have an adequate custody policy for their reserve assets to prevent loss and preserve the value of the assets.

The crypto-asset service providers should also be held liable for losses incurred from incidents relating to Information and Communication Technology (ICT). Such incidents might include cyber attacks, theft, or malfunctions.

Service providers, issuers of asset-referenced tokens, and e-money tokens will also be held liable for the information they have offered in a crypto-asset white paper.

In Focus: Traceability of Crypto Assets and Crypto Consumer Protection

Apart from holding exchanges liable for customer losses, other key components of the regulation deal with the traceability of crypto assets, consumer protection, etc. For instance, the rules want to ensure that crypto assets stay traceable.

The council also wants its rules to create adequate arrangements for “enhanced consumer protection” and “safeguards against market manipulation and financial crime.”

The EU also wants crypto providers to share their energy consumption details. Moreover, all providers will need to have a license to issue, trade, and safeguard crypto assets, tokenized assets, and stablecoins.

Crypto Licensing and Non-compliance Penalties

Crypto service providers operating without a license will be listed in a public registry maintained by ESMA (the European Securities and Markets Authority). In this registry, their non-compliance will be documented.

Implications for Exchanges

The most radical step is that exchanges will be held liable for losing investors’ crypto assets. Also, in ensuring traceability, the exchanges need to ensure that they have a mechanism to trace crypto assets.

MiCA’s traceability requirement is not a new concept, however, as it is similar to how banks keep track of money transfers.

But it cannot be denied that traceability in crypto assets is a double-edged sword. While it enhances security and regulatory compliance, it also raises critical concerns about user privacy.

Although exchanges have no option but to implement traceability, as MiCA makes it mandatory, they must ensure meticulous data governance and robust encryption methods to protect user data.

Transparency about these practices is crucial, with users having the right to know what data is collected, its use, and protective measures in place. It is paramount to strike a balance between security, regulation, and privacy in this ever-evolving cryptocurrency space.

In case of suspicious transactions, exchanges should ensure they have appropriate blocking mechanisms in place.

Besides these, the exchanges will also have to declare their energy consumption details and apply for operating licenses.

Potential Crypto Exchange Challenges

Putting up a robust compliance process could prove challenging. Exchanges will have to leave no scope for technological vulnerabilities. Additionally, they need to ensure that operations are in line with what has been committed in their whitepapers.

Compliance Strategies for Crypto Exchanges

To cope with the various compliance requirements enforced under MiCA, exchanges have no option but to make operational adjustments. For instance, the rules necessitate enhanced monitoring of exchanges’ source and destination addresses.

Adhering to FATF regulations could no doubt contribute to a robust and streamlined compliance process. However, exchanges will require a Travel Rule Solution, such as Shyft Veriscope, to comply with the FATF Travel Rule requirements while protecting their customers’ user experience and privacy.

More on Shyft Veriscope’s capabilities: Shyft Veriscope — The Critical Infrastructure Underpinning FATF Travel Rule

Implications for Customers

The rules do not apply to P2P transfers. However, transactions above 1,000 Euros from self-hosted wallets come under their purview as soon as they interact with wallets hosted by crypto-asset service providers.
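The scope rule above reduces to a small decision predicate. A minimal sketch, assuming hypothetical names – `Transfer`, `in_mica_scope`, and the boolean flags are illustrative, not terminology from MiCA’s text:

```python
from dataclasses import dataclass

# Threshold cited above for transfers originating from self-hosted wallets.
MICA_SELF_HOSTED_THRESHOLD_EUR = 1_000

@dataclass
class Transfer:
    amount_eur: float
    from_self_hosted: bool  # originates from a self-hosted (unhosted) wallet
    to_hosted: bool         # destination wallet is hosted by a crypto-asset service provider
    p2p: bool               # pure peer-to-peer, with no service provider involved

def in_mica_scope(t: Transfer) -> bool:
    """Rough decision logic for the rule described above: P2P transfers
    are out of scope; self-hosted transfers above EUR 1,000 come into
    scope once they touch a provider-hosted wallet."""
    if t.p2p:
        return False
    return (t.from_self_hosted
            and t.to_hosted
            and t.amount_eur > MICA_SELF_HOSTED_THRESHOLD_EUR)

# A EUR 1,500 transfer from a self-hosted wallet into an exchange
# wallet falls in scope; the same amount purely peer-to-peer does not.
print(in_mica_scope(Transfer(1500, True, True, False)))  # True
print(in_mica_scope(Transfer(1500, True, True, True)))   # False
```

This is only a reading of the prose, not legal logic; the actual regulation defines scope in far more detail.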

Crypto Consumer Protection

Authorities insist that these rules seek to reduce opportunities for intrusion into the system. Hence, hacks, breaches, and data security concerns could all be reduced if these rules are properly followed.

Impact on Crypto Consumer Confidence and Behavior

Enhanced consumer protection and transparency in the system would increase the sector’s attractiveness. It will improve credibility and trust, potentially resulting in greater traction and adoption of crypto across the bloc.

Impact on the Global Crypto Industry

Regulation of crypto asset exchanges would involve collecting and transferring data between different jurisdictions.

In many cases, one party would be in a jurisdiction outside Europe. In such cases, successfully completing the transaction would require exchanges in that jurisdiction to stay compliant with European regulations.

Global Influence of EU Crypto Regulations

With the EU getting its house in order, other parts of the world, such as the United States and the United Kingdom, will have to keep up with the EU regulations.

Other countries in the European region, such as Switzerland and the UK, known for their thriving local crypto scenes, will be impacted as well, despite not being members of the 27-nation bloc.

Switzerland

In Switzerland’s case, given the intricate web of business relationships with the EU, Swiss crypto companies must check if MiCA or another EU rule, MiFID II, affects them, especially when dealing with certain tokens like securities.

Also, MiCA brings unique rules for “stablecoins,” requiring issuers to have an EU-based office and a special license.

These changes demand Swiss crypto businesses to quickly understand MiCA’s impact and determine which services they can offer from Switzerland to EU countries and what tokens need a special document, the “crypto asset whitepaper,” to be recognized in the EU.

The United Kingdom

The UK has decided on a phased regulatory approach. It might start with stablecoins and then continue with unbacked crypto assets. This strategy is set to interplay with the European Union’s Markets in Crypto-Assets (MiCA) regulation. Even though the UK is no longer an EU member, the ripple effects of MiCA can’t be ignored.

With MiCA creating a unified framework for crypto assets across the EU, UK-based companies offering crypto services in the EU will need to adapt. Compliance with MiCA, regardless of the UK’s local regulations, will become essential for firms looking to operate within the EU.

On the flip side, the far-reaching and comprehensive nature of MiCA may influence the UK’s phased regulatory approach. It could also serve as a potential blueprint for UK regulators, particularly when moving towards unbacked crypto assets.

Middle East & Asia

The new framework could potentially reconfigure cross-border digital asset transactions involving EU, Middle East, and Asian entities. Given the substantial number of Asian and Middle Eastern businesses and individuals engaged in crypto-related activities within the EU, the MiCA regulations could potentially impact asset flows, particularly those dealing with certain crypto assets subject to more stringent MiCA requirements.

Moreover, the increased regulatory oversight under MiCA may instigate a realignment of crypto asset flows, potentially diverting some towards jurisdictions with less stringent regulations. Crypto businesses or investors looking for a more flexible regulatory environment might consider migrating their operations or investments to territories with more lax rules.

The Middle East and parts of Asia, which have been comparatively less stringent in their approach to crypto regulation, could become attractive destinations for such shifts. However, this could also lead to a double-edged sword scenario where the increase in crypto activities could prompt these regions to establish more robust regulatory frameworks, echoing the rigorous standards of the EU’s MiCA.

United States

Major US crypto businesses, like Genesis, Gemini, and Bittrex, have had legal troubles, making others in the industry worry about the future. Big player Coinbase is now asking for clear rules for crypto companies, pushing the US Securities and Exchange Commission (SEC) to make a decision.

In contrast, the EU has been working hard to create rules that protect customers and promote new ideas in the crypto industry. Their rulebook, MiCA, provides clear instructions for companies working with crypto. This not only helps protect people using these services but also builds trust in crypto businesses. It even encourages more use of the technologies behind crypto, like blockchain.

Other countries, including the US, may look to MiCA as an example when making their rules. While the US is currently dealing with different rules from different bodies, it could learn from the EU’s clear and practical approach. It’s crucial for the US to find a balanced and flexible way to manage crypto, promoting new ideas while ensuring companies follow the rules.

The Potential Impact of MiCA Regulations on DeFi Protocols

The new MiCA regulations have sparked questions within the crypto community, particularly about their impact on DeFi protocols. The language of the regulations is complex, causing some debate about whether they even apply to DeFi. Some reports suggest they don’t.

Despite this, it’s wise for funds looking to invest in decentralized finance and DeFi protocols to prepare for compliance. Even if certain decentralized activities seem unaffected by the rules, the broad scope of the regulations could still touch various aspects of DeFi operations.

Frequently Asked Questions: Fast Check on Everything MiCA

Q1: What is the new MiCA regulation introduced by the EU?

The Markets in Crypto Assets (MiCA) regulation is a new set of rules introduced by the European Union to regulate crypto assets, their issuers, and service providers. A vital component of this regulation is that it holds cryptocurrency exchanges directly accountable for customer losses.

Q2: Why has the EU introduced this regulation now?

The introduction of this regulation comes in response to the growing prevalence and influence of cryptocurrencies within the European economy, as well as several high-profile incidents that have led to significant customer losses. The intent is to protect investors and provide a more stable and trustworthy framework for the operation of the crypto sector.

Q3: What are the implications of these regulations for crypto exchanges?

Crypto exchanges will now be held to higher standards of transparency, data security, and operational integrity. They will be directly accountable for any customer losses, which means they will need to ensure robust security measures and operational protocols are in place.

Q4: How will these regulations impact customers?

Customers stand to benefit from enhanced protection and greater reliability in the crypto sector. These regulations aim to reduce the risk of losses due to incidents like hacks, breaches, and data security concerns.

Q5: Do these regulations have any implications outside of the EU?

Yes, these regulations could potentially shape the global crypto industry. They could act as a blueprint for other significant players like the United States and the United Kingdom, stimulating broader acceptance of such rules.

Q6: What challenges can crypto exchanges expect as a result of these regulations?

Compliance with these regulations will be a significant challenge for crypto exchanges. They will need to ensure they have zero technological vulnerabilities and that their operations are in line with commitments made in their whitepapers. They will also need to apply for operating licenses and declare their energy consumption details.

Q7: How will these regulations impact the future of the crypto industry?

The regulations represent a significant step in the evolution of cryptocurrency regulation. By striking a balance between protecting investor interests and fostering innovation, these rules could shape the future trajectory of the crypto industry not just in the EU but worldwide.

Looking Ahead: The Future of EU and Global Crypto Regulation

As the landscape of cryptocurrency continues to evolve, the trajectory of these new regulations remains to be fully discerned. If judiciously implemented and navigated with an understanding of their subtleties, these rules could potentially set a regulatory precedent for the rest of the world to follow.

The task ahead for the EU is multifold: it must diligently safeguard investor interests and curb potential misuse of funds. At the same time, it carries the responsibility to ensure that regulatory frameworks do not stifle innovation but instead foster growth in this rapidly advancing, technology-driven field of cryptocurrencies. The delicate balance between regulation and innovation will define the future of the crypto industry not just in Europe but globally.

MiCA Paving the Way: From Liability to Transparency was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Fission

IPFS Thing 2023: WNFS - Versioned and Encrypted Data on IPFS


Protocol Engineer Philipp Krüger gave a presentation on WNFS, or Webnative File System, at IPFS Thing 2023 in Brussels, Belgium. In his presentation, he gave a comprehensive yet approachable breakdown of how we constructed our versioned and encrypted file system built on top of IPFS.

When Fission set out to create an encrypted file system that would respect user agency, it needed to meet three criteria:

Users would be able to take their data with them anywhere and control who had access to it.
It would be available both online and offline, ensuring access no matter the user's level of connectivity.
It could be shared on different devices and types of apps (for example, you could share your data on a music app on your mobile phone and a photo app on your desktop).

Developers are used to running a server to hold user data when publishing an app online. During the prototyping process, dummy data is often stored in the browser. We realized that if we made the browser capable of running production data, it would ensure the data was local-first and private (because the data would never have to leave the device itself).

Our Frenemy, The Browser

So we began working with the browser to make this a reality. The first step was working with the WebCrypto API, which makes non-extractable key pairs possible. In addition, we wanted to ensure revocability, so we created asymmetric key pairs per app/device.

Access Control

Once decrypted, data is readable in the hostile browser environment. Therefore, we needed to give not just access control but fine-grained access control. We go by the Principle of Least Authority (POLA) to ensure the user only shares what they want to share.

To do this, we encrypt all of the files. However, all those keys can be difficult to manage, so we use the skip ratchet (our CTO Brooklyn's invention!) to organize them. This also gives us temporal and snapshot access control.

Source: Philipp Krüger

Data Persistence and Key Recovery

The browser may delete local storage, so data persistence and key recovery are two important considerations.

We use IPFS to persist encrypted data outside of the browser. IPFS also has a garbage collection feature, so to be absolutely sure that data is not deleted, the user may want to store their data on a machine they keep online (i.e., their own personal node) or utilize a pinning service. Fission does not run a pinning service, but we currently persist all user file systems for free.

Key recovery can be handled by linking a device with more reliable permanent storage for your secret keys (this can be done using the AWAKE protocol). Another option is to download a recovery kit from the dashboard app (or, in the future, from any app that supports something like it).

Concurrent Writes

A file system should have concurrent writes so editing can occur offline and multiple authorized users can edit without issue. We achieve this using a cryptographic tree structure (cryptree) that gives every file and folder a unique key. This enables offline access control while keeping the data encrypted and portable.

But what if two valid writes need to be consolidated? How do we determine which is the "correct one"? WNFS will merge them both by creating a directory that has both changes while also linking back to the previous data (using CRDTs) so nothing gets deleted. Then we leave it up to each app developer to determine how they want to query the history in the API and figure out how to do app-specific merges.
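The merge behavior described above can be sketched roughly as follows. This is a simplified illustration, not the actual WNFS CRDT logic; the entry names and CIDs are hypothetical.

```javascript
// Sketch of reconciling two concurrent directory writes: take the
// union of entries, link back to both parent versions so nothing is
// deleted, and flag entries that differ so the app can do its own
// app-specific merge (as the post describes).
function mergeDirectories(a, b) {
  return {
    entries: { ...a.entries, ...b.entries }, // union of both writes
    previous: [a, b],                        // links back to both versions
    conflicts: Object.keys(a.entries).filter(
      (name) => name in b.entries && a.entries[name] !== b.entries[name]
    ),
  };
}

const base = { entries: { 'todo.txt': 'cid-1' }, previous: [] };
// Two concurrent, valid writes derived from the same base:
const a = { entries: { 'todo.txt': 'cid-2' }, previous: [base] }; // edited a file
const b = { entries: { 'todo.txt': 'cid-1', 'notes.md': 'cid-3' }, previous: [base] }; // added a file

const merged = mergeDirectories(a, b);
console.log(merged.conflicts); // [ 'todo.txt' ]
```

Because `previous` keeps both parents reachable, an app can query the history through the API and decide how to resolve the flagged conflict.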

Source: Philipp Krüger

Privacy & Security

Previously we explained that it is important for users to have fine-grained access control because it supports user agency and resolves the issue of decrypted data in the browser.

When the user does share data with someone else, whether it's an individual or an app, this action should reveal as little metadata as possible. That's why we verify valid writes without read access (using UCANs), scramble the file hierarchy, create a flat namespace (using the CRDT structure), and split files into chunks so no one can distinguish file size.
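The size-hiding idea can be sketched as splitting and padding content into uniform chunks before encryption. The 256 KiB chunk size here is an arbitrary illustration, not WNFS's actual parameter.

```javascript
// Sketch: split content into fixed-size, zero-padded chunks so an
// observer of the stored ciphertext sees only multiples of the chunk
// size and cannot distinguish exact file sizes.
const CHUNK_SIZE = 256 * 1024;

function chunkify(buf) {
  const chunks = [];
  for (let off = 0; off < buf.length; off += CHUNK_SIZE) {
    const chunk = Buffer.alloc(CHUNK_SIZE); // zero-padded to full size
    buf.copy(chunk, 0, off, Math.min(off + CHUNK_SIZE, buf.length));
    chunks.push(chunk);
  }
  // Even an empty file occupies one full-size chunk.
  return chunks.length ? chunks : [Buffer.alloc(CHUNK_SIZE)];
}

const small = chunkify(Buffer.from('hello'));
const big = chunkify(Buffer.alloc(300 * 1024));

console.log(small.length, big.length);          // 1 2
console.log(small[0].length === big[0].length); // true: uniform chunk size
```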

Getting Started

The Rust 🦀 implementation of WNFS (rs-wnfs) is available now.

You can also join our working group and attend the monthly community calls or view the project's roadmap to track our progress, get involved, and learn about upcoming features.


UNISOT

FASHION 2.0 – TRANSPARENCY & SUSTAINABILITY


Consumer trends and why fashion industries are changing their brand position

There’s been a huge shift in the way consumers demand transparency when it comes to ethically sourced and sustainable fashion. It’s fascinating to see that more people are paving the way to more openness and demanding more eco-friendly manufactured clothing. Fast and wasteful fashion is no more!

As increasingly more people change the way they consume and behave, the fashion industry has to advance as well. We therefore see production shifting back to locally produced goods, true sustainability, a greater focus on fairly paid workers, and distributed companies.

The devastating impact on our environment

The use of toxic dyes, pesticides, the tremendous waste of natural resources, farming, harvesting, processing, manufacturing and the transportation for mass production all over the world is really having a devastating impact. The extravagant usage and waste of water that is needed for production is completely irresponsible.

The UNISOT SaaS solution can therefore be used to prove and provide honest information, that guarantees that the company is preventing waste and unethical practices, thus enabling customers to make even more sustainable and environmentally friendly choices.

Smart Digital Twins in Blockchain, Web3 and Federated Learning in Fashion

Due to the intrinsic nature of public blockchain technology supply chains will become more transparent. This brings about a whole new level of incentives for companies and how they will do business.

With public blockchain technology we can create a link between the physical object and a digital one: a smart digital twin is created. This link is used to create a Digital Product Passport to state the origin of the actual product; blocking any attempts of counterfeit or diversion. Each action in the supply chain is recorded and due to the distributed nature of public blockchain technology no stored data can be altered, destroyed or diverted. Furthermore, UNISOT’s Smart Digital Twins functionality enables full product life cycle history feedback.

All kinds of exciting technologies are making their way into the fashion industry, and garment companies are progressively turning into tech companies. UNISOT applications combined with IoT technologies, Web3 and Federated Learning can make a big change in how the fashion industry moves forward – in real time – for each node in the supply chain.

Let’s change the fashion industry together

The post FASHION 2.0 – TRANSPARENCY & SUSTAINABILITY appeared first on UNISOT.


Elliptic

Chinese Businesses Fueling the Fentanyl Epidemic Receive Tens of Millions in Crypto Payments

Key points:

Most fentanyl trafficked into the United States is manufactured using precursors imported from Chinese suppliers.

Elliptic researchers received offers from more than 90 China-based companies to supply fentanyl precursors, 90% of which accepted cryptocurrency payments. Many mentioned that they have shipped the same chemicals to Mexico.

Many of these businesses were also willing to supply fentanyl itself, despite this being banned in China since 2019.

Elliptic’s blockchain analysis shows that the cryptocurrency wallets used by these companies have received thousands of payments, totaling just over $27 million, and that the number of transactions has increased by 450% year-on-year. $27 million would purchase enough precursor to produce fentanyl pills with a street value of approximately $54 billion.

Analysis of blockchain transactions also shows that a known fentanyl trafficker paid tens of thousands of dollars in Bitcoin to one of these suppliers. 

Fentanyl overdoses are now the leading cause of death for those aged 18-45 in the United States. Cheaper to produce than heroin and 50 times more potent, this synthetic opioid has fueled an epidemic over the past decade, as well as becoming a huge source of profits for international drug cartels.


For many years, China was the main source of illicit fentanyl. But in 2019, following intense diplomatic pressure from the US, the Chinese government regulated the drug – effectively banning its export.

However, far from stemming the tide, illicit fentanyl imports into the US have soared. That’s because Mexican drug cartels have stepped in to fill the void and seize the market – manufacturing their own fentanyl using precursors (chemical ingredients) imported from China. 

The US government has placed increasing emphasis on disrupting the financial activity of fentanyl traffickers and their supply networks. In a fact sheet published on April 11th, the White House indicated that it intends to “expand its efforts to disrupt the illicit financial activities that fund these criminals by increasing accountability measures, including financial sanctions”. 

Three days later, on April 14th, the US Department of the Treasury sanctioned several individuals and businesses in China for supplying precursor chemicals to drug cartels in Mexico for the production of fentanyl intended for the US market. The sanctions also listed cryptocurrency wallets used by these businesses to receive payments. 

The Elliptic research team identified more than 90 China-based chemical companies that were willing to supply fentanyl precursors, and that are offering cryptocurrency as a payment method.

Our researchers received offers to supply large quantities of one particular fentanyl precursor. This chemical is not used in the manufacture of any other products, and is a controlled substance in most countries. However, it remains unregulated in China. 

 


auth0

Startup Stories: SafeBase

How SafeBase Uses Auth0 to Lead the Way in Customer Trust

Coinfirm

GDPR and VASPs: Lessons Learned from Meta’s Historic Fine


GDPR and VASPs: Lessons Learned from Meta’s Historic Fine 

In light of the recent historic 1.2 billion euro fine levied against Meta Platforms Ireland Limited, Virtual Asset Service Providers (VASPs) worldwide should take heed. The unprecedented penalty, the highest ever under the General Data Protection Regulation (GDPR), sends a strong message about the EU’s commitment to data protection and privacy. 

Facebook’s Record Fine Sends a Strong Signal to VASPs: EU Regulations and Compliance are Crucial

Reason Behind Facebook’s Fine 

Meta’s predicament stemmed from its systemic transfer of personal data from the EU to the U.S. on the basis of standard contractual clauses (SCCs) since 16 July 2020. The GDPR requires an adequate level of protection for data transferred outside the EU, and Meta’s data handling practices were deemed non-compliant, leading to the severe fine. 

Key Aspects of GDPR 

The GDPR is a comprehensive legislation aimed at protecting the data and privacy of EU citizens. Key principles that organizations should adhere to include: 

Data Minimization: Processing only the necessary amount of data for a specific purpose.
Purpose Limitation: Ensuring that data is only used for a clearly stated and legitimate purpose.
Accuracy: Maintaining up-to-date and correct data, and promptly correcting or deleting any inaccuracies.
Storage Limitation: Storing data for no longer than necessary and in a way that permits identification of data subjects.

Beyond MiCA: Other Regulatory Considerations for VASPs 

While the recent MiCA regulation is an important consideration for VASPs planning to operate in the EU, it isn’t the only regulatory framework they need to abide by. GDPR compliance is equally crucial, and overlooking this can lead to serious penalties, as seen in Meta’s case. 

GDPR Essentials for VASPs 

To stay compliant with GDPR, VASPs should: 

Understand and apply GDPR principles in their data handling practices.
Keep track of where and how they store and transfer data.
Implement security measures to protect data.
Be transparent with users about data collection, use, and storage.

Choosing the Right Partners for GDPR Compliance

A critical part of maintaining GDPR compliance is choosing the right partners. When it comes to data protection and storage, partnering with companies that understand the complexities of GDPR and have a robust data security infrastructure is essential. Coinfirm, an EU-based firm, ticks all these boxes. Coinfirm’s servers are located within the EU, ensuring GDPR compliance and providing VASPs with peace of mind. 

Coinfirm, Your Partner in Compliance 

The substantial fine against Meta serves as a warning signal to VASPs: data protection isn’t just a legal requirement, it’s an integral part of responsible business practices. While the regulatory landscape may seem daunting, Coinfirm is here to help. As a GDPR-compliant partner, Coinfirm is not only equipped to provide clear, actionable blockchain analytics services, but also to guide VASPs through the complexities of operating within the EU. With Coinfirm, VASPs can confidently navigate the intricacies of GDPR, safeguarding their operations while respecting user privacy. 

The post GDPR and VASPs: Lessons Learned from Meta’s Historic Fine appeared first on Coinfirm.


Exclusive Webinar: Mastering Crypto Compliance


The world of cryptocurrency is evolving rapidly, and so are the regulations surrounding it. As the industry grows, it’s essential for businesses operating in the crypto space to stay ahead of the curve when it comes to compliance. Sanction Scanner and Coinfirm, two companies at the forefront of regulatory innovation, have joined forces to help businesses navigate the complex world of crypto compliance.

We are excited to announce an exclusive webinar, “Mastering Crypto Compliance,” presented by Claudia Naiba, Head of Regulatory and Training at Coinfirm, and Mario Duron, Crypto Compliance Expert from Sanction Scanner.

In this webinar, you will gain invaluable insights into

the general framework of crypto regulations,
how to balance innovation and compliance,
mastering compliance for crypto businesses,
exploring compliance strategies and best practices.

This is a must-attend event for anyone working in the crypto industry or looking to expand their knowledge of crypto compliance.

Webinar Details:

Title: Mastering Crypto Compliance

Presenters: Claudia Naiba, Head of Regulatory and Training at Coinfirm, and Mario Duron, Compliance Expert from Sanction Scanner

Date: 31.05.2023

Time: 3:00 PM – 4:00 PM CET


Don’t miss this opportunity to learn from the experts at Sanction Scanner and Coinfirm. To register for the webinar, please fill out the form below:

Registration is closed

Panelists

Claudia Naiba, Regulatory and Training Lead, Coinfirm  

Claudia Naiba is a highly accomplished Regulatory and Training Lead at Coinfirm, where her extensive expertise in Anti-Money Laundering (AML), Countering the Financing of Terrorism (CFT), and sanctions compliance has made her an invaluable asset to the team. With a strong background as a Senior Fraud Analyst, Claudia has dedicated over 13 years of her career to navigating and mastering the complexities of both FIAT and crypto-related industries.

Mario Duron, Compliance Expert, Sanction Scanner

Mario is a 13-year Compliance Subject Matter Expert with a high level of experience in traditional banking and FinTech compliance frameworks. His focus is to detect financial crimes, mitigate AML risks and inherent gaps in compliance operations. His expertise includes crypto & fiat investigations, risk assessments, sanctions screening, systems testing and reporting.

Hosted By

Monika Godek, Global Commercial and Business Development Director, Coinfirm

About Sanction Scanner and Coinfirm Partnership:

 

Sanction Scanner and Coinfirm have teamed up to bridge the gap between compliance in the crypto and traditional finance (TradFi) worlds. Sanction Scanner, a provider of AML and CFT solutions, offers AML screening for customers and transactions, AML transaction monitoring, KYB, and 360-degree customer risk assessment solutions to institutions. Coinfirm, the world leader in blockchain analytics and RegTech solutions, protects entities from being tainted with funds originating from illicit activities like ransomware hacks, human trafficking, and terrorist financing. This partnership aims to revolutionize crypto and TradFi sectors’ compliance, utilizing AI and machine learning in compliance to create a trusted and secure environment for businesses operating in both sectors[1].

The post Exclusive Webinar: Mastering Crypto Compliance appeared first on Coinfirm.


PingTalk

Breaking Chains: Blockchain and Sidechains in the Age of Decentralized Identity | Ping Identity


Decentralized identity (DCI) systems rely on public key infrastructure (PKI) cryptography to secure and manage identity information. When using Distributed Ledger Technology (DLT) each transaction represents a block in an ever-growing chain. All transactions in the chain are immutable so they cannot be tampered with and are distributed to all nodes in the distributed ledger. However, DCI does not require blockchain or other DLTs to function - more on that later. Now let’s break all this down a bit further. 


TBD

Hello DWAs - Building the decentralized future of PWAs

How DWAs are the decentralized future of PWAs

Whether or not you know the term, you know the experience of using a Progressive Web App (PWA): it’s the slick experience that enables you to install a web experience to your native device and enjoy the speed of a website with the tight system integration of a native app. They allow for experiences like recreating Starbucks’ native app on the web, or creating a highly 2G-optimized experience like what Uber created for some of their customers. PWAs take advantage of native features such as on-device storage, push notifications, and even windowing without requiring the overhead of a native app, and they can be installed without going through an app store.

In the same way that HTML5 and PWAs changed the way users expected to consume their apps through the minor addition of a manifest, an even larger shift has arrived in the form of Decentralized Web Apps (DWAs), which combine PWAs with Web5. While PWAs redefined the relationship between the client and server by enabling richer on-device experiences through native integrations, DWAs are going to completely redefine data storage for apps, resulting in a seismic shift in user privacy, ease-of-use, and data portability.

What changes in a DWA?

The basic client-server model that PWAs use is and has been the presumed standard for decades now. Users expect to open up their PWA, a client, that routinely makes API calls to a 3rd party service, the server, where all their data is stored and the business logic of the app lives. For example, a basic ToDo PWA would provide you a UI to interface with all your action items by calling the API service that accesses those items; fundamentally, the data and API layers live outside of the user’s control and ownership.

In DWAs, users own their own data and, as a result, their apps no longer need an API service because apps connect directly to the data source. This shift leads to virtually no UI/UX change, but triggers a radical change in app architecture because developers are now directly interfacing with a user’s data store instead of with app-owned server storage via API.

In DWAs, app data is stored in Decentralized Web Nodes (DWNs), which are user-owned and redundant data stores capable of transmitting data that provide strong promises of privacy and guaranteed ownership. DWNs can be permissioned publicly or privately relative to the querying decentralized identifier (DID), using permissions and protocols to allow select access to the data stored in them.

Going back to our ToDo app example, a user would be able to store all of their todo items and app data on their own DWN rather than on a specific 3rd party’s servers, and all the app would do is focus on creating a presentation layer and additional features around the app’s concept.

Upgrading to DWAs

If you want to migrate a PWA to a DWA, the only thing you’ll need to do is make sure that whatever data you were originally storing remotely and transmitting via API calls is now stored on the user’s DWN and is accessed using CRUD operations on the user’s DWN. For example, your ToDo app may have previously made an API call to write a new item to your list, but now you’d replace that API call with:

// Write a plain text record to the in-memory DWN
const myRecord = await web5.dwn.records.create(myDid.id, {
  author: myDid.id,
  data: todoItem,
  message: {
    dataFormat: 'text/plain',
  },
});

You might expect this section on how to migrate your PWA to a DWA to drag on, but that’s really it! Regardless of your front-end frameworks, you can easily turn your app into a DWA using our web5.js library with minimal code changes. So long as your application treats the user’s DWN as its data storage and allows the user to own their own data, your PWA is now a DWA.

Benefits of a DWA

Even though replacing all your API calls with DWA calls may be relatively easy work, convincing your users to get on board with Web5 and DWNs may be a challenge. So what’s in it for you and them when you build a DWA?

User ownership of data - users don’t have to worry about a server outage or be fearful of a company shutting down and losing all their data.

Data portability - because users store their own data in DWAs, they no longer need to be tied to a single service. Imagine, for example, a world with a decentralized Spotify and Tidal. If a Spotify user were to decide to leave the platform and migrate to Tidal, they’d own all their own playlist data and could then give playlist access to Tidal so that all their data migrates seamlessly.

Reduced operational overhead - you no longer need to maintain a massive array of databases, clusters, and other infrastructure to host your simple PWA. Instead, a user can visit your DWA, plug in their DWN, and get to work.

What’s Next?

If you’d like to try upgrading your PWA to a DWA or building your own from the ground up, you can use our web5.js library to interface with DWNs and DIDs. Additionally, you can find support on your developer journey by joining our Discord channel or by engaging on our GitHub page.

Monday, 22. May 2023

Finicity

Freddie Mac Expands Digital Capabilities to Help Lenders Reach More Qualified Borrowers


Freddie Mac (OTCQB: FMCC) today announced enhancements to its ground-breaking automated income assessment tool that allows lenders to assess a homebuyer’s income paid through direct deposit to also include the borrower’s digital paystub data. This detailed information can help lenders calculate income faster and more precisely to improve loan quality, simplify the mortgage process and, most importantly, expand access to credit.

“Over the last year, we’ve consistently rolled out innovations to ensure our digital tools are improving speed and efficiency, reducing risk and, ultimately, helping us serve our mission by reaching more qualified borrowers. Today’s innovation further automates income assessment by using historical direct deposit pay patterns and current gross income from recent paystubs, which can help more families achieve homeownership.”

Kevin Kauffman, Single-Family Vice President of Seller Engagement at Freddie Mac.

This new AIM capability will be available to Freddie Mac-approved Sellers using Loan Product Advisor beginning June 7, 2023. Finicity, a Mastercard Company, is the initial service provider supporting Freddie Mac’s AIM for income using direct deposits plus paystub.

Read more about this inclusive innovation here.

The post Freddie Mac Expands Digital Capabilities to Help Lenders Reach More Qualified Borrowers appeared first on Finicity.


Tokeny Solutions

Webinar recording #1

The post Webinar recording #1 appeared first on Tokeny.

Shyft Network

Veriscope Regulatory Recap — 9th May to 21st May

Welcome to another edition of Veriscope Regulatory Recap, your biweekly dose of crypto regulatory news from around the world.

Crypto regulatory developments are often a rollercoaster ride filled with unexpected twists and turns, and the last two weeks were no exception.

The EU cleared the path with the unanimous approval of the Markets in Crypto-Assets (MiCA) regulation, marking a significant stride in cryptocurrency regulations. Meanwhile, South Korea has been emboldened to take decisive actions amid political scandals, stepping up its crypto regulation game.

In the United States, the landscape continues to evolve, shaped by rigorous debates over the clarity of digital asset rules and who holds the reins over “stablecoins.”

So without further ado, let’s dive straight into it.

Global Crypto Regulation Rundown: The Balance Between Safety and Innovation Across EU, US, and South Korea

The European Union (EU) is making major strides, unanimously approving a new law known as the Markets in Crypto-Assets (MiCA) regulation. With MiCA, the EU aims to offer clear guidelines for people in the crypto market, providing a safety net for everyone involved.

But it’s not all rosy. The demands from crypto wallet providers and exchanges are high and might put a damper on new and creative ideas.

Meanwhile, over in the United States, the Securities and Exchange Commission (SEC) is taking heat. The issue? They aren’t being clear about their digital asset rules. Leading the protest is the United States Chamber of Commerce and Coinbase, a giant in the crypto world. They worry that this ambiguity could hurt the digital economy and hinder its growth.

On the other side of the globe, South Korea is taking decisive action. Prompted by some scandals involving cryptocurrencies, they’ve put their stamp of approval on a new crypto regulation bill. This sudden move showcases the urgency for a legal framework around digital asset trading. The heart of this bill is about making things clearer and more responsible in South Korea’s expanding crypto scene.

Back to the United States, a fascinating debate unfolds around “stablecoins.” The bone of contention here is who should be in control: the states or the federal government? Both sides have distinct views, but there’s a shared understanding that stablecoins are part of our future. And most agree that the U.S. should make the rules so businesses don’t move away.

Across the globe, the chorus is growing for clear, easy-to-follow rules in the world of cryptocurrency. We need protections for those in the digital asset market, but it’s also vital to encourage fresh ideas and industry growth. How well these new rules can strike this balance will be the key to their success.

Decoding the EU’s New Crypto Rules: MiCA’s Impact and Insights

The European Union (EU) has taken a significant step towards comprehensive cryptocurrency regulation with the unanimous approval of the Markets in Crypto-Assets (MiCA) legislation.

This development comes after the Economic and Financial Affairs Council, comprised of finance ministers from all EU member states, voted in favor of the MiCA bill and amendments to related regulations and directives.

Earlier in April, the European Parliament endorsed an unprecedented set of regulations for the crypto industry, adding weight to the MiCA legislation.

With an overwhelming majority of the members voting in favor, this initiative’s primary aim is to protect consumers from potential risks linked with crypto assets and ensure providers can be held accountable for any loss of investors’ assets. This regulatory effort signifies an important shift in the EU’s approach to crypto asset management and the broader financial industry’s future.


In line with the MiCA approval, the European Parliament greenlighted two additional pieces of legislation, further contributing to the detailed regulatory framework. These supplementary rules aim to regulate the information accompanying fund transfers and certain crypto assets, emphasizing transparency and accountability across the board.

One key provision of the MiCA regulation is that crypto platforms are now obliged to provide clear and comprehensive information regarding the risks associated with their services.

Though this is a win for consumers’ trust, the stringent licensing prerequisites for crypto wallet providers and exchanges could impact the pace of innovation and competition.

Another impactful aspect is the requirement for stablecoin issuers to hold substantial reserves. This protective measure can safeguard the market from potential shocks but also introduces a cap on the daily transactions of larger stablecoins, which could influence their growth.

The overall picture painted by these legislative measures is of a cautious yet forward-thinking EU seeking a balanced approach to harnessing the benefits of crypto assets while mitigating associated risks.

South Korea Takes Lead in Cryptocurrency Regulations Amid Scandal

South Korea’s National Policy Committee took a bold step forward by approving a landmark cryptocurrency regulation bill, the first of its kind in the nation’s history.

This action is a direct response to a controversy involving a high-profile opposition lawmaker, who has been accused of heavy cryptocurrency speculation. Now consolidated from 19 similar drafts, the bill is headed for the Legislation and Judiciary Committee, where it will face a thorough review.

According to legislators, this bill aims to protect digital asset investors, thwart unfair cryptocurrency trading, and establish a reliable framework for trading stablecoins. And in a significant move, it empowers the Financial Services Commission, South Korea’s premier financial regulator, to oversee cryptocurrencies.

Although the current bill is primarily centered around investor protection and fair trade, a separate bill focused on the intricacies of initial coin offerings, public disclosures, and market regulations is still under committee review.

Fueling the swift bipartisan support for the cryptocurrency regulation bill is the scandal involving Representative Kim Nam-kuk, who allegedly dabbled in sizeable speculative investments in various cryptocurrencies.

The disclosure of his extensive digital asset holdings has sparked a call for increased transparency, compelling both political parties to draft legislation that mandates lawmakers and public officials to disclose their cryptocurrency assets.

This shift in policy brings South Korea closer to the regulatory standards of the US and EU, where comparable disclosure laws were established in 2018 and 2020, respectively.

The legislation will undoubtedly profoundly impact the South Korean cryptocurrency ecosystem, offering a host of benefits and presenting certain challenges. Investors stand to gain from improved protection measures, thus fostering a safer environment in the crypto space.

The law also takes aim at unfair trading practices and establishes a robust structure for trading stablecoins, which could increase public confidence and participation in the cryptocurrency market.

However, stricter regulatory oversight may hinder the pace of innovation and deter potential new entrants to the crypto market. On top of that, the new legislation also could infringe on the privacy of lawmakers and public officials who will be obligated to disclose their crypto assets.

Interesting Reads

The Shyft Perspective — Hong Kong’s New Crypto Climate: the Regulated Road Ahead

Why Shyft Network?

What is Shyft Network?

Key Elements of Shyft Network

__________________________

VASPs need a Travel Rule Solution to comply with the FATF Travel Rule. Have you zeroed in on it yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more: https://www.shyft.network/veriscope, and contact our team for a discussion: https://www.shyft.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations, and sign up for our newsletter to stay current on all things crypto.


Ocean Protocol

Control Over the OCEAN Contract To be Revoked Soon: Technical

Remaining OCEAN tokens will be minted. OCEAN will be fully decentralized and unpausable. It will vest to community over decades

Overview article [here], technical article [this post]. May 25 update: it’s done:)

1. Summary

The OCEAN token contract has a 1.41B-token capped supply. By the end of May 2023, the Ocean core team will mint the remaining 56.518% of the supply. A multisig wallet holds this newly minted OCEAN; Ocean core team members are a minority of its signers. All of the newly minted OCEAN is for the community; most will be transferred to smart contracts that vest OCEAN over decades to fund Ocean Data Farming (DF). DF incentivizes growth of Data Consume Volume in the Ocean ecosystem.

With this minting done, we are then able to further reduce the OCEAN risk surface by renouncing ownership of the OCEAN token contract. This will make it un-pausable, and fully decentralized & censorship-resistant.

The rest of this article is organized as follows. Section 2 is an overview of system-level design and implementation phases. Section 3 elaborates on the implementation phases, with emphasis on OCEAN-related actions. The core team has completed the first two phases; the third and fourth are underway. Section 4 concludes.

2. Introduction 2.1 High-Level Overview

The image below shows the Ocean system-level design with an emphasis on DF funding flows. It overlays DF Main implementation phases.

Ocean system design + DF main implementation

The Ocean system-level design follows the Web3 Sustainability Loop. The green circle in the image is the heart of this loop: teams in the Ocean ecosystem do *work* to drive traction to the Ocean ecosystem in terms of DCV [0], protocol revenue, and OCEAN. Protocol revenue loops back to burn OCEAN and to further fund teams. Ocean Data Farming (DF) is the main approach to incentivize / fund the teams.

The image overlays the five phases of DF Main implementation, shown in colors of the rainbow, red → orange → yellow → green → blue: (A) Vesting & Splitter Contracts, (B) Recycling Multisig, (C) Actions around OCEAN contract, (D) Security Audit, (E) Remaining Deployments.

Aside: The implementation phases A-E are orthogonal to DF Main 1–4. Phases A-E are about implementation work, whereas DF Main 1–4 are about OCEAN vesting schedule. We aim to complete all implementation phases A-E during DF Main 1.

2.2 Overview: System-Level Design

In the system-level design, Ocean Data Farming is the main approach to incentivize / fund the teams.

For Data Farming, OCEAN flows from top to bottom as follows:

From the initial OCEAN contract (top left), to the 51% multisig [one-time]

From the 51% multisig to vesting wallets Vesting 0, Vesting A, Vesting B, Vesting C, Vesting D [one-time]

From vesting wallets to the Splitter contract [weekly, over decades]. More precisely, Vesting 0 and Vesting A vest for DF Main 1 [12 mos]; Vesting B for DF Main 2 [6 mos]; Vesting C for DF Main 3 [6 mos]; Vesting D for DF Main 4 [decades].

From the Splitter contract to the DF addresses DF passive, DF active, and DF future [weekly, over decades].

From the DF addresses to the Ocean core team and Ocean ecosystem teams. Work from these teams drives traction in the Data Ecosystem to output protocol revenue, which goes to the Community Multisig Wallet, then the Rev-Splitter, then (i) buyback & burn and (ii) the Splitter contract.

2.3 Overview: DF Main Implementation

(This section is adapted from “DF Main is Here” blog post. We repeat it in order to expand on it.)

Ratchet Principle. As DF Main involves a huge amount of OCEAN, we take extra precautions and follow the principle “ratchet up value-at-risk over time”. What this means: rather than sending all this OCEAN directly to the vesting contracts, we “buy time” to more thoroughly verify the system.

Implementation over time. We can ratchet up value at risk as we deploy more components, and put OCEAN into them. Here’s the order of operations.

Deploy canary, wire it up. First we will deploy a “canary” vesting contract for DF Main 1, funded with just 1000 OCEAN.

Verification. This allows us to test the system live, in production yet with a small amount of funds at risk. At the same time, we will initiate a bug bounty program and security audit for all these components.

Weekly rewards keep rolling. OCEAN payouts for DF29, DF30, etc. will be as scheduled for DF Main. A tiny amount will come from the “canary” vesting contract, and the rest will be topped up manually by the Ocean core team.

Deploy the rest. Once the verification is complete, we will deploy the remaining contracts and move the remaining OCEAN accordingly.

Final outcome. In completing the implementation work above, OCEAN vesting will be fully automated and on-chain. Then we will tackle decentralizing DF-main reward calculations, leveraging advances in decentralized compute infrastructure. This will serve the Ocean ecosystem well in the decades that follow, with more transparency, stability, and composability.

We now elaborate on each of the implementation phases A-E.

3. DF Main Implementation Phases 3.1 Implementation Phase A: Vesting & Splitter contracts

This phase implements the red🟥 phase in the image. This phase is complete.

First, we developed the VestingWalletLinear contract, which receives tokens and releases them to a target address according to a linear vesting schedule (a constant amount every week). We developed the VestingWalletHalving contract, which vests according to a Bitcoin-style exponential schedule. Finally, we developed a Splitter contract, which splits any inbound income to multiple output addresses. We moved those contracts to the Ocean contracts repo.
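A rough sketch of the release logic of these three contracts (a toy model under stated assumptions, not the audited Solidity implementations; the function names and the halving period parameter are illustrative):

```python
# Toy model of the three payout components described above.
# This is an illustrative sketch, not the actual Ocean smart contracts.

def linear_vested(total: int, weeks_total: int, week: int) -> int:
    """VestingWalletLinear-style: an equal amount released every week."""
    week = min(week, weeks_total)
    return total * week // weeks_total

def halving_weekly_emission(peak_weekly: int, week: int, halving_weeks: int) -> int:
    """VestingWalletHalving-style: Bitcoin-like schedule in which the
    weekly emission halves every `halving_weeks` weeks (assumed period)."""
    return peak_weekly // (2 ** (week // halving_weeks))

def split(amount: int, shares: dict) -> dict:
    """Splitter-style: divide inbound income across output addresses
    in proportion to their shares."""
    total_shares = sum(shares.values())
    return {addr: amount * s // total_shares for addr, s in shares.items()}

# Example: a 1000 OCEAN "canary" vesting linearly over 50 weeks, with
# each week's release split between passive and active DF rewards.
weekly = linear_vested(1000, 50, 1) - linear_vested(1000, 50, 0)
print(weekly)                                   # 20 OCEAN per week
print(split(weekly, {"df_passive": 1, "df_active": 1}))
```

The integer division mirrors how on-chain token amounts avoid fractional units; the real contracts operate on 18-decimal base units.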

Then, we did the work to “Deploy canary, wire it up.”

We deployed a VestingWalletLinear contract as the Vesting 0 Wallet. We deployed the Splitter contract. All deployments go to Ethereum mainnet.

We funded it with 1000 OCEAN. It will vest until the end of DF Main 1, with an equal amount every week.

We wired it together with the rest of the DF stack, including Gelato for Web3 automation. It now dispenses through components to the DF passive & active rewards addresses. Weekly payouts draw from this plus the main DF payment multisig, the latter being far larger.

3.2 Implementation Phase B: Recycling Multisig

This phase implements the orange🟧 phase in the image. This phase is complete.

Here, we simply deployed the recycling multisig wallet. This multisig — and the others — uses Safe (nee Gnosis Safe). The Appendix has multisig details.

3.3 Implementation Phase C: Actions around OCEAN contract

This phase implements the yellow🟨 phase in the image. This phase is in progress.

The OCEAN token contract has 1.41B tokens capped supply. 796,900,859 OCEAN (56.518%) hasn’t been minted yet. It is earmarked for the Ocean community, as follows.

719,100,000 OCEAN (51%) has always been earmarked for community incentives. Of this amount, 503,370,000 OCEAN (70% of the 51%) is for Data Farming with vesting over decades, held by the respective vesting wallets. 215,730,000 OCEAN (30% of the 51%) is for future community incentives projects, held by the 51% multisig wallet.

77,800,859 OCEAN (5.518%) is other community-earmarked OCEAN that hadn’t yet been minted. This will be managed from the OceanDAO community wallet (also multisig).
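The allocation figures above are internally consistent; a quick check (amounts in OCEAN, taken from this section):

```python
# Sanity-check the community allocation figures quoted above.
CAP = 1_410_000_000                  # capped OCEAN supply

community_51 = 719_100_000           # 51% earmarked for community incentives
data_farming = 503_370_000           # 70% of the 51%, vests over decades
future_incentives = 215_730_000      # 30% of the 51%
other_community = 77_800_859         # 5.518%, to the OceanDAO community wallet

assert community_51 == CAP * 51 // 100          # exactly 51% of the cap
assert data_farming + future_incentives == community_51
assert data_farming * 10 == community_51 * 7    # exactly 70% of the 51%

remaining = community_51 + other_community
print(remaining)                         # 796900859 OCEAN still to be minted
print(round(remaining / CAP * 100, 3))   # 56.518 (% of the cap)
```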

We (the Ocean core team) will take the following steps to mint the remaining OCEAN for the community and renounce ownership of the OCEAN token contract, by the end of May 2023. Update: this was completed on May 25, 2023.

On the OCEAN contract, call mint() of 77,800,859 OCEAN, sending funds to the OceanDAO community wallet. [Completed tx]

On the OCEAN contract, call mint() of 719,100,000 OCEAN (51% of OCEAN supply), sending funds to the 51% multisig wallet. [Completed tx]

On the OCEAN contract, transfer the pauser role to 0x00..00. [Completed tx]

On the OCEAN contract, transfer the minter role to 0x00..00. [Completed tx]

On the OCEAN contract, call renounceOwnership(). [Completed tx]
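As a toy model of that sequence (not the actual Solidity token contract; the class, wallet names, and pre-2023 minted figure are assumptions for the sketch, the latter derived as the cap minus the 796,900,859 still unminted):

```python
# Toy state machine for the mint-and-renounce sequence above.
ZERO = "0x" + "0" * 40   # zero address used to revoke roles

class OceanToken:
    CAP = 1_410_000_000

    def __init__(self, minted: int, owner: str):
        self.minted, self.owner = minted, owner
        self.minter, self.pauser = owner, owner

    def mint(self, amount: int, to: str, balances: dict):
        assert self.minter != ZERO, "minting was renounced"
        assert self.minted + amount <= self.CAP, "would exceed cap"
        self.minted += amount
        balances[to] = balances.get(to, 0) + amount

    def renounce(self):
        # Transfer pauser and minter roles to the zero address, then
        # renounce ownership: no further mints or pauses are possible.
        self.minter = self.pauser = ZERO
        self.owner = ZERO

balances: dict = {}
token = OceanToken(minted=613_099_141, owner="core_team")  # 1.41B cap minus 796,900,859
token.mint(77_800_859, "oceandao_wallet", balances)
token.mint(719_100_000, "multisig_51pct", balances)
token.renounce()
print(token.minted == OceanToken.CAP)   # True: the full cap is now minted
```

Once `renounce()` runs, any further `mint()` fails, which is the sense in which the real contract becomes un-pausable and fully decentralized.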

In doing this, the OCEAN token contract will be fully decentralized and censorship-resistant. It can no longer be paused.

Then, the 51% multisig and OceanDAO wallets hold the newly minted OCEAN.

3.4 Implementation Phase D: Security audit etc

This phase implements the green🟩 phase in the image. This phase is in progress.

The work for phase D is:

Security audit. Line up security auditors. Get the audit results. Fix smart contracts as needed.

Bug bounty. Add the Vesting contracts & related components to Ocean’s long-running Immunefi bug bounty.

3.5 Implementation Phase E: Remaining deployments (once security ok)

This phase implements the blue🟦 phase in the image. This phase will be performed once phase D is complete, and we’re comfortable with the stability of DF & Vesting, from bug bounty and otherwise.

Whereas implementation Phase A deployed & funded Vesting Wallet 0 as a “canary”, in this phase we deploy & fund remaining vesting wallets: Vesting A, B, C, D.

The work for phase E is:

Deploy a VestingWalletLinear contract as the Vesting A wallet, for DF Main 1. Wire it up side-by-side with the Vesting 0 wallet, via Gelato. Fund it with OCEAN. How much to fund: 150,000 OCEAN / week; this depends on the number of remaining weeks in DF Main 1 (up to 7,800,000 OCEAN, if it ran for all of DF Main 1).

Wait 1+ months to ensure that Vesting A is ok. Then…

Deploy a VestingWalletLinear contract as the Vesting B wallet, vesting over 6 months starting at DF Main 2. Send it 7,800,000 OCEAN, such that it dispenses 300,000 OCEAN / week over 26 weeks.

Deploy a VestingWalletLinear contract as the Vesting C wallet, vesting over 6 months starting at DF Main 3. Send it 15,600,000 OCEAN, such that it dispenses 600,000 OCEAN / week over 26 weeks.

Deploy a VestingWalletHalving contract as the Vesting D wallet, vesting over decades starting at DF Main 4. Send it 472,170,000 OCEAN [1], which dispenses >1.1M OCEAN / week to start, then decays over time according to its halving-based schedule.

4. AMA

To field questions by the community, the Ocean core team will hold an AMA about the OCEAN actions, on May 25 at 12.30 UTC. Follow @oceanprotocol on Twitter for more details.

5. Conclusion

The Ocean core team will mint the remaining 56.518% OCEAN tokens and transfer them to a multisig where the core team is a minority.

All of the newly minted OCEAN is for the community, for Data Farming and more. Most of it vests over decades.

OCEAN token contract will be un-pausable, fully decentralized & censorship-resistant.

The core team continues to execute on DF Main implementation phases and related OCEAN actions. Three of five phases are already complete.

Notes

[0] “Data Consume Volume” (DCV) is the USD$-denominated amount spent to purchase data assets and consume them, for a given time period (e.g. one week)

[1] The amount going to the Vesting D wallet is calculated as: (70% of the 51% supply) minus (amounts to DF Main 1, 2, 3) = 503,370,000 OCEAN minus (7,800,000 maximum + 7,800,000 + 15,600,000 OCEAN) = 472,170,000 OCEAN. It will be a bit more if Vesting A doesn’t run for all of DF Main 1, and we already know this to be the case.
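A quick check of note [1]’s arithmetic, using the 503,370,000 OCEAN Data Farming allocation (70% of the 51%) from section 3.3:

```python
# Verify the Vesting D amount from note [1] (amounts in OCEAN).
df_total  = 503_370_000    # 70% of the 51% community allocation
df_main_1 = 7_800_000      # Vesting A maximum: 150,000 / week over 52 weeks
df_main_2 = 7_800_000      # Vesting B: 300,000 / week over 26 weeks
df_main_3 = 15_600_000     # Vesting C: 600,000 / week over 26 weeks

vesting_d = df_total - (df_main_1 + df_main_2 + df_main_3)
print(vesting_d)           # 472170000, the amount sent to the Vesting D wallet

# Weekly rates implied by the wallet sizes above:
assert df_main_1 // 52 == 150_000
assert df_main_2 // 26 == 300_000
assert df_main_3 // 26 == 600_000
```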

Updates

[May 25, 2023] The transactions to revoke control of OCEAN have been completed. They’re listed as “[completed tx]” in section 3.3.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

Control Over the OCEAN Contract To be Revoked Soon: Technical was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Control Over the OCEAN Contract To be Revoked Soon: Overview

Remaining OCEAN tokens will be minted. OCEAN will be fully decentralized and unpausable. It will vest to community over decades

Overview article [this post], technical article [here]. May 25 update: it’s done:)

1. Introduction

Ocean Protocol is ready to take the next step in our evolution. Our utmost priority is to nurture the Ocean community and protect holders of the OCEAN token. Just as one cautiously feels their way barefoot across a stone riverbed, we have always taken the interests of token holders as our priority, moving forward one careful step at a time.

2. Need for a Native OCEAN Token

In the Ocean whitepaper, we envisioned that a Data Economy could be kicked off through the use of programs and incentives that allow people to join the community and contribute meaningfully. Crypto-tokens represent a novel innovation in giving people a small slice of an emerging protocol. Crypto-tokens also can help to align activities amongst a globally dispersed group of people who want to move an idea forward. When used appropriately, tokens can help to sustain and drive the growth of protocols.

3. Tokens Nurture the Ocean Community

Over the past six years, we have had many programs such as Shipyard, Ambassadors, OceanDAO Grants, and Data Challenges to encourage people to explore the value of data and to try out new business ideas. We learned that many of the programs require significant coordination and overhead to weed out scammers and reward true contributors who are aligned with the mission and values of Ocean.

The objective is to encourage independent, non-correlated activities at scale that are the sign of healthy, sustainable protocol communities. Our goal has always been to set up long-term incentive mechanisms that can outlast the involvement of the founding team and serve the interests of the Ocean community.

We realized that the most responsible way to encourage participation in the network with the least amount of administrative overhead was to tie incentives to the publishing of data assets and algorithms, and real consumption of assets within the Ocean ecosystem. The blockchain doesn’t lie and the truest form of proof is when people put their tokens behind their convictions. The result of this is Data Farming.

4. Data Farming as a Long-Term Sustainable Incentive Program

Data Farming incentivizes both active and passive participation in the Ocean ecosystem. Token holders earn OCEAN passively for locking up their tokens for a fixed time period. Once their OCEAN is locked, they can then actively stake the derivative veOCEAN token on data assets, to help signal quality and earn Data Farming rewards. The Data Farming program is being automated so that the administrative overhead is minimal and manageable as adoption of Ocean technology scales.
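A rough sketch of the lock-for-veOCEAN mechanic described above (assuming the Curve-style vote-escrow pattern with a 4-year maximum lock; the names and parameters here are illustrative, not taken from the Ocean contracts):

```python
# Sketch of the vote-escrow ("ve") balance used in Data Farming.
# Assumption: Curve-style linear scaling with a 4-year maximum lock.

MAX_LOCK_WEEKS = 4 * 52

def ve_balance(locked_ocean: float, lock_weeks: int) -> float:
    """veOCEAN received for locking `locked_ocean` for `lock_weeks`.
    Scales linearly with lock duration, capped at the maximum lock."""
    lock_weeks = min(lock_weeks, MAX_LOCK_WEEKS)
    return locked_ocean * lock_weeks / MAX_LOCK_WEEKS

print(ve_balance(1000, MAX_LOCK_WEEKS))   # 1000.0: a max lock gives 1:1
print(ve_balance(1000, 52))               # 250.0: a 1-year lock gives 1/4
```

The longer the lock, the more veOCEAN there is to stake on data assets, which is how passive locking feeds into active curation rewards.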

With Data Farming in place and road tested for almost one year, a core piece of the incentive puzzle is now in place. Over the next few years, we will improve and secure the mechanism but after nine months of testing Data Farming and the previous years of gingerly testing out incentive programs, we’re now confident that we can finally mint the remaining OCEAN in the token contract to a multi-signature wallet and revoke any pause or destruct capabilities on the OCEAN token contract. By revoking all rights, OCEAN token holders can rest assured that their holdings are completely safe and will live indefinitely on the Ethereum network.

Find out more here.

5. Crypto as a General-Purpose Technology

Our hypothesis is that crypto is a general-purpose technology. Crypto is a 14-year-old general purpose technology kicked off in 2009 with Bitcoin and if our hypothesis is correct, the journey to integrate it into society likely requires another 20 years.

In 2005, Dr. Lipsey et al. posed the idea that all human progress can be attributed to some 30-odd technologies that drive innovation and new products. Generally, these technological innovations are so disruptive and wide-reaching that it takes at least 30 years for them to weave themselves into the fabric of society and become ubiquitous.

As a comparison, work on the Internet began in the 1960s, but it was only in 1989, when Tim Berners-Lee invented the World Wide Web, that the Internet could begin its rapid rise to ubiquity as a consumer and business essential. Counting forward 30 years from 1989, we land at 2019. Most people can agree that by 2019, the Internet was ubiquitous and inseparable from our modern society.

For Ocean, the interplay between crypto, data and AI is obvious now with the release of ChatGPT, Bard and other large language models (LLMs), and the rate of innovation could happen faster than expected.

Our core hypothesis at the founding of Ocean was that, in a world where AI agents are economic agents, the means of exchanging value would be over blockchains, smart contracts and crypto tokens.

AI agents will use web2 services once web2 services integrate with web3 to allow native blockchain based identity and profiles — and delivery of value will be notarized and orchestrated on blockchains, along with the payments.

These are the reasons why we launched Ocean Protocol. To give an incentive structure for AI, data and blockchains to work together beside human agents. This is also the reason why we have a long-term, Bitcoin-like emission schedule of OCEAN to encourage the adoption of Ocean technology over time — because these fundamental business and mindset changes take time.

6. Allocation of OCEAN Tokens

The Ocean token contract has a maximum cap of 1.41 billion OCEAN.

In 2019, 613.1 million OCEAN was minted, with the majority of tokens distributed to token acquirers, the Ocean community via grants, and the Ocean Protocol Foundation for administrative expenses. The single mint in 2019 represented 43.5% of the total allowable token supply, leaving 797 million OCEAN, or 56.5% of the token supply, remaining to be minted.

As of May 2023, 100% of the OCEAN token supply will be minted, with the newly minted tokens deposited into OceanDAO multisignature wallets with seven signers. The signers are made up of core Ocean team members, Ocean community members, and founders of other web3 projects, with the core team members in the minority. The final allocation of all 1.41 billion OCEAN is shown in Diagram 1 below:

Diagram 1: Allocation of Ocean Tokens

When 100% of OCEAN is minted, 807 million OCEAN, representing 57.2% of the total supply, remains to be distributed for Data Farming and incentive programs according to a multi-decade emission schedule. There is also funding for existing community programs and administrative expenses.

Diagram 2: As indicated by the dotted line, 57.2% (807 million) of OCEAN tokens remain undistributed as of 5/2023. The rest of the OCEAN tokens have been distributed.

7. Inflation Rate of the OCEAN Token Supply

There will be no major injection of OCEAN into the circulating supply at this stage. The number of tokens disbursed will be gradually ratcheted up in lockstep with adoption, so the inflation rate in the OCEAN token supply will be at a level to attract new participants while protecting existing holders.

The main source of token inflation moving forward occurs as part of the oceanDAO Data Farming program. The emission schedule is modelled on Bitcoin and serves as a long-term, multi-decade program to incentivize passive and active participation in the Ocean ecosystem. Most importantly, Data Farming is designed to be automated with little or no intervention from the core team.

The Data Farming program has been running for nine (9) months and we have given ourselves an additional 2 years to test and harden Data Farming. If the adoption requires more time or the incentive mechanisms need more time to mature, less OCEAN will be disbursed.

Assuming the mechanisms work as designed and adoption of Ocean Protocol tools proceeds as planned, the maximum disbursement of OCEAN to curators and the network starts in late-2025 and continues for 4 years (Diagram 3). At peak emissions in Year 2, the inflation rate would be 8%: 52 million OCEAN emitted into a total liquid supply of approximately 650 million OCEAN. At the first halvening in Year 6 (Week 330), the emission rate falls to 500,000 OCEAN per week and the resulting inflation rate drops to 3%.

Diagram 3: Data Farming Emission Schedule

You can find out more about Data Farming and the emission schedule here.
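The inflation figures quoted above can be reproduced from the stated emission numbers; the implied liquid supply at the first halvening is an inference from those figures, not a number from this post:

```python
# Reproduce the inflation figures quoted above (amounts in OCEAN).
peak_yearly_emission = 52_000_000       # Year 2 peak emission
liquid_supply_at_peak = 650_000_000     # approximate liquid supply then
print(round(peak_yearly_emission / liquid_supply_at_peak * 100))   # 8 (% inflation)

post_halvening_weekly = 500_000         # rate after the Week 330 halvening
post_halvening_yearly = post_halvening_weekly * 52   # 26,000,000 / year
# A 3% inflation rate at that point implies a liquid supply of roughly:
implied_supply = post_halvening_yearly / 0.03
print(round(implied_supply / 1e6))      # 867 (million OCEAN, inferred)
```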

We believe that crypto adoption will continue, but between now and 2025 there are broader macro-economic headwinds due to existing global disruptions, the ongoing banking crisis, and unsustainable national debt loads. In two years, the adoption of AI and web3 technologies could be ripe and align with the ratcheting up of OCEAN token emissions to capitalize on the long-term trend.

8. Valuation of the OCEAN Token Ecosystem

With the current circulating supply of OCEAN at 605 million, the value of the network is $200 million assuming today’s price of $0.33/OCEAN.

With the minting of the remaining 56.5% of the supply to reach maximum cap, the fully diluted value of the Ocean network is $465 million. Meanwhile, the OCEAN token contract is fully decentralized, censorship-resistant and can no longer be paused or destroyed.
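Both valuation figures follow directly from supply times price:

```python
# Network valuation at the quoted price of $0.33/OCEAN.
price = 0.33
circulating = 605_000_000           # current circulating supply
max_supply = 1_410_000_000          # capped supply after the final mint

print(round(price * circulating / 1e6))   # 200 ($ million, circulating value)
print(round(price * max_supply / 1e6))    # 465 ($ million, fully diluted)
```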

The Ocean core team will notify the main coin portals CoinMarketCap and Coingecko and provide quarterly updates to the token supply so that community members can rely on them.

AMA. To field questions by the community, the Ocean core team will hold an AMA about the OCEAN actions, on May 25 at 12.30 UTC. Follow @oceanprotocol on Twitter for more details.

9. Upwards and Onwards

Trent and Masha McConaghy posed the question 10 years ago of whether digital art could truly be owned, traded, and transparently tracked. This exploration led directly to Ocean Protocol - a realization that AI, data and blockchains were naturally suited to unlock immense value and human potential.

No one can predict the future, but if the past is any indicator, many of our initial hypotheses were proven correct with the rise of NFTs, decentralized markets for tokens, and an explosion of AI + data startups.

The OCEAN token is a fundamental part of the Ocean ecosystem. With this minting of OCEAN tokens and the revocation of all control over the OCEAN token contract, it’s now up to all Ocean community members (including the core team) to imbue value in the token through real technology advances and real adoption one data scientist and user at a time.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

Control Over the OCEAN Contract To be Revoked Soon: Overview was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

No Bluffing at Identiverse: Go All-In with CIAM | Ping Identity

Ping Identity is excited to see our customers, partners, and other IAM leaders at Identiverse 2023 in Las Vegas from May 26th to June 2nd. So, what does the Entertainment Capital of the World and Customer Identity and Access Management (CIAM) have in common? Las Vegas casinos and CIAM programs both aim to balance security and convenience by delivering delightful customer experiences…

Ping Identity is excited to see our customers, partners, and other IAM leaders at Identiverse 2023 in Las Vegas from May 26th to June 2nd.

So, what does the Entertainment Capital of the World and Customer Identity and Access Management (CIAM) have in common? Las Vegas casinos and CIAM programs both aim to balance security and convenience by:

Delivering delightful customer experiences while managing risk

Verifying the identity of customers and authorizing transactions

Profiling and progressively getting to know customers to understand preferences, connect them to services they desire, and provide benefits

Guiding customers on a journey from unknown to loyal

Evaluating behavior to separate the bad actors from the good

Protecting against fraud, counterfeiting, and impersonation

Supporting omnichannel experiences across physical locations (casinos) and digital properties (gambling apps)

Ensuring resiliency across operations

In today's digital-first world, we must strongly identify, authenticate, and authorize users in order to boost security and protect our customers. This is true whether you’re an online business or a casino on the strip. Hackers and fraudsters, like casino cheaters, are constantly changing tactics to steal money. We must aggressively and purposefully consider new solutions and different approaches to stay one step ahead and prevent fraud. 

To combat threats, more companies are offering multi-factor authentication (MFA) options to safeguard users. And their customers are noticing! Surveys indicate consumers put more trust in brands that offer additional security measures. To meet this expectation, businesses should build consumer adaptive MFA into digital services and leverage it as an eventual springboard to passwordless. Or, if possible, companies should leapfrog right to passwordless for new and existing customers. We need to use tools like MFA to slow down suspicious users when risk is high and, conversely, speed up trusted customers when risk is low, striking the elusive balance between security and convenience.
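The risk-balancing idea described above can be sketched as a simple step-up policy. The thresholds, factor names, and the risk score itself are hypothetical illustrations, not Ping's actual product logic:

```python
# A minimal sketch of risk-adaptive authentication, assuming a risk score
# in [0, 1] produced by some signal-evaluation service (hypothetical).

def required_factors(risk_score: float) -> list[str]:
    """Step up authentication as risk rises; keep low-risk logins fast."""
    if risk_score < 0.3:
        return ["passkey"]                                   # frictionless
    if risk_score < 0.7:
        return ["passkey", "push_approval"]                  # step-up MFA
    return ["passkey", "push_approval", "manual_review"]     # high risk

print(required_factors(0.1))  # ['passkey']
print(required_factors(0.8))  # ['passkey', 'push_approval', 'manual_review']
```

The design point is that friction is proportional to risk: trusted customers see one passwordless factor, while suspicious sessions are slowed down.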

The term "playing with house money" basically means gambling without risk of loss. This is not the case with most digital relationships today. Casinos employ experts to simultaneously profile and surveil their customers to separate good actors from suspected bad actors. Similarly, the latest threat detection tools can evaluate risk signals in real time throughout the user session to determine trust and ensure valid users can transact quickly and easily. That way, your business won’t have to gamble on possible fraudulent activity.

In summary, the threats that Vegas casinos face are as vast and varied as the digital threats facing businesses today. We all must focus on securing people, businesses, and transactions without sacrificing experience to grow loyalty and revenue. It’s the only way to give our customers extraordinary experiences that keep them coming back for more. 

During his keynote speech on the opening day of Identiverse titled Identity Under Attack, Andre Durand, Ping’s Founder & CEO, will discuss how AI is changing cybersecurity. We also invite you to join us at the Founders Party at On the Record inside of Park MGM on June 1st from 7:00–11:00 PM.

Plus, don’t miss an opportunity to give back at Identiverse with our latest Ping4Good initiative! Summer may have just started, but local charities are gearing up to support families in need by providing them with classroom materials. That’s why Ping Identity is partnering with A Precious Child to provide 3rd-5th grade students school supplies, and all attendees at Identiverse can help! Just stop by the Ping Identity Booth (Booth 1303) to build a bag for kids in need.

To request a 1:1 meeting or learn more about all the Ping-related fun, check out our event page for Identiverse.

Sunday, 21. May 2023

KuppingerCole

Analyst Chat #173: Controlling the Accelerator for Secrets Management

Graham Williamson, Fellow Analyst with KuppingerCole, shares his insights and expertise with our host Matthias Reinwarth as they discuss the lessons learned from Graham's research on secrets management. They also explore the concept of "Machine Identity" and why it's important for businesses to understand. Finally, they discuss how companies can best utilize the information presented in Graham's research…

Graham Williamson, Fellow Analyst with KuppingerCole, shares his insights and expertise with our host Matthias Reinwarth as they discuss the lessons learned from Graham's research on secrets management. They also explore the concept of "Machine Identity" and why it's important for businesses to understand. Finally, they discuss how companies can best utilize the information presented in Graham's research to improve their secrets management strategies.



Friday, 19. May 2023

Entrust

Zero Trust is More Than a Slogan

When it comes to Zero Trust, the conversation has moved from being a nebulous term... The post Zero Trust is More Than a Slogan appeared first on Entrust Blog.

When it comes to Zero Trust, the conversation has moved from being a nebulous term several years ago to a well-understood framework with a true north that has broad acceptance across the cybersecurity industry.

That said, confusion abounds among C-suite executives and boards. If you walked around the exhibition at the 2023 RSA Conference, you could hear from a networking specialist that Zero Trust is all about IP, and then from an identity vendor that it’s all about credentials. Defining Zero Trust narrowly runs the risk of creating a false sense of security – if you just secure the network or the credentials, you’re done with the Zero Trust journey.

That’s why the recent release of the latest CISA Zero Trust Maturity Model in April 2023 provided a much-needed reference point for government agencies as well as enterprises on how to develop Zero Trust strategies. Zero Trust is broader than any one vendor’s solution. It is an approach that can govern user identity and access management (IAM), and access and security of devices, networks, application workloads, and data, with a risk-based access control model that can be implemented across an organization’s entire user base and IT stack.

CISA’s Zero Trust Maturity Model Pillars

When we talk to our customers, we talk about Zero Trust foundations. Regardless of which vendors you choose, Zero Trust is a journey, one that you can track along the maturity model. This includes making sure you’re securing the network and credentials – for humans, machines, and applications. In today’s post-perimeter environment, it means implementing dynamic authentication that is verified rather than assumed and secures your data assets with encryption and key management best practices, and ensuring development code is signed and approved.

In fact, that last item is a great case in point when it comes to the potential Zero Trust blind spots that arise when you don’t think holistically. When developing applications, code is brought in via APIs, libraries, and repositories, and analysis shows that as much as 70% of an organization’s code is assembled rather than written. Applications deployed within an organization, and the software bills of materials that make them up, aren’t always considered in Zero Trust planning, but they absolutely should be. How you certify code that comes from outside your organization, and the risk profile you apply to it, should be part of that framework.

Ultimately, the shift in thinking boils down to moving from the idea of trust to verification. So instead of trusted identities and applications, we have verified identities and applications. The second part of that shift is the delta, the change over time: we don’t just verify once, but continuously. This change can be measured over time, geography, and access entitlement, and it is aligned with the risk profile for any given user, machine, or application.

This approach is a reflection of the Zero Trust maturity model, which has gained significant attention from cybersecurity professionals and organizations alike. For most organizations, the current state of Zero Trust maturity is still in its early stages.

We recognize that implementing a Zero Trust security model requires a significant investment of time and resources. It involves assessing the current state of security controls and infrastructure, identifying gaps and weaknesses, and developing a plan to address those gaps. While many have a long way to go before they can claim to have a fully mature Zero Trust security posture, there are excellent resources available to help map out that journey, as well as partners like Entrust that can offer the guidance and broad portfolio needed to get there.

The post Zero Trust is More Than a Slogan appeared first on Entrust Blog.


Indicio

Polly Wants Self-Sovereign Identity – Taking Control of Your Digital Identity w/ Indicio

This Week in Enterprise Tech The post Polly Wants Self-Sovereign Identity – Taking Control of Your Digital Identity w/ Indicio appeared first on Indicio.

Global ID

Introducing the Universal Namespace

At GlobaliD, we’ve been on a mission from day one to provide every individual with a digital identity that they own and control. Last year, we took that one step further, giving users the ability to own and control their money with the release of our non-custodial wallet. It’s also why, today, we’re announcing that we’re transitioning to the Universal Namespace. This will allow users to own and control…

At GlobaliD, we’ve been on a mission from day one to provide every individual with a digital identity that they own and control. Last year, we took that one step further, giving users the ability to own and control their money with the release of our non-custodial wallet.

It’s also why, today, we’re announcing that we’re transitioning to the Universal Namespace. This will allow users to own and control a unique name associated with their digital identity through an independent third-party namespace rather than one managed by GlobaliD.

A namespace is simply a set of unique names that are tied to different types of things so they can be easily identified. In the context of the internet, the Domain Name System or DNS namespace tied domain names to relevant IP addresses so that people could easily find the websites they were looking for.

The concept of a namespace is key to the mainstreaming of digital identity — just as it was for the mainstreaming of the internet. The early internet was a fragmented mess. The only way to find anything was through hard-to-memorize IP addresses. With the arrival of DNS, we could now visit wikipedia.com rather than 80.190.205.15.

In a sense, the DNS namespace made the internet more human.

The same goes for digital identity, which is even more complex than simply visiting a website. Someone you want to interact with may have many public keys, URLs, or platform accounts you need to keep track of. Now, all you need is their unique name.
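A namespace's job can be pictured as a lookup from one unique name to all of a person's identifiers. The record and names below are entirely hypothetical, not the UNS data model:

```python
# Hypothetical sketch of a name -> identifiers mapping, analogous to DNS.
# All names and values here are invented for illustration.
namespace = {
    "alice": {
        "public_key": "z6MkhaXgBZD...",   # truncated example key
        "url": "https://alice.example",
        "platform_accounts": ["@alice"],
    },
}

def resolve(name: str) -> dict:
    """Return every identifier registered under a unique name."""
    record = namespace.get(name)
    if record is None:
        raise KeyError(f"unknown name: {name}")
    return record

print(resolve("alice")["url"])  # https://alice.example
```

Just as DNS lets you remember one domain instead of an IP address, resolving one name yields all the keys, URLs, and accounts behind it.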

At GlobaliD, we’ve been thinking deeply about namespaces over the last few years — we published our first white paper on the subject in the beginning of 2019. As an early mover in the digital identity space, we’ve often had to come up with original concepts and terms in order to help develop and push the industry forward.

That’s also been the case when it came to a namespace for identity. As such, we originally built a namespace in-house. But the long term vision was always for the namespace to be managed independently outside of GlobaliD. That’s something we laid out in our white paper. Ultimately, we believe the namespace should be a public good.

We now have the opportunity to make that vision a reality.

For GlobaliD, this is a big step toward interoperability with other digital identity platforms. For instance, GlobaliD may support additional namespaces in the future, giving users a choice over their service provider. Regardless of which namespace you choose, your unique name will always be attached to a verifiable credential that you control.

In all, this is an incredibly exciting evolution for GlobaliD’s digital identity offering.

The transition will require one additional step for you, the user, but we truly believe that the end destination — where individuals control their name, their identity, and their money — is worth the effort.

The simplest way for this transition to work is for existing users to re-onboard with GlobaliD and claim a new name through the Universal Namespace.

This transition does mean that users will lose their existing identity history on the platform once they re-onboard with a new name. That includes things like chat history, vouches, and verifications.

This wasn’t an easy decision to make and we’re incredibly empathetic to our early adopters, but ultimately, it became clear that this was the best path forward, one most aligned with our values and vision.

For GlobaliD, this clean slate is also an opportunity for us to provide major upgrades to our messaging and wallet platforms as well as a bevy of new features that will arrive with our next generation digital identity platforms, which will be released in the coming weeks.

In the meantime, users will still be able to access their wallets via their private key, but they will no longer be able to access it through the app.

For now, you can head over to the Universal Namespace (/uns) to reserve your unique name. Make sure you sign in with the same phone number you’ll use with GlobaliD so that we can link your name to your GlobaliD identity.

You should also keep an eye out for notifications from the current GlobaliD app that will include prompts and instructions which will help you throughout the re-onboarding process.

In all, we’ve got plenty in store for all of you in 2023.

Q&A

Will I be able to keep any part of my current digital identity?

No, you will lose access to all of the data that was associated with your old name including chat history, groups you joined, and funds in your wallet. However, you can transfer funds out of your wallet now before the transition takes place, or you can use your keys to move your entire wallet to another non-custodial client. If you are locked out of your wallet before you have a chance to migrate it, you can reach out to our support team for help exporting your non-custodial wallet recovery key.

Will I be able to keep my old username?

No. The Universal Namespace (UNS) is a new service which will manage GlobaliD names moving forward. All existing users of GlobaliD will re-onboard with UNS by claiming a name on this separate, independent namespace. Because UNS has been managed independently of GlobaliD, some names have already been reserved and owners of GlobaliD names may not have their names available on UNS. If your old name is available in UNS, then you can claim it and re-onboard using that name.

Will the Universal Namespace reserve my existing name?

No. You need to reserve your new name on your own. You will be able to do this within the updated GlobaliD app upon release, but you do not have to wait. You can go ahead and reserve your new name now by visiting the Universal Namespace website.

What if I have money or assets in my GlobaliD Wallet?

You will need to transfer funds from your current GlobaliD Wallet to a new wallet by May 30, 2023 in order to maintain access to your money. Alternatively, you can export your wallet recovery phrase to host your wallet with another app. A new wallet tied to your username will be made available to you upon release of the updated GlobaliD app.

What is the Universal Namespace?

The Universal Namespace (UNS) is a discoverability platform that enables people to securely connect their online accounts and data with a unique name. With UNS, users can determine how they are found online by building a public profile that connects all of their digital identities to one universal name they own and control — making it easier for them to be discovered across different ecosystems. UNS is managed independently of GlobaliD and provides name services to multiple providers, in addition to GlobaliD. You can obtain a name by signing up with your phone number on the Universal Namespace website.

Can I export any of my data before it is deleted?

Yes. If you are locked out of your wallet before you have a chance to migrate it, you can reach out to our support team for help exporting your non-custodial wallet recovery key. Otherwise, go to your Wallet tab and tap on the key icon in the upper right-hand corner to get your wallet recovery phrase. You can also contact our support team if you would like us to export your chat history. All other data such as your credentials, verifications, and public bio information will be lost.

What is the deadline to reserve my new username and transfer funds from my wallet?

We will release the updated GlobaliD app within the next few weeks and notify you when it is available so that you can re-onboard. However, you do not have to wait for this release in order to claim your new name. You can go ahead and reserve your new name now by visiting the Universal Namespace website. You can also go ahead and transfer the funds from your current GlobaliD Wallet to a new wallet in order to maintain access to your money.

Introducing the Universal Namespace was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Fission

Fission Fridays: May 19th, 2023

Announcements Passkey support has arrived for ODD SDK! Check out the blog post and demo videos, or visit the GitHub repo to get started today! Next week we're sharing the recorded Causal Islands talks on our YouTube Channel. Subscribe to be notified when they are posted! New On The Blog Introducing Passkey Support WebAuthn API, or Passkeys, is a user authentication system that replaces passwords…
Announcements

Passkey support has arrived for ODD SDK! Check out the blog post and demo videos, or visit the GitHub repo to get started today!

Next week we're sharing the recorded Causal Islands talks on our YouTube Channel. Subscribe to be notified when they are posted!

New On The Blog

Introducing Passkey Support

WebAuthn API, or Passkeys, is a user authentication system that replaces passwords using cryptography. In this post, we dive into how passkeys work, how they differ from other passwordless options, and why developers should consider building passwordless apps in the first place.

Fission's Origin Story

This is the story of how we built an open source company that specializes in developing the identity, data, and compute protocols for the future of the Internet.

Thank you, and we'll be back next week with more exciting updates!

-The Fission Team


auth0

Build a Beautiful CRUD App with Spring Boot and Angular

Learn how to build a secure CRUD app with Spring Boot and Angular. You'll use Auth0 for authentication and authorization and Cypress to verify it all works.

Indicio

HOPPER, MAKEMYTRIP, INDICIO ON DATA SECURITY WITH CHATGPT

PhocusWire The post HOPPER, MAKEMYTRIP, INDICIO ON DATA SECURITY WITH CHATGPT appeared first on Indicio.

Northern Block

The Digital Universal Credential Adaptor

Northern Block Inc. and Canadian Bank Note Ltd. partner to enhance convenience for Canadians by combining digital credentials such as Mobile Driving Licences and Verifiable Credentials using Hyperledger Aries technology. This collaboration aims to provide a more user-friendly and interoperable solution, focusing on privacy preservation and high-assurance personal identity credentials. The post The Digital Universal Credential Adaptor appeared first on Northern Block.


Note 1: Inspiration for this blog post was taken from a conversation I had with Tim Bouma about the role of digital wallets

Note 2: I made some edits on 5/26 and removed mentions of voltage, as they were not suited for the analogy.

 

An Analogy for better understanding interoperability requirements for Issuers, Verifiers and Holder Wallets

As I prepare for an upcoming trip from Canada to Europe, I’m reminded of the different power standards between continents. As usual, I will be carrying my laptop and phone, with their chargers, along with an essential universal travel adapter to keep them powered up while I’m there.

This is how it will all work so that I can seamlessly use my devices while I’m abroad:

The European electrical socket, my source of power while abroad, varies in shape and frequency from what I’m used to back home in Canada. It’s the origin of the electrical energy I need to charge my devices.

Various electrical sockets

My universal travel adapter is the unsung hero of my tech travel kit. It’s designed to connect with the European electrical socket, which differs from Canadian standards (European outlets often have two round pins, while Canadian plugs typically have two flat parallel pins with a grounding pin). 

But I don’t have to worry! My universal travel adapter will allow my devices to consume electricity from the power sockets.

Switching gears for a minute…

At Northern Block, we’ve been recently experimenting with various credential profiles and credential exchange protocols. We’ve come to understand that different use cases necessitate distinct standards and technological choices. This work has led us to think deeply about the interoperability between different parties involved in credential exchanges.

As a company, we have a strong background in the Hyperledger Aries and AnonCreds world, a topic on which we’ve shared some writings (here and here). We have also contributed towards multiple open source Aries RFCs. 

Recently, through collaboration with some partners of ours in the digital identity community, we’ve been broadening our scope of support. This includes exchanging various credential profiles by supporting new specifications like OpenID4VC, both for issuances and presentations. We are at varying stages in our implementations of both these protocols. Further, in collaboration with our partner, Canadian Bank Note, we’ve recently been working on demonstrating interoperability between DIDComm-based RFCs and the Mobile Driver’s Licence standard. We’ve written about the motivation and reasoning behind our mDL interoperability work with them here.

Now…

Keeping all this in mind, let’s use the example of electrical sockets, my devices, and universal travel adapter as a comparison to understand how different parts interact in digital credential exchanges. This comparison also helps us understand what interoperability means for everyone involved.


Issuers

In this analogy, the European electrical socket is akin to an issuer of digital credentials. 

Issuers generate and provide signed data that they’re authorized to issue, which can come in various formats.

Issuers will issue credentials in their chosen format without being forced to adhere to a specific standard, similar to the electrical socket in a specific country, which provides power according to its local standards without needing to conform to the formats used elsewhere. One issuer may pick AnonCreds and another may pick JSON-LD to issue a similar credential.

Each issuer’s specific use case plays a significant role in the choice of technology. For instance, a transportation ministry issuing mobile driving licences would have different needs and considerations than an issuer of organizational-related claims. Each issuer’s primary intended use case shapes their unique needs (e.g., importance of authenticity, confidentiality, privacy, security) and requirements, and thus, they select the technology and format that best aligns with these needs. Similar to the way my electrical socket in Canada follows its own standard, issuers will issue credentials in their chosen format, based on their main use case, without being forced to conform to a uniform standard.

The issuer’s role is considered more streamlined because they can maintain a consistent issuance method and format that doesn’t need to adapt to the formats used by other issuers. This is much like how an electrical socket in a specific country, such as Canada, maintains its own consistent standard, without needing to adapt to the differing standards of sockets in other countries, like those in Europe.


Verifiers

Next, my devices – my laptop and phone – represent the verifier in the digital credentialing scenario. They need the credentials in a form that they can understand and accept, much like how they need power through a specific power adapter.

The verifier, based on its business and other requirements, decides what type of credential format the holder needs to provide – it places restrictions.

Restrictions are independent of how the issuer issued a credential, similar to how devices like laptops or phones require specific types of electrical input from a power socket. The power source or the power socket doesn’t determine the power requirements of the device. Rather, the device itself dictates what kind of power it needs, independent of the supply available from the socket.

When the wallet can provide the credential that fits the verifier’s requirements, it’s like my devices perfectly fitting into an electrical socket with a universal adapter. Everything works seamlessly.

However, if the wallet can’t meet the restrictions, the verifier must decide if it should adjust its requirements. This is similar to the scenario where my phone or laptop plug doesn’t fit the socket. Here, I would need to adjust or use a different adapter.

When the holder presents credential proofs, the verifier checks whether they conform to its restrictions, much as a device plugged into an incompatible socket won’t work properly and could even get damaged. Similarly, a verifier ensures the presented credentials meet its criteria for a secure and valid transaction.


Holders/Wallets

If we assume issuers will typically pick one issuance method for their credentials, and verifiers may place restrictions for what they deem acceptable to be presented based on their context, then there needs to be something that facilitates this interoperability, or conversion.

The role of my universal travel adapter mirrors that of the digital identity wallet in this scenario. Just as the adapter transforms any incoming power into a form my devices can use, the digital wallet receives these varying credentials from the issuer, manages them, and converts them into acceptable formats when requested, ensuring that none of the original credential’s attributes or values are lost in the conversion process.

The holder’s wallet has the most complex role in this ecosystem. It needs to be able to receive, store, and manage credentials from multiple issuers, each potentially using a different credential format, likened to the role of my universal travel adapter, converting ‘power’ from various formats into a form that the device can use.

This allows the holder to present these credentials to verifiers in a format they can understand and accept, much like how my universal travel adapter enables me to plug my Canadian devices into European sockets, converting the European power standards into a format my devices can safely use and accept. The adapter ensures compatibility between different power grids and devices without the user needing to know the specifics. This is similar to how a wallet operates.

The wallet is essential for maintaining the separation between the issuer and the verifier. It eliminates the need for coordination between every issuer and verifier, even if they use different technologies or standards. The issuer might use any format, and the verifier may require any format, but the wallet provides the interoperability that allows the verifier to distance itself from the issuer, thus preserving privacy.
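The adapter role described above can be sketched as a minimal wallet interface. The format labels and the conversion step are placeholders: real conversion must preserve every attribute and its signature semantics, which this toy version does not attempt:

```python
# A hypothetical sketch of the "universal adapter" role of a wallet:
# store credentials as issued, present them in the verifier's format.
# Format names and fields are illustrative, not real wire formats.

class Wallet:
    def __init__(self):
        self.credentials = []  # stored exactly as issued

    def store(self, credential: dict):
        self.credentials.append(credential)

    def present(self, wanted_format: str) -> list[dict]:
        """Return credentials converted to the verifier's required format."""
        out = []
        for cred in self.credentials:
            if cred["format"] == wanted_format:
                out.append(cred)                  # no conversion needed
            else:
                out.append(self._convert(cred, wanted_format))
        return out

    @staticmethod
    def _convert(cred: dict, wanted_format: str) -> dict:
        # Placeholder: only relabels the container while keeping every
        # claim; a real converter must also carry signature semantics.
        return {"format": wanted_format, "claims": dict(cred["claims"])}

wallet = Wallet()
wallet.store({"format": "anoncreds", "claims": {"age_over_18": True}})
print(wallet.present("json-ld")[0]["format"])  # json-ld
```

The point of the sketch is the separation it creates: the issuer never needs to know which format the verifier will demand, because the wallet sits in between.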


If The Analogy Is True, Then What?

I do think the analogy of a universal plug adapter aptly depicts the role of the holder’s wallet in the exchange of digital credentials. Much like the adapter, which ensures compatibility between diverse power grids and devices, a wallet manages credentials of varying formats, serving as a vital bridge between the issuer and the verifier.

Just like a plug adapter’s role, the wallet allows for seamless interoperability, shielding the verifier from the specifics of the issuer’s credential format. This operational separation, facilitated by wallets, grants scalability to the system, removing the need for direct coordination between all issuers and verifiers, regardless of the technologies or standards they employ.

However, such conversion isn’t a straightforward process…

Research is required to understand and safeguard the credential’s attributes and values during the transformation. For example, if a credential loses its privacy-preserving features during conversion, we need to clearly communicate this potential loss to the user.

Moreover, the complexity of transformation depends on the credential format. Some formats, like AnonCreds, Selective Disclosure JSON Web Tokens (SD-JWT), and MSO (ISO/IEC 18013-5), offer more enriched data structures than others such as JSON-LD.

For instance, AnonCreds and SD-JWT are rich formats where every element can be individually signed, facilitating selective disclosures. However, this feature also makes them heavier to process. In contrast, converting a rich format into a simpler one like JSON-LD doesn’t pose an issue, as there’s no loss of information. But a reverse conversion is more difficult because not every claim is signed by the issuer in JSON-LD format.
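Why per-claim signing enables selective disclosure can be illustrated with salted claim digests, in the spirit of SD-JWT. This is a deliberate simplification that omits real signatures and the actual SD-JWT encoding:

```python
import hashlib
import json
import os

# Simplified illustration of SD-JWT-style selective disclosure: the issuer
# signs only salted digests of each claim, so the holder can later reveal
# any subset without invalidating the rest.

def digest(salt: bytes, name: str, value) -> str:
    payload = json.dumps([salt.hex(), name, value]).encode()
    return hashlib.sha256(payload).hexdigest()

claims = {"name": "Alice", "age_over_18": True}
disclosures = {k: (os.urandom(16), v) for k, v in claims.items()}

# The issuer would sign this set of digests (signature omitted here):
signed_digests = {digest(s, k, v) for k, (s, v) in disclosures.items()}

# The holder discloses only one claim; the verifier recomputes its digest
# and checks it against the issuer-signed set:
salt, value = disclosures["age_over_18"]
assert digest(salt, "age_over_18", value) in signed_digests
print("age_over_18 verified without revealing 'name'")
```

A plain signature over the whole document, by contrast, gives the verifier nothing to check unless every claim is revealed, which is the conversion asymmetry noted above.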

I hope this further underscores the need for in-depth research to understand the dynamics of different credential formats and to develop robust interoperability frameworks.

While we allow the standards to evolve naturally, we must focus on ensuring these frameworks can accommodate and seamlessly convert between all these formats without compromising the credential’s integrity or the issuer’s integrity.

Interoperability is the key here, and that’s where the research should be directed. We should let the standards evolve as they are, but there must be an effort to develop interoperability frameworks.

We can take inspiration from the hardware industry, where plug and play really works. We need to make our systems truly interoperable, just like hardware devices.

Proposed Calls to Action

Advance Interoperability Research: Interoperability, or the ability of different systems or components to work together, is a key concern in the realm of digital credentials. Currently, there are multiple protocols and formats, such as DIDComm, OpenID4VC protocols, and MDL formats. The need for research in this area is to understand how these different formats can interact and be converted without loss of integrity, information or functionality. This would allow for smoother interactions between holders and verifiers, regardless of their chosen credential format.

Improve Wallet Versatility: Digital wallets play a crucial role as they need to support and manage credentials in multiple formats. Just as a universal travel adapter can accept plugs from different countries and convert the power into a usable form, the digital wallet needs to be versatile enough to accept, convert, and manage various credential formats. This is a challenging aspect that requires ongoing research and development.

Analyze Credential Format Conversion: As different credential formats might prioritize different aspects (e.g., privacy, security, usability), it’s essential to understand what might be lost or gained during the conversion process. This involves deep technical analysis and understanding of the different credential protocols.

The post The Digital Universal Credential Adaptor appeared first on Northern Block | Self Sovereign Identity Solution Provider.


UbiSecure

How to Safeguard Organisational Data with Employee Access Management

Employee Access Management is the use of identity and access management (IAM) to provide employees with secure access to organisational systems and... The post How to Safeguard Organisational Data with Employee Access Management appeared first on Ubisecure Customer Identity Management.

Employee Access Management is the use of identity and access management (IAM) to provide employees with secure access to organisational systems and resources. It is also known as Workforce Access Management. These employee access management systems ensure that the right people have (only) the necessary access to the tools and resources needed to complete their jobs. Access levels are usually based on job roles or management levels within the company.

Importance of Employee Access Management

Secure and effective employee access management is critical to safeguarding internal and sensitive organisational information. It helps to mitigate information security risks, protect data and ensure regulatory compliance.

With the significant increase in hybrid and remote working since the pandemic, employee access management systems enable employees to work from anywhere while still accessing applications securely. When remote employees access systems without an appropriate access management system in place, vulnerabilities are left in the system for hackers to find and exploit. This can lead to data breaches, which have severe financial consequences and damage the organisation's reputation.

The introduction of the General Data Protection Regulation (GDPR) and other regulations meant that companies needed to re-assess data access internally. When all employees have access to sensitive information, including where it is not necessary to their role, it increases the risk of (accidental or purposeful) misuse of data. Implementing employee access management systems that allow access to sensitive information on an individual basis, or based on attributes like job roles, helps to reduce this risk. This is known as the Principle of Least Privilege.
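The Principle of Least Privilege described above can be illustrated with a minimal role-based access check. The role names and permissions here are invented for the example, not taken from any real product.

```python
# Minimal sketch of role-based access under the Principle of Least
# Privilege: each role grants only the permissions needed for that job,
# and access is denied unless a role explicitly includes the permission.

ROLE_PERMISSIONS = {
    "support_agent": {"read:tickets", "write:tickets"},
    "hr_manager": {"read:tickets", "read:employee_records", "write:employee_records"},
}

def is_allowed(roles, permission):
    """Grant access only if one of the user's roles includes the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set()) for role in roles)
```

Note the deny-by-default shape: an unknown role or an unlisted permission yields no access, which is exactly the posture least privilege asks for.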

Jenny Radcliffe, aka The People Hacker, joined the Let's Talk About Digital Identity podcast to discuss social engineering and the importance of protecting your organisation. In the episode, Educate your staff or get hacked. Stories from a Social Engineer, Jenny discusses how hackers find and exploit vulnerabilities within organisations' systems, and how identity products can improve the barriers to this happening. However, Jenny also touches on the importance of User Experience (UX) within these systems: while the software improves security, it should not be too complicated or make tasks more difficult.

Understanding Employee Access Management Requirements

When investing in access management systems, the software must support all the requirements of the organisation. An evaluation of your access needs can help to build a clear picture of your solution requirements. Factors to consider include:

Access requirements: user management, remote working, single sign-on.
Increased security: authentication needs, including multi-factor authentication (MFA), and authorisation.
Improved productivity: reduced friction for remote workers, self-service credentials.

These are just a few factors that may affect your solution requirements. Learn more about these system options on our Workforce Identity & Access Management page.

Identity and Access Management (IAM) systems provide a centralised platform to manage access, authenticate users, and support connection to identity standards applications. Additional authentication, such as MFA, adds an extra layer of security by requesting that employees provide additional verification factors beyond passwords. Privileged access management (PAM) tools enable organisations to monitor and control privileged accounts, minimising the risk of insider threats.
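As an illustration of the MFA layer mentioned above, time-based one-time passwords (TOTP, RFC 6238) are a common second verification factor. A minimal sketch using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret `12345678901234567890` (as ASCII bytes) and a timestamp of 59 seconds, this yields the documented code 287082. A server compares the user-supplied code against the current (and usually adjacent) time steps.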

Balancing Security and Usability

A key factor when looking to implement employee access management is getting the ideal balance between security and user experience. Strict access controls help to withstand external risks and comply with regulations, but if your employee access management solution is overly complex it can hinder productivity. Finding a user friendly and seamless solution that offers timely support for issues is fundamental for finding the balance between security and usability.

IAM software that is also developed to provide services for external users, aka Customer IAM (CIAM), ensures such features are particularly user friendly and can cater for all remote working access scenarios as well.

This is where Ubisecure can help, with our Employee Access Management solutions tailored to suit your company’s access requirements. While placing importance on robust security, we also understand the need for user friendly solutions, simplifying complex identity and access requirements for workforce, B2C and B2B interactions. Contact us to find out more about Employee Access Management.

The post How to Safeguard Organisational Data with Employee Access Management appeared first on Ubisecure Customer Identity Management.


Northern Block

Digital Trust in the Age of Generative AI (with Wenjing Chu)

🎧   Listen to this Episode On Spotify 🎧   Listen to this Episode On Apple Podcasts About Podcast Episode When I recorded this podcast with Wenjing, I was initially hesitant to delve too deeply into the current state of generative AI. The pace at which this field is evolving is astounding, with significant advancements appearing […] The post Digital Trust in the Age of Generative AI (with Wenjing Chu) appeared first on Northern Block | Self Sovereign Identity Solution Provider.

🎧   Listen to this Episode On Spotify
🎧   Listen to this Episode On Apple Podcasts

About Podcast Episode

When I recorded this podcast with Wenjing, I was initially hesitant to delve too deeply into the current state of generative AI. The pace at which this field is evolving is astounding, with significant advancements appearing nearly every week. Even in the past couple of weeks since we recorded this podcast, numerous new products have been released. A notable example is Google’s generative AI chat product, Google Bard, which is rapidly gaining recognition as a strong competitor to ChatGPT.

Generative artificial intelligence (AI) describes algorithms (such as ChatGPT) that can be used to create new content, including audio, code, images, text, simulations, and videos.

For that reason, in our conversation I aimed to focus on more overarching “evergreen” topics that would remain relevant despite the rapid progression of these Generative AI products. I wanted this discussion to retain its value not only for this week but for time to come.

The inspiration for this conversation was first sparked by my personal experience with these AI tools. Further, at the Internet Identity Workshop (IIW) in April, I attended a session hosted by Wenjing on “Digital Trust in the Age of AI.” This session triggered many thoughts for me about the impact of AI on digital identity and trust, as well as the intersection between advancements in digital trust and the benefits and potential risks of AI. We both decided it would be interesting to have a longer discussion in a podcast format!

We began our conversation with the parable of the blind men and the elephant. In this story, a group of blind men encounter an elephant for the first time. Each touches a different part of the elephant and describes it based on their limited, individual experience. This leads to disagreements about the elephant’s true nature, illustrating how a singular perspective can limit one’s understanding of a larger concept.

Image taken from Sophia Tepe’s “The Blind Men and the Elephant” blog post on Medium.

This parable is relevant when considering artificial intelligence (AI). AI is a complex, multifaceted field. Individuals often form opinions on it based on their personal experiences and knowledge, potentially resulting in a limited or skewed understanding, much like the blind men’s perception of the elephant.

For this, it is important that we have more nuanced discussions about the perceived positives and negatives of such a disruptive new technology.

Some of the topics discussed between Wenjing and me in this podcast conversation include:

 

Exponential Data Growth and AI Systems: Discussion centered around how the volume of data, particularly from AI systems like GPT-4, is exponentially increasing. As AI starts generating more content, this could create a feedback loop leading to astronomical levels of content.
Interaction with LLM as Protocol: The question arose whether the pre-trained Language Model (LLM) could be equated to a protocol and if individuals might interact directly with the LLM in the future.
Digital Identity and Trust in the Age of AI and Deepfakes: Concerns were discussed regarding the rise of AI and deepfakes, particularly their implications for digital identity and trust. The challenges of bi-directional authentication and the potential risk to content-based authentication methods were highlighted.
Future of Digital Trust Protocols and Authentication: The potential of AI to generate content was related to the future of digital trust protocols. The necessity of digital signatures for authentication was suggested as a possible direction for the future.
Reframing Identity: A broader understanding of identity was proposed, questioning whether a reframing of identity could influence our understanding of concepts like authentication.
Trust in AI Bots vs. Humans: Personal observations on the level of trust in AI bots vs. humans were shared, suggesting a quicker formation of trust with bots. The implications for future human-bot relationships were considered.
Potential Risks and Benefits of Technological Advancements: The discussion acknowledged both potential risks and substantial benefits of technological advancements. A greater level of trust in bots due to perceived lesser risk was noted, along with a significant move towards open source models in the digital trust space.
Open-source Model for LLMs and Other Systems: The pros and cons of adopting an open-source model for Large Language Models (LLMs) and other complex systems were questioned.

 

About Guest

Wenjing is a senior director of technology strategy at Futurewei leading initiatives focused on trust in the future of computing. His long career encompasses early Internet Routing development, optical Internet backbones, security operating system, Wi-Fi and 5G mobile networks, cloud native services and responsible artificial intelligence.

He is a founding Steering Committee member of the Trust over IP Foundation. He contributed as the primary author of the Trust over IP Technology Architecture specification, in which he articulated the layered approach to decomposing the trust protocol stack and defined the core requirements of the trust spanning layer. Following that work, he is currently a co-Chair of the Trust Spanning Protocol task force, proposing the Inter-Trust Domain Protocol (ITDP) as the trust spanning protocol bridging different trust domains across the Internet. He is also a co-Chair of the AI and Metaverse task force, currently drafting the white paper "Digital Trust in the Age of Generative AI".

Wenjing is a founding Board Member of the newly launched OpenWallet Foundation with a mission to enable a trusted digital future with interoperability for a wide range of wallet use cases and also serves in its Technical Advisory Council (TAC). He is a strong advocate of  human-centric digital trust as a foundation to responsible deployment of advanced artificial intelligence technologies.


Where to find Wenjing ➡️ LinkedIn: https://www.linkedin.com/in/wenjingchu/

The post Digital Trust in the Age of Generative AI (with Wenjing Chu) appeared first on Northern Block | Self Sovereign Identity Solution Provider.


IDnow

iGaming Ontario – a promising market for European operators?

With a gaming regulatory framework that has more similarities with Europe than other Canadian provinces, is the Ontario market the natural step for European operators to expand to? Ever since April 2022, when Ontario opened the iGaming market to private gaming operators, the province’s regulatory framework has been highly lauded and well-regarded in the international […]
With a gaming regulatory framework that has more similarities with Europe than other Canadian provinces, is the Ontario market the natural step for European operators to expand to?

Ever since April 2022, when Ontario opened the iGaming market to private gaming operators, the province’s regulatory framework has been highly lauded and well-regarded in the international gaming community.

Managed by iGaming Ontario (iGO) and the Alcohol and Gaming Commission of Ontario (AGCO), there is no limit to the number of private gaming operators that can compete in the province.

There are currently 45 regulated operators in Ontario that comply with Ontario’s high standards of game integrity and player protections. The AGCO has approved over 5,000 certified games for use in the province in the last year alone.

According to a study conducted in March 2023, over 85% of respondents who had gambled in Ontario in 2023 had done so via regulated sites. This is an impressive majority, especially considering that around 70% of all online gambling took place via unregulated sites before the launch of the open market.

Since its launch in April 2022, Ontario’s igaming market has displaced the existing unregulated market and made Ontario an internationally recognized leader in this industry. We are truly proud of this strong, responsible, competitive online gaming model.

Doug Downey, Attorney General of Ontario.

One of the most pivotal steps in Ontario's journey toward becoming a profitable and attractive gaming market came when the C-218 bill was passed by Ontario lawmakers in 2021. C-218 allows players to make single bets, rather than the previously allowed combination-only betting. This adds flexibility and opportunity for both operators and bettors, as single bets make up a sizeable portion of gambling activity and are among the simplest to both place and offer.

Ontario was quick to act following the passing of C-218, putting out detailed guidelines for operators interested in entering the market, while also working alongside a range of industry leaders to establish responsible regulations.

For more information, read our ‘How to become compliant in Ontario’s iGaming market’ blog.

Advantages for gambling operators doing business in Ontario:

Relatively unrestricted marketing opportunities
Young untapped market
Gateway to the rest of Canada
Only province in Canada to have legalized both sports betting and online casinos, which is rare in North America
No cap on the number of licences given to operators, allowing for healthy competition
iGaming sites are not tethered to land based sites
Tax rate is very competitive (around 20%)
Operators can operate out of Malta, the UK and other countries

Outlook on Ontario's online gambling market.

VIXIO Gambling Compliance forecasts have predicted that Ontario’s gambling market will see rapid growth in the short term, with a forecasted revenue of $1.5 billion by 2026. These numbers put the province as one of the largest online gambling markets in North America.  

While impressive growth is forecast, that’s not the only benefit for operators. Ontario’s market also features a very competitive annual tax rate of just 20% – one of the lowest in the world.  

Another key positive of Ontario's gaming sector is its European-like approach to gambling regulations. Unlike in some other markets, operators will not be required to cease offshore activities in other Canadian provinces once registered in Ontario.

How Ontario’s standards are raising the regulatory bar.

Ontario’s iGaming market is also raising the bar when it comes to regulatory standards. Due to an extensive list of rules and regulations, it is considered a complex market to enter, and features a core focus on customer protection.  

iGaming Ontario and the AGCO have set out a wide range of rules and regulations that need to be followed to enter the market. This goes way beyond standard Know Your Customer (KYC) and Anti-Money Laundering (AML) standards. All users and operators are also required to adhere to the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (PCMLTFA), which is enforced by the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC).

Guidelines include requiring operators to report suspicious transactions and prohibited players, and to provide a robust KYC process with an onboarding stage that includes checking official documents and recent photos, as well as liveness. Read the full details of AGCO's Registrar's Standards for Gaming for more information.

The future of Ontario’s iGaming market looks very promising. Not only from a revenue standpoint, but also with regard to its high level of customer protection and regulations. If the market proves to succeed as it is forecast, it shows that stepping up security and customer protection efforts in order to become regulated, is definitely worth the effort. This is especially important in modern times, with regulators all over the globe pushing for stricter gambling rules and regulations.  
 

Why Ontario's iGaming model will be an industry game changer. Discover how IDnow can help you tackle Ontario's complex iGaming market. Download now

What's next for the Canadian iGaming market?

Due to the success of the Ontario model, it's highly likely the future of Canada's iGaming market will see more provinces open up their markets to private gaming operators. In fact, it's probable that finance ministers elsewhere are already looking at Ontario and its increased revenue streams from the regulated gaming sector, and considering how their provinces could launch their own regulatory regimes for the iGaming market. The good news for operators that are registered and regulated in Ontario is that when other provinces finally open up their markets, regulations are likely to be quite similar, which will give them a head start on regulation processes.

How European gaming operators can enter Ontario’s iGaming market.

All operators are required to register directly with the AGCO and will provide their services through a commercial partnership with iGO. This means gaming companies work as private operators on behalf of Ontario's governing body.

In order to enter the market, operators must go through an official registration process, which begins with submitting an application through the iAGCO portal. Operators must submit details of gaming-related goods and services, gaming website information, and branding.

There are also comprehensive entity and personal disclosure stages, where the registrant, as well as associated individuals like parent companies and individuals holding key management or operational roles, need to submit details.  

There is, however, more to it than just going through that initial application process. In the long run, operators will need to be fully compliant with Ontario’s evolving rules and regulations.
 

IDnow’s role in easing entry into Ontario’s iGaming market.

Entering Ontario’s iGaming market will prove a challenge for any operator. The range and complexity of iGaming requirements mean operators must navigate a regulatory maze to reach compliance. (Read on: Becoming Compliant in Ontario’s iGaming Market)

Player onboarding requirements include:

Players must be 19+, physically based in Ontario
Politically Exposed Person (PEP) checks need to be passed
Personal information, including occupation must be collected
Third-party providers are needed for initial onboarding checks

One key issue operators may face is a substandard, non-compliant onboarding process. iGO's AML and KYC requirements, which even exceed FINTRAC standards, require detailed verification, tracking, and data storage systems.

This is where IDnow steps in with our verification solutions.

IDnow's highly configurable identity verification services work across multiple regulations, industries and use cases, including gaming. Whether automated or expert-assisted, our online identity-proofing methods have been optimized to meet the strictest security standards and regulatory requirements without compromising on customer conversion or consumer experience.

Our proprietary AI-powered fraud technology can verify users' identities in minutes, while remaining fully AML-compliant. All users need is a mobile phone, and we guide them through the rest in line with Ontario's and FINTRAC's standards, with official document, selfie, and liveness verification.

With years of experience and compliance support for over 195 countries, our automated identity verification service is a great tool not only for making entry into Ontario simpler, but also for further expansion in the ever-growing iGaming industry.

By

Roger Redfearn-Tyrzyk
Director Global Gambling & Sales UK at IDnow
Connect with Roger on LinkedIn


OWI - State of Identity

Creating a unified cross-enterprise authentication layer

Secfense Chief Technology Officer, Marcin Szary, joins host Cameron D'Ambrosi to explore the current authentication landscape. They discuss why the FIDO Alliance has been a truly transformative moment for the death of the password, how Secfense sets itself apart in a crowded and competitive landscape, and Marcin's predictions for the future.



SWN Global

Sovereign Wallet (MetaMUI) main contributor to BIS Innovation Hub Nordic Project Polaris and…

Sovereign Wallet (MetaMUI) main contributor to BIS Innovation Hub Nordic Project Polaris and offline CBDC Photo by Jeffrey Blum on Unsplash In May 2023, the BIS Innovation Hub Nordic Centre published a handbook discussing the key aspects of CBDC offline payments. […]
Sovereign Wallet (MetaMUI) main contributor to BIS Innovation Hub Nordic Project Polaris and offline CBDC

Photo by Jeffrey Blum on Unsplash

In May 2023, the BIS Innovation Hub Nordic Centre published a handbook discussing the key aspects of CBDC offline payments. Through research and surveys conducted as part of Project Polaris, the BIS showed that a large majority of central banks regard offline payments with retail CBDC as a necessity for adopting digital currencies as national legal tender. A survey conducted by the BIS Innovation Hub Nordic Centre shows that 49% of central banks consider offline payments with retail CBDC to be vital, while another 49% deem them advantageous.

Providing offline payments with CBDC is an important requirement for many central banks for reasons such as resilience, inclusion, privacy and cash resemblance. However, its implementation is complex and involves a number of technology, security and operational considerations that need to be planned and designed for at the earliest possible stages.

The handbook provides a comprehensive overview of the key aspects of offline payments with CBDC and is intended to serve as a guide for central banks considering implementing offline payments capabilities.

Among the handbook contributors, private sector technology providers were invited to discuss their solutions covering offline-payment and CBDC-related technologies. Sovereign Wallet was the very first to respond and meet with BISIH Nordic last year, presenting the MetaMUI offline-payment solution.

Sovereign Wallet's CEO Phantom Seokgu Yun and CSO Cizar Bachir Brahim gave professional insights on the requirements for a successful implementation of CBDC, which were acknowledged in the BISIH Handbook. Being recognized as the only Korean company to participate in the global project reflects the company's commitment to shaping the future of secure digital identity platforms.

This year, more than 95% of central banks around the world have been conducting a series of CBDC pilot tests in order to expand the scope of how digital currency can reshape their economies. Countries, in partnership with many well-known organizations, have showcased their successes and difficulties in implementing their CBDC initiatives for the public.

The BIS handbook is intended to guide central banks in understanding:

the latest available technologies and security measures
the risk and risk management measures available
the privacy issues, inclusion needs, and resilience options
the design and architecture principles involved
the potential operational and change management issues

Sovereign Wallet's unique solution for offline CBDC requires the sender to register an identity approved by a trusted entity such as a bank or government agent. Verifying users' identities establishes trust between participants and allows them to conduct offline transactions until connectivity is reestablished. The proposed solution utilizes the trusted identity of the sender to help mitigate risk while adhering to privacy policies set by regulatory organizations.
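As a rough illustration of this pattern (a sketch only, not Sovereign Wallet's actual protocol), an offline transaction can carry an authentication tag bound to the sender's registered device key, which the bank verifies once connectivity returns. A real deployment would use asymmetric signatures; a shared-key HMAC stands in here to keep the example standard-library-only.

```python
import hashlib
import hmac
import json

def sign_offline_tx(device_key: bytes, tx: dict) -> dict:
    """Attach a tag binding the transaction to the sender's registered key.
    (Illustrative only: real CBDC schemes would use asymmetric signatures.)"""
    payload = json.dumps(tx, sort_keys=True).encode()
    tag = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return {**tx, "tag": tag}

def verify_on_reconnect(device_key: bytes, signed_tx: dict) -> bool:
    """Bank-side check once connectivity is restored: recompute and compare the tag."""
    tx = {k: v for k, v in signed_tx.items() if k != "tag"}
    payload = json.dumps(tx, sort_keys=True).encode()
    expected = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed_tx["tag"])
```

Any tampering with the transaction while it sits offline invalidates the tag, which is the property that lets parties transact before the bank is reachable.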

Sovereign Wallet's involvement in the BIS Handbook for Offline Payments with CBDC, alongside leading global participants, positions the company as a key contributor to the advancement of secure, inclusive, and accessible CBDC solutions. By offering expert insights and sharing its practical expertise, Sovereign Wallet plays an integral role in shaping industry best practices around the potential CBDC brings in both online and offline usage. This collaboration will contribute to global efforts to create a trusted digital infrastructure that supports financial inclusion, reduces fraud, and fosters digital innovation.

Sovereign Wallet Co., Ltd. is a state-of-the-art blockchain company with a focus on digital identity solutions, verification, and finance. As founders of the fourth-generation blockchain MetaMUI, SWN Global launched its mainnet on January 3rd, 2021. The company is headquartered in Seoul, South Korea, with four companies under the name "Sovereign Wallet Network (SWN) Global" present in Singapore, Sweden, and India. By leveraging advanced technologies such as identity blockchain, Sovereign Wallet empowers individuals and organizations to take control of their digital identities, ensuring privacy, security, and interoperability.

For more information, please refer to https://www.bis.org/publ/othp64.htm or reach out to us at support@sovereignwallet.network.


Sovereign Wallet announces new logo

Sovereign Wallet Network's new logo Sovereign Wallet Network (SWN), a leading provider of secure and decentralized digital identity and CBDC solutions, has announced the release of its new company logo. The logo transitions from the former golden phoenix to an S shape representing a secure wallet. The renewed design represents the company's continued commitment to innovation and […]
Sovereign Wallet Network’s new logo

Sovereign Wallet Network (SWN), a leading provider of secure and decentralized digital identity and CBDC solutions, has announced the release of its new company logo. The logo transitions from the former golden phoenix to an S shape representing a secure wallet. The renewed design represents the company's continued commitment to innovation and reflects its vision for a more secure and decentralized digital world.

The new logo features a modern, minimalist design that embodies the company’s values of transparency, trust, and security. The logo’s design combines the elements of a wallet and the letter ‘S’ to represent ‘Sovereign’ and secure blockchain-based products and services offered.

“We are excited to introduce this new design and look forward to continuing to provide our customers with the most advanced and secure digital identity solutions on the market,” — Seokgu Yun, CEO of Sovereign Wallet Network.

The new logo will be used across all SWN branding, including its website, social media channels, and marketing materials. The company is also working on updating to a new website to reflect the new branding, which can be found at https://sovereignwallet.network/.

Sovereign Wallet Network (SWN) is a leading provider of secure and decentralized digital identity solutions. Its mission is to create a digital wallet and service that will become a ubiquitous tool to securely store and manage identity credentials and digital assets. The aim is to make this wallet a bridge that connects both the physical and digital worlds.

For more information, please contact support@sovereignwallet.network.


BlueSky

Bluesky User FAQ

Welcome to the Bluesky beta app! This is a user guide that answers some common questions.

Welcome to the Bluesky beta app! This is a user guide that answers some common questions.

For general questions about the Bluesky company, please visit our FAQ here. If you’re interested in learning more about the protocol Bluesky is built on (the AT Protocol), please refer to our protocol documentation or our protocol FAQ.

Data Privacy

What is public and what is private on Bluesky?

Bluesky is a public social network. Think of your posts as blog posts – anyone on the web can see them, even those without an invite code. An invite code simply grants access to the service we’re running that lets you publish a post yourself. (Developers familiar with the API can view all posts regardless of whether they have an account themselves.)

Specifically:

Posts and likes are public.
Blocks are public.
Mutes are private, but mutelists are public lists. Your mutelist subscriptions are private.
Invites and invite trees are private.

Why are my posts, likes, and blocks public?

The AT Protocol, which Bluesky is built on, is designed to support public conversations. To make public conversations portable across all sorts of platforms, your data is stored in data repositories that anyone can view. This means that regardless of which server you choose to join, you'll still be able to see posts across the whole network, and if you choose to change servers, you can easily take all of your data with you. This is why the user experience of Bluesky, though built on a federated protocol, feels similar to that of other social media apps you have used before.

Can I set my profile to be private?

Currently, there are no private profiles on Bluesky.

What happens when I delete a post?

After you delete a post, it will be immediately removed from the user-facing app. Any images attached to your post will be immediately deleted in our data storage too.

However, it takes a bit longer for the text content of a post to be fully deleted in storage. The text content is stored in a non-readable form, but it is possible to query the data via the API. We will periodically perform back-end deletes to entirely wipe this data.

Can I get a copy of all of my data?

Yes — the AT Protocol keeps user data in a content-addressed archive. This archive can be used to migrate account data across servers. For developers, you can use this method to export a copy of your repository. For non-devs, the tooling is still being built to make it easy.
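For developers, the export mentioned above goes through the `com.atproto.sync.getRepo` XRPC endpoint, which returns the repository as a CAR file. A minimal sketch; the PDS host and DID below are placeholders, not real accounts:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def repo_export_url(pds_host: str, did: str) -> str:
    """Build the com.atproto.sync.getRepo XRPC URL that returns the
    account's repository as a CAR (content-addressed archive) file."""
    return f"https://{pds_host}/xrpc/com.atproto.sync.getRepo?" + urlencode({"did": did})

if __name__ == "__main__":
    # Hypothetical DID; the fetch itself requires network access.
    url = repo_export_url("bsky.social", "did:plc:examplexyz")
    with urlopen(url) as resp, open("repo.car", "wb") as out:
        out.write(resp.read())
```

The resulting CAR file is the same archive format used when migrating an account between servers.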

You can read our privacy policy here.

Moderation

What does muting do?

Muting prevents you from seeing any notifications or top-level posts from an account. If they reply to a thread, you’ll see a section that says “Post from an account you muted” with an option to show the post. The account will not know that they have been muted.

What does blocking do?

Blocking prevents interaction. When you block an account, both you and the other account will no longer be able to see or interact with each other’s posts.

How do I flag abuse?

You can report posts by clicking on the three-dot menu. You can also report an entire account by visiting their profile and clicking the three-dot menu there.

Where can I read more about your plans for moderation?

We wrote about our approach to composable moderation here and we’ll continue to publish more in the near future.

Custom Feeds

What are custom feeds?

Custom feeds are a Bluesky feature that lets you pick the algorithm that powers your social media experience. Imagine you want your timeline to only be posts from your mutuals, or only posts that have cat photos, or only posts related to sports — you can simply pick your feed of choice from an open marketplace.

For users, the ability to customize their feed returns control of their attention to themselves. For developers, an open marketplace of feeds provides the freedom to experiment and publish algorithms that anyone can use.

You can read more about custom feeds and algorithmic choice in our blog post here.

How do I use custom feeds?

On Bluesky, click "My Feeds" on the left side of the app. From there, you can add and discover new feeds.

How can I create a custom feed?

Developers can use our feed generator starter kit to create a custom feed. Eventually, we will provide better tooling such that anyone, including non-developers, can build custom feeds.

Invite codes

I don’t have access to Bluesky yet. How can I get an invite code?

We regularly send out invites to our waitlist, which you can sign up for here.

Now that I have an invite code, how can I access Bluesky?

When you get an invite code, you can download Bluesky from the iOS App Store and the Google Play Store. You can also use our web app at bsky.app.

When do I get invite codes? Do they expire?

Most accounts receive invite codes after they’ve been on Bluesky for a while — currently, it’s once every two weeks. These codes do not expire, unless we perform a security upgrade on all codes. In that case, a security upgrade will change each invite code’s combination of letters, but it will not affect the number of codes that you have.

Occasionally, when we take action to moderate an account, we may revoke invite codes from that account and other accounts in the same invite tree.

Why does Bluesky use an invite code system?

The invite system helps us control the growth of our beta as we finish building the tooling for moderation and curation. It also gives us visibility into the growing social graph. We will likely continue to use invite codes as we increase access to the app; at that point, each account will receive many more invite codes to hand out.

The code we use to distribute invites is open source and can be used by other server admins to grow their own communities over time as well.

How does the Bluesky server use the invite tree?

We reference the invite tree to track behavior on the social graph. For example, if we take action to moderate an account, we may revoke invite codes from other accounts in the same invite tree.

Security

How can I reset my password?

Click “Forgot” on the sign-in screen. You will receive an email with a reset code.

What if I didn’t get a password reset email?

If you didn’t receive a password reset email, please email us at support@bsky.app.

Will you implement two-factor authentication (2FA)?

Yes, implementing 2FA is on our short-term roadmap.

Bluesky, the AT Protocol, and Federation

What’s the difference between Bluesky and the AT Protocol?

Bluesky, the public benefit company, is developing two products: the AT Protocol, and the microblogging app Bluesky. Bluesky, the app, is meant to demonstrate the features of the underlying protocol. The AT Protocol is built to support an entire ecosystem of social apps that extends beyond microblogging.

You can read more about the differences between Bluesky and the AT Protocol in our general FAQ here.

How does federation affect me, as a user of the Bluesky app?

We’re prioritizing user experience and want to make Bluesky as user-friendly as possible. Regardless of which server you join, you’ll be able to see posts from people on other servers and take your data with you if you choose to move servers.

Is Bluesky built on a blockchain? Does it use cryptocurrency?

No, and no.

Miscellaneous

How can I submit feedback?

On the mobile app, open the left side menu and click “Feedback.” On the web app, there is a link to “Send feedback” on the right side of the screen.

You can also email support@bsky.app with support requests. Please do not use this email to request an invite. Instead, sign up for the waitlist here.

What is a post on Bluesky called?

The official term is “post.”

How can I find friends or mutuals from other social networks?

Third-party developers maintain tools such as Fedifinder to find friends from other social networks.

Is there a dark mode?

Yes. You can change the display settings to be light or dark mode, or to match your system settings, via Settings > Appearance.

How can I set up my domain as my handle?

Please refer to our tutorial here.

The answers here are subject to change. We’ll update this guide regularly as we continue to release more features. Thank you for joining our beta!

Thursday, 18. May 2023

Fission

Introducing Passkey Support


Passkey support, our third passwordless developer option, has arrived for ODD SDK.

What are Passkeys?

WebAuthn API, or Passkeys, is a user authentication system that replaces passwords using cryptography. We'll dive more into passkeys, how they work, and how they differ from other passwordless options in this blog post, but first, let's review why developers should consider building passwordless apps.

The Benefits of Going Passwordless

There are several benefits to going passwordless for both users and developers.

1) Improved Security - This is a biggy! Machine learning is getting better at cracking passwords, and phishing attempts have gotten more sophisticated. Even the savviest users can fall prey to professional hackers. Data leaks jeopardize the exposed passwords and any accounts where users have re-used that password. App creators must have tight security when creating databases to store user passwords - the GDPR risks are enough to keep anyone up at night.

Passkeys cannot be guessed, and they are end-to-end encrypted. Phishing apps or websites created to steal users' keys won't work (unlike with blockchain wallet keys), because passkeys are inextricably linked to the user's device and verified by a special token (which we'll expand on shortly).

2) Better User Experience - Unless a user has a subscription to a password manager (and even those can get compromised), they have to either write down or memorize all their passwords. When they inevitably forget their password, they must go through an account recovery process that can be frustrating when the user is short on time.

Fission's Passwordless Options

App developers using ODD SDK currently have two passwordless options - WebCrypto API and WalletAuth.

WebCrypto API gives the browser fairly low-level cryptographic building blocks, which include encryption and signatures. A user's private key is stored in their browser, and no one, not even the user, can extract it. If they created their account on their laptop and want to access it on their mobile phone, they have to link their devices to do so (this is a simple process using the AWAKE protocol, also part of ODD SDK).

The upsides of WebCrypto API - it's highly secure and user-friendly. There are many tools built into it for developers to use. The downside - developers need to know what they want to build ahead of time. It's like getting a box of LEGO but no directions.

WalletAuth allows developers to integrate the Metamask browser extension wallet for user sign-in. When users connect to an ODD SDK app, they can access on-chain data as usual, as well as encrypted off-chain data storage features.

The upside - it's great for Web3 dapps and Web3 users who are comfortable with blockchain wallets and want to seamlessly work with both on and off-chain data. The downside - users who are not Web3 savvy will be very intimidated by this type of login option.

Under the Hood

At the beginning of this article, we learned that WebAuthn is a user authentication system that replaces passwords using cryptography. Instead of using a username and password, the user authenticates with a key stored on their device and signs some unique data with that key.

The unique data is divided into two parts: Client Data and Authenticator Data. Client Data includes contextual information about the WebAuthn Relying Party and the WebAuthn Client. Authenticator Data includes contextual information that derives its trustworthiness from the WebAuthn Relying Party's assessment of the security of the authenticator.

In other words, the unique data seeks to understand the relationship between the app (or specifically, the WebAuthn Relying Party who created the app) and the user (specifically, the user agent - i.e. the browser or mobile app they are using) and determines whether the app can securely verify the user. On both sides, there is a layer of abstraction to ensure privacy and trustworthiness.

After the user signs the unique data, the signature and the unique data are embedded into a Public Key Credential. This follows the FIDO standard and is a container, much like a JWT, that formats the public key, signature, and unique data in a specific way according to the spec.
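
A rough sketch of what the relying party checks during verification may help. This is simplified: the actual signature check uses the credential's public key with an asymmetric algorithm such as ES256, which is omitted here, and the challenge value below is a hypothetical placeholder. The one concrete detail it shows is that, per the WebAuthn spec, the signed payload is the authenticator data concatenated with the SHA-256 hash of the client data JSON.

```python
import hashlib
import json

def signed_payload(authenticator_data: bytes, client_data_json: bytes) -> bytes:
    """Per the WebAuthn spec, the authenticator signs
    authenticatorData || SHA-256(clientDataJSON)."""
    return authenticator_data + hashlib.sha256(client_data_json).digest()

# The client data binds the signature to this login attempt and this origin.
client_data = json.dumps({
    "type": "webauthn.get",
    "challenge": "server-issued-random-challenge",  # hypothetical value
    "origin": "https://passkeys.fission.app",
}).encode()

# Before checking the signature, the relying party parses the client data and
# confirms, among other things, that the type, challenge, and origin match.
parsed = json.loads(client_data)
assert parsed["type"] == "webauthn.get"
assert parsed["origin"] == "https://passkeys.fission.app"

payload = signed_payload(b"\x00" * 37, client_data)  # 37 bytes is the minimal authenticator data
print(len(payload))  # 69 = 37 + 32-byte hash
```

Because the origin is part of the signed data, a phishing site on a different domain cannot produce a payload the relying party will accept — which is the core of the anti-phishing property described above.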

Passkeys in Action

Fission engineer Hugo Dias has spent months diving deep into the passkeys spec, learning what works across browsers and operating systems. As a result, he's put together a passkeys library that developers can use to add passkeys support to their ODD apps.

Hugo has created two demos to demonstrate passkey syncing and sharing.

This first video shows passkeys syncing across devices with Google Password Manager. Using the Android emulator, we visit https://passkeys.fission.app (the version on Hugo's personal domain hugodias.me) and create a new user with a passkey-based login.

The user is now logged in on Android and can upload and access files.

Switching to Google Password Manager, we see that the passkey for hugodias.me has been saved and synced.

We can now switch to a desktop browser, visit the same app, and get a prompt to use a saved passkey. A pin is used to unlock.

Once logged in, we can see that the picture uploaded on mobile is accessible on a desktop web browser, synced through user WNFS files.

This second video demonstrates navigating to https://passkeys.fission.app using the Android Mobile emulator. hugo-sharing is logged in.

The website is a progressive web app that is installable on Android, so “Install app” is chosen from the Android Chrome browser menu.

Once installed, the ODD logo appears on the home screen. After launching, we see that hugo-sharing is still logged in, as it uses the same context as the website version.

Switching to the Google Photos app, the Android share function is used. Selecting the installed ODD app uploads the photo into the user’s WNFS.

The share button again uses Android native share functionality, and we can share from the installed ODD PWA app over to WhatsApp (or any other Android app).

View the GitHub source notes for more information on these features.

You can start developing with passkeys in ODD SDK today. Visit the ODD SDK Passkeys repo to get started.


SelfKey

Cyberspace Challenges: Security and Privacy

The metaverse offers boundless opportunities for learning, working, and communication, unveiling a realm of incredible possibilities. However, as an extension of the internet, it inherits the privacy and security challenges associated with it. SelfKey's decentralized solutions may be able to address these concerns and pave the way for a more secure and decentralized future.
Summary

Because of the increasing popularity of digital spaces, people have slowly started to migrate towards cyberspace. 

It’s no longer uncommon for people to have an online presence, regardless of age, gender, or social status. We now have digital identities, digital meeting rooms, digital jobs, and digital entertainment. As long as there is access to the internet and people have the necessary gadgets, they will most certainly aim to create an online presence for themselves.

Cyberspace, also known as the metaverse, is like a parallel universe of our physical world, but with additional advantages: speed, efficiency, and flexibility. And, even though at the moment it has not reached its full potential, cyberspace can open doors to countless opportunities for knowledge and growth.

In spite of these amazing benefits, though, we must wonder: how well do we truly understand the concept of the metaverse and the risks associated with it?

Cyberspace is a place filled with challenges as much as it is filled with potential and benefits. And, as individuals aiming to be part of such a complex and immersive digital space, it’s vital that we continuously educate ourselves on this topic.

This article strives to encompass as much vital information as possible about the fascinating world of the metaverse, with all the good and the bad. Additionally, we will elaborate on SelfKey’s proposed decentralized solutions, which aim to ensure the privacy and security of your digital identity as you operate in the online world.

Highlights

An Introduction to Metaverse
The Benefits of Metaverse
The Dark Side of Metaverse
How SelfKey May Improve your Metaverse Experience
Conclusions

An Introduction to Metaverse

Metaverse and humanity

In order to operate in the digital space, individuals first have to exist within it. Therefore, the introduction of digital identities came naturally with people’s need to belong on the internet.

Almost every person has had an online account, somewhere, or has been listed in some digital system. This was done either by the user themselves, or by other individuals in order to keep track of records more easily. The digital realm, therefore, has become an integral part of our daily lives, and it is something which almost every person seeks. 

Even more, advancements in modern technology have unveiled the exciting possibility of interacting with virtual objects and spaces. Venturing into the metaverse has the potential to unlock doors to new realms of knowledge. It may ignite the imagination and provide opportunities to explore aspects of the world that may otherwise be physically out of reach.

All in all, the metaverse may change the way we, as society, interact and educate ourselves. With all of this in mind, SelfKey strives to develop digital identity solutions that may enhance security and privacy as individuals begin to explore cyberspace.

Metaverse in a nutshell

The metaverse is a concept that came to be at the intersection of enhanced physical reality and persistent virtual space. It is centered around shared virtual spaces, consisting of various digital worlds, augmented reality, and the internet.

In simple terms, the metaverse represents a digital universe created and sustained by software, which is accessible through the internet. Within this virtual domain, individuals can interact with each other, as well as their virtual surroundings, in real-time.

For many years, the metaverse existed only as a theoretical concept, residing in the realm of people's imaginations. However, with the continuous advancements in technology, particularly in virtual and augmented reality, cyberspace is gradually capturing widespread attention. Although it is currently limited in scope, it has the potential to encompass endless possibilities in the future.

Depending on the context, cyberspace can serve as a parallel existence alongside the physical world, or it can strive to provide a fully immersive virtual environment. This leaves room for many opportunities for individuals to explore and share knowledge in the metaverse.

The Benefits of Metaverse

At its core, the metaverse has tremendous potential to revolutionize various aspects of society. 

Due to its capacity to serve as a versatile platform for a wide range of applications, it may completely revolutionize the way we work, study, and socialize. It could facilitate many activities, such as virtual reality gaming, education, business meetings, social networking, and even virtual tourism.

Another notable feature of the metaverse is the possibility for users to create and design avatars. This aligns with SelfKey’s idea of customizable avatars, which may enable individuals to design avatars in any ways they please. 

These avatars may enable interaction with other individuals or virtual objects in the metaverse. Furthermore, they may facilitate the creation of digital spaces that can be used for various purposes, such as virtual stores, events, and even entire cities. 

The development and adoption of the metaverse may offer many potential benefits, such as: 

Improved Connectivity
Digital Tourism
Increased Accessibility
Virtual Education
New Experiences and Entertainment
Streamlined Efficiency

Below, we will further elaborate on these advantages.

Improved Connectivity

Cyberspace allows for real-time communication and collaboration between individuals from anywhere around the world. Because physical distance is no longer an issue, individuals are able to build connections and easily collaborate on projects. 

Digital Tourism 

The metaverse may offer amazing opportunities for people to engage in virtual tourism. This means that individuals may delve into virtual recreations of amazing locations from the real, physical world. Additionally, they may be able to experience completely new, fascinating fictional realms which they’ve always dreamt of. 

Increased Accessibility

Through the power of digital tools and endless creativity, the metaverse may become accessible from anywhere around the world. As long as there are stable internet connections, and with the proper devices, anyone could be part of this immersive experience. 

This feature is particularly important, as it may facilitate participation for individuals with disabilities or mobility constraints. 

Virtual Education

The internet has provided access to information which surely must’ve made learning much more accessible than ever before. However, cyberspace can bring education to the next level by allowing the creation of immersive, virtual classrooms. 

Even more, individuals may be able to build realistic, interactive learning environments. This could prove to be incredibly helpful in terms of accessibility and education for young people all around the world.

New Experiences and Entertainment 

By utilizing the potential of cyberspace, individuals may be able to experience unique scenarios and activities which may not be practical or possible in the physical world. For instance, they could participate in fantasy-based virtual reality games and simulations. 

Streamlined Efficiency

The metaverse holds potential to streamline and enhance various business and organizational processes. This includes anything from meetings and training to customer service, and it could help teams debug and solve particular technical issues much faster.

All in all, cyberspace represents a transformative landscape with far-reaching implications. It could potentially reshape the way we interact, learn, and engage with the world around us, and beyond. In spite of all these wonderful benefits, though, there are downsides to the metaverse which are worth mentioning.

The Dark Side of Metaverse

As stated above, cyberspace holds immense potential and possibilities. However, alongside these exciting prospects, it is crucial to acknowledge the challenges which come with this cutting-edge technology. 

While the metaverse indeed raises concerns about health and social isolation, among other crucial factors, we will narrow our discussion to two primary aspects: 

Privacy Issues
Identity Theft and Hacking

Privacy Issues

The metaverse relies on the utilization of Virtual Reality (VR) technology, which, like any cutting-edge innovation, presents its own set of challenges. As an extension of the internet, cyberspace confronts similar digital privacy and security issues, but perhaps at a more concerning level.

VR systems may collect user data, encompassing personal information and behavioral data, with the intention of enhancing the user experience. However, the collection and handling of such data carry inherent risks, including the potential for privacy breaches or unauthorized access.

Moreover, when engaging with VR applications and services developed by third parties, users should be aware that these entities may gather personal data for their own purposes, raising further privacy considerations.

Safeguarding this data is of utmost importance, as mishandling could expose individuals to privacy breaches, compromising their personal information and leaving them susceptible to identity theft or misuse.

Identity Theft and Hacking

The utilization of VR within the metaverse presents a vulnerable target for identity theft. With the aid of various machine learning algorithms, it may be relatively easy to manipulate sounds and visuals to a degree where they appear genuine.

Furthermore, attackers can exploit virtual reality platforms by introducing deceptive features aimed at extracting personal information from users. This susceptibility, similar to augmented reality, opens the door to ransomware attacks, where hackers compromise platforms and then demand a ransom.

Machine learning technologies also facilitate the manipulation of voices and videos while preserving their realism. If a hacker manages to access motion detection data from a VR headset, they could potentially employ it to generate a digital replica, commonly known as a deepfake.

A deepfake could then be used by malicious players to impersonate a user, or to try and access private accounts and systems in their name. While that is indeed a terrifying scenario, there are several decentralized solutions which may prevent such disasters.

How SelfKey May Improve your Metaverse Experience

Given the multitude of benefits cyberspace offers, it comes as no surprise that malicious actors seek to exploit vulnerabilities within it. 

With these facts in mind, SelfKey proposes several decentralized solutions which may enhance security and prevent identity theft and other privacy concerns. These methods put a strong emphasis on privacy and security. The end goal is to empower individuals to confidently navigate the digital world.

SelfKey iD

An innovative online identity verification method, SelfKey iD was designed with the aim to safeguard user privacy and security. By utilizing ZK filters and AI-Powered Proof of Individuality, it strives to prevent unauthorized access to personal online accounts and systems by bots or identity thieves. 

The purpose of ZK filters is to limit the amount of personal information shared when individuals interact with and engage with other individuals in digital spaces. If the shared data is limited, it may be insufficient for bad players to compromise. And this may enhance overall security and privacy.

AI-Powered Proof of Individuality is utilized with the aim to deter identity thieves from accessing a user’s online accounts with the latter’s stolen digital information. Thanks to AI’s ability to detect duplicates or fake, AI-generated images, SelfKey iD may be able to combat and prevent these attempts. 

This way, individuals may be able to enjoy the wonders of metaverse with confidence. Even more, they may ensure the privacy and security of their online persona.

SelfKey's Customizable NFTs

In the pursuit of a safer and decentralized future, SelfKey recognizes the importance of individuality. To this end, SelfKey may offer the means for individuals to embrace their uniqueness and make a mark in the metaverse through customization.

Through SelfKey's Customizable NFTs, also known as Normies, users can attach these digital assets to their SelfKey iD. And this may allow them to express their originality and gain access to exclusive features.

Normies are composed of Base NFTs and Wearables, offering users countless combinations and possibilities for customization. This provides users with an ideal opportunity to create digital representations that truly reflect their own identities.

SelfKey's ENS subdomains

Last, but not least, SelfKey's ENS subdomains strive to offer equal opportunities for individuals to own and personalize their domains. 

The cost-effectiveness of these subdomains aims to make them affordable for everyone, so that each person has the opportunity to own their own digital presence.

By implementing these online identity security solutions, and aiming to create a secure digital environment, SelfKey is striving to contribute to a much safer digital future. The potential of cyberspace and modern technology is immense, and it’s worth being used for the good of the people. 

Conclusions

Every day, technology reaches new heights, unlocking doors to unimaginable opportunities. However, like any innovation, there are those who will exploit it for their own gain, regardless of the harm caused. 

Fortunately, with the right tools and knowledge, we may be able to prevent such occurrences.

The metaverse holds the power to revolutionize society and provide virtual access to knowledge like never before. But, it is essential to be aware of the accompanying risks. This way, we may navigate these remarkable digital realms safely.

SelfKey’s primary focus revolves around security and privacy. And every individual deserves these fundamental rights. People shouldn't have to risk their privacy and security in the pursuit of convenience or the use of modern-day technology. 

By prioritizing security and privacy, SelfKey strives to empower individuals to embrace technology without sacrificing the safety of their digital identity.

Stay up to date with SelfKey on Discord, Telegram, and Subscribe to the official SelfKey Newsletter to receive new information!


Finicity

Loanspark Continues Its Work with Mastercard, Middesk, and LexisNexis to Facilitate a Smoother Lending Process


Loanspark partnered with world-leading tech brands Mastercard, Middesk, and LexisNexis to enhance, speed up, and secure service delivery for its co-branded partners and their business customers.

A partnership with Mastercard enables Loanspark to leverage Mastercard’s open banking platform, with some services delivered through its subsidiary, Finicity, allowing businesses to establish direct consumer-permissioned connections with their customers’ bank accounts. This enables SMB owners to submit financial information securely and easily while focusing on running their business, and in turn allows Loanspark to make better credit decisions by quickly verifying the borrower’s financial details. Accurate borrower information minimizes lending risk and increases the accuracy and speed of funding for SMBs.

“Small businesses are increasingly looking for greater choice in how they borrow, pay and manage their finances. Partnering with organizations like Loanspark provides small businesses with a streamlined process to gain access to capital and putting their financial worries at ease.”

Andy Sheehan, EVP, US Open Banking, Mastercard. 

Read more about this partnership here.

The post Loanspark Continues Its Work with Mastercard, Middesk, and LexisNexis to Facilitate a Smoother Lending Process appeared first on Finicity.


Entrust

Breaking Down Decentralized Identity and Know Your Customer


The global decentralized identity market was valued at $285 million in 2022 and is projected to grow to $6.82 billion by 2027, at a compound annual growth rate (CAGR) of 88.7% during the forecast period, according to a new report by MarketsandMarkets™. This exponential growth is due to increased security breaches, identity-related fraud incidents, the inefficiency of existing identity management practices, and the lack of end-user control over identity usage.

In today’s highly connected and digital world, secure customer interactions have become a top concern for individuals, organizations, and governments. With the increasing use of digital services, the need for secure and reliable identity verification processes has become more pressing. This is where decentralized identity and Know Your Customer (KYC) processes come into play.

What does decentralized identity mean?

Decentralized identity is a system of identity management based on decentralized and distributed technologies, such as blockchain and peer-to-peer networks. This system, in theory, will provide a digital identity that is not controlled by any central authority, such as a government or corporation. Instead, it is managed by the individual who owns it. This allows the individual to then share only the necessary aspects of their identity to validate themselves in order to interact with different organizations and services online; once the transaction is completed, they can retract their personal identity information if they wish to. Decentralized identity is not yet globally viable, but there are many organizations and governments working to make it real.

What does Know Your Customer (KYC) mean?

KYC is a process used by financial institutions and other organizations to verify the identity of their clients. This is done to comply with anti-money laundering (AML) and counter-terrorism financing (CTF) regulations, as well as to protect against fraud and other financial crimes.

The traditional KYC process involves collecting and storing large amounts of personal information about clients, including their name, date of birth, address, and government-issued ID. This information is often stored in centralized databases that can be attacked by bad actors, making it vulnerable to data breaches and other security threats.

Decentralized KYC processes have the potential to revolutionize the way personal information is managed and verified. By using decentralized identity, individuals can store their identity information in a secure digital wallet that is owned and controlled by them, sharing only parts of their identity as and when needed.
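
The "share only parts of their identity" idea can be sketched with a common selective-disclosure pattern: salted hashes of individual claims. This is an illustration of the general technique (similar in spirit to SD-JWT-style designs), not any specific vendor's implementation, and the issuer's signature over the digests is omitted.

```python
import hashlib
import secrets

# Issuance: each claim is salted and hashed; the verifier receives only the
# digests up front (in practice, signed by the issuer -- omitted in this sketch).
claims = {"name": "Alice Example", "dob": "1990-01-01", "address": "123 Main St"}

salts = {k: secrets.token_hex(16) for k in claims}
digests = {
    k: hashlib.sha256((salts[k] + ":" + v).encode()).hexdigest()
    for k, v in claims.items()
}

def disclose(key: str) -> tuple[str, str]:
    """The holder reveals one claim plus its salt -- and nothing else."""
    return salts[key], claims[key]

def verify(key: str, salt: str, value: str) -> bool:
    """The verifier checks the disclosed claim against the committed digest."""
    return hashlib.sha256((salt + ":" + value).encode()).hexdigest() == digests[key]

salt, value = disclose("dob")
print(verify("dob", salt, value))  # True
```

The per-claim salt matters: without it, a verifier could brute-force low-entropy fields such as a date of birth directly from the digests.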

How does decentralized identity benefit organizations?

By adopting decentralized identity, organizations can achieve:

Enhanced security

One of the most significant benefits is enhanced security. Decentralized identity systems store identity data on a blockchain, which is a tamper-proof and secure distributed ledger. The decentralized nature of the blockchain means that identity data is not stored in a central location, which reduces the risk of data breaches and identity theft. This is a significant benefit for organizations that store sensitive personal information, such as financial institutions, healthcare providers, and government agencies.

Streamlined identity verification processes

In traditional identity verification systems, individuals often have to repeat the same verification process with multiple organizations. This is time-consuming and can be frustrating. With decentralized identity, individuals can share verified identity information with multiple organizations without having to repeat the verification process. This not only saves time, but also reduces the risk of errors and inconsistencies in identity information.

Reduced costs

With decentralized identity, organizations no longer need to store, manage, and verify identity data themselves. This can be particularly beneficial for smaller organizations that do not have the resources to build and maintain their own identity verification systems. It can also reduce the costs associated with fraud prevention and compliance, since decentralized identity provides a secure way to authenticate identities.

Improved user experience: seamless onboarding

Instead of having to fill out long and complicated forms, users can quickly and easily share their decentralized identity with a business, which speeds up account opening. This can improve the overall customer experience and reduce the likelihood of users abandoning the onboarding process.

To summarize, decentralized identity and KYC are two important concepts that are changing the way we think about online identity and security. By allowing individuals to control and manage their own digital identities, and by decentralizing the KYC process, we can create a more secure, transparent, and efficient online ecosystem. While these technologies are still in the early stages of development, they hold great promise for the future of digital identity and security.

Contact our experts to discuss your organization’s needs.

The post Breaking Down Decentralized Identity and Know Your Customer appeared first on Entrust Blog.


Ocean Protocol

DF37 Completes and DF38 Launches

Stakers can claim DF37 rewards. DF38 runs May 18 — May 25, 2023.

1. Overview

Data Farming Round 38 is here (DF38).

DF38 is the 10th week of DF Main, the final phase of DF. This week, users can earn rewards up to 150K OCEAN. In DF Main, weekly rewards will grow to 1M+ OCEAN.

The article “Ocean Data Farming Main is Here” has the full details of DF Main. In fact, it’s a self-contained description of Ocean Data Farming (DF), including all the details that matter. It is up-to-date with the latest reward function, weekly OCEAN allocation, and estimates of APYs given the current amount of OCEAN staked.

DF is like DeFi liquidity mining or yield farming, but is tuned to drive data consume volume (DCV) in the Ocean ecosystem. It rewards stakers with OCEAN who allocate voting power to curate data assets with high DCV.

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF dapp.

DF37 counting started at 12:01am May 11, 2023 and ended at 12:01am May 18. You can claim DF37 rewards at the DF dapp Claim Portal.

DF38 is live and will conclude on May 25, 2023.

DF Round 38 (DF38) is the 10th week of DF Main. Details of DF Main can be found here.

The rest of this post describes how to claim rewards (section 2) and gives a DF38 overview (section 3).

2. How To Claim Rewards

As a participant, follow these steps to claim rewards:

1. Go to the DF dapp Claim Portal
2. Connect your wallet (Passive and Active Rewards are distributed on Ethereum mainnet)
3. Click “Claim”, sign the tx, and collect your rewards

Rewards accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.

3. DF38 Overview

DF38 is part of DF Main, phase 1. This phase emits 150K OCEAN / week and runs for 52 weeks total. (A detailed DF Main schedule is here.)

Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

Some key parameters:

Total budget is 150,000 OCEAN.
50% of the budget goes to passive rewards (75,000 OCEAN) — rewarding users who hold veOCEAN (locked OCEAN).
50% of the budget goes to active rewards (75,000 OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).

Active rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: highest-DCV asset gets most OCEAN, etc. Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.
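The steps above can be sketched in code. This is a simplified illustration only: the DCV-proportional split across assets and the weekly pro-rating of the 125% APY bound are assumptions on my part, and all names are made up; the DF Main appendix has the real reward function.

```python
def active_rewards(assets, budget):
    """Toy sketch of the active-reward steps described above (not Ocean's code).

    assets: list of dicts with keys:
      'dcv'       - data consume volume for the asset
      'publisher' - name of the publishing staker
      'stakes'    - {holder: veOCEAN allocated to this asset}
    budget: OCEAN available for active rewards this week.

    ASSUMPTION: OCEAN is split across assets in proportion to DCV (the post
    only says the highest-DCV asset gets the most OCEAN), and the 125% APY
    bound is pro-rated to one week (1.25 / 52).
    """
    total_dcv = sum(a['dcv'] for a in assets) or 1.0
    payouts = {}
    for a in assets:
        # Step 1: distribute OCEAN across assets by DCV weight.
        ocean_for_asset = budget * a['dcv'] / total_dcv
        # Publisher's effective stake counts double.
        eff = {h: s * (2.0 if h == a['publisher'] else 1.0)
               for h, s in a['stakes'].items()}
        total_eff = sum(eff.values()) or 1.0
        for holder, stake in eff.items():
            baseline = (stake / total_eff) * ocean_for_asset
            apy_cap = a['stakes'][holder] * 1.25 / 52   # 125% APY, weekly
            dcv_cap = a['dcv'] * 0.001                  # 0.1% of DCV, anti wash-consume
            payouts[holder] = payouts.get(holder, 0.0) + min(baseline, apy_cap, dcv_cap)
    return payouts
```

Note how the two caps interact: with small DCV the 0.1% bound dominates (wash consume doesn't pay), while with small stake the APY bound dominates.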

For further details, see the “DF Reward Function Details” in DF Main Appendix.

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behavior. Updates are always announced at the beginning of a round, if not sooner.

Conclusion

DF37 has completed. To claim rewards, go to DF dapp Claim Portal.

DF38 begins May 18, 2023 at 12:01am UTC. It ends May 25, 2023 at 12:01am UTC.

DF38 is part of DF Main. For this phase of DF Main, the rewards budget is 150K OCEAN / week.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF37 Completes and DF38 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Onfido Tech

Remote design sprint: learnings from facilitating


Are you wondering what it’s like to facilitate a remote design sprint? Are you searching for ways to make your sprint work? If yes, then this post is for you! If not, no problem; you can still enjoy the tales of a designer going through his first remote design sprint 😀

I’m a Senior Product Designer at Onfido, an identity verification provider. We used a design sprint format to validate an idea that wasn’t part of our quarterly objectives. Going through the sprint process aligned stakeholders on the overall goal and made everyone part of the solution.

As you might know already, the design sprint process is well documented, and you can find plenty of resources across the World Wide Web. So I decided I would add to the pile myself! I listed the most useful resources at the end. If you’re searching for good material to prepare for your sprint, hopefully you’ll find the following bits of advice and learnings useful.

Screenshot of the completed Miro board :D

Preparation is key

Make it easy for yourself to facilitate. Prep in advance for every voting activity, timer, board, etc. You should be able to arrive at the activity and know the steps to follow just by seeing the board. I used this template made by Jake Knapp and Stéph Cruchon 👉 https://miro.com/miroverse/design-sprint-jake-knapp/. I did modify it to include process information like timing and number of votes per person.

Remote means you’ll be on Zoom (video call) all day. PLAN LOTS OF BREAKS! It goes without saying, but a 10–15min break can save so much mental space and keep your sprint team focused and attentive. It also gives people space to answer some Slack messages, emails, you know … manage their “real work”. One break every hour, give or take, is what we implemented and it felt right.

Keep the book near you! Or any activity reference on your computer. You don’t need to know everything, because you will forget it anyway. Before each session, I would skim through the reference to refresh my memory. That helped me know each task better and give directions easily.

When the facilitator forgets the only material needed…

Quick fun fact: I only asked one thing for the team to prepare. Have A4 white paper and a black pen for the Sketching day. 15min before the sketching exercise, I realise… I don’t have A4 paper myself 🤦 So I ran to buy a block of 360 A4 paper… (sorry for the “ill”, I was “race walking” and typing on my phone). Told you you’re going to forget things!

Being adaptable is priceless

Problems will surely arise, and you need to be okay dealing with them. A debate that goes on for too long, an unplanned drop-out, an unavailable expert/user … the list goes on. Whatever you plan will be put to the test. Knowing things will change allows you to remove the pressure of perfection. Keep in mind that the sprint must go on!

It is fine to take a break in the middle of an activity to check things (I learned that from trying to be a dungeon master 😉). If you don’t know something, take a 5min break and check your reference or find a solution to the problem. No shame in saying, “Sorry, I don’t know how to proceed right now; let’s have a small break so that I can review the process.” Another solution is to ask your group! How would they prefer to solve the issue? How would they change the activity? What information do they need?

Not so-quick fact: On Wednesday, you decide many things, from the solution to the flow of the prototype itself. And be sure that your team members will try to modify the decision you made 30 minutes, 1 hour ago. During the storyboarding activity, our group was stuck on “what ifs” and “why not change that”, deep inside conversations about very specific and technical subjects. I was boiling inside as the decisions weren’t happening! Seeing that we were not going anywhere, I put a hard stop on conversations and gave a 10min break for everyone. It allowed me to take a breather, think and regain the lead of the conversation. Sometimes cutting short helps you move forward.

Time pressure is a stress you will experience. The best advice I can give is: “If you are in a hurry, then slow down.” (unsure where this quote comes from, either from a German or Chinese proverb if you ask Google)

The importance of the team you choose

As important as picking an impactful challenge, you need the right people on your team. A person who cares about the issue will show engagement and motivation. These are powerful drivers for a successful sprint. Team members will stay focused, engage in conversations, and overall put energy into doing things well.

Perspective diversity should come with relevant team members. I’m glad that everyone in the sprint shared and learned something! Humans are curious by nature, so having different points of view will generate engagement through learning and rewards through sharing.

Having people with specific skills can help you design a feasible solution. Luckily we had 2–3 engineers on our team who helped us create a close-to-reality prototype! An interactive, half-coded prototype through Framer suited our use case. You shouldn’t base your choice on this, though.

Engage the team when you can

Within exercises, give them roles or ask them to do some tasks. The bible for remote design sprints referenced below advises it. That was also the first feedback I received on the first day: “When you organise post-its or move things from one board to another, you can ask the group to help. You don’t have to do it by yourself! It’ll save time.” And they were right! I decided to involve them more in the day-to-day logistics. Involving them lowered my stress level and slightly improved their attention and engagement in the activity.

Make each person in the team shine during the week

Everyone has a skill or personality trait that can make the week easier or the work better! An engineer solving a problem? Give them a shout-out at the end of the day. Does a Product Manager have the best voice and temper for testing? Let them interview. A very knowledgeable team member? Give them 30min to share their knowledge.

Let them shine even outside of activities! I used ChatGPT to create “inspirational sentences” for every day of the week: one in the morning within an automated Slack message and one at the end of the day in the meeting. One person in our group started to ask for a sentence written in a specific style (like pirate or medieval), which resulted in some excellent impressions and stylistic readings of these quotes. I started asking them what style they wanted each day, as well as letting this person have a little show. That was a great moment for the team and perfect to end each day on a positive note!

Thursday’s end-of-the-day 🤖 chat GPT quote in an evangelist priest style:

ChatGPT conversation to create an inspirational sentence for my design sprint team on Thursday after finishing a prototype in one afternoon. TIPS: don’t use the word “quote” if you want something like this. Otherwise, it will get you a quote from someone, not “create one”.
“Hallelujah, my brothers and sisters! Let us gather ‘round and bear witness to the miracle that is about to unfold before us. For in this very afternoon, through our unwavering faith and tireless labor, we shall bring forth a prototype that shall reign supreme and conquer all doubters! So let us join our hearts and minds together, and let the power of our convictions guide our hands as we march towards glory! Amen!”

Was our solution the perfect one? No. Did the week help us validate our idea? Mostly. Did we all learn something? Absolutely! This is a reminder that a design sprint is not a process in which you will end up with a fully ready-to-build product. Insights and learnings are the primary output. These will inform a product decision: iterating or building. In our case, we learned that our direction was good, but we needed to refine the solution itself.

Overall, facilitating a design sprint is an excellent exercise for designers. The one we did was a one-off “bubble week” for me. By that I mean facilitating a design sprint for a team I didn’t work in. This allowed me to focus on the process and facilitation aspects of product design.

Ultimately you practice leading exercises, people and projects using lots of communication skills. Product design is about working with different people, so these skills are useful! I encourage everyone to try a design sprint; it’s a great experience! And obviously, have fun with it.

Thanks for reading 🙂

Sharing is caring! So here is a list of handy references that helped me:

The bible for remote design sprints (this is a must-read) by Jake Knapp, John Zeratsky and Jackie Colburn: https://www.thesprintbook.com/articles/remote-design-sprint-guide

Each day in detail by John Zeratsky:
Monday: https://medium.com/gv-library/sprint-week-monday-4bf0606b5c81
Tuesday: https://library.gv.com/sprint-week-tuesday-d22b30f905c3
Wednesday: https://library.gv.com/sprint-week-wednesday-900fe3f2c26e
Thursday: https://library.gv.com/sprint-week-thursday-df8d7c8c0555
Friday: https://library.gv.com/sprint-week-friday-7f66b4194137

Note-N-map by Stéph Cruchon: https://sprintstories.com/the-design-sprint-note-n-map-a9bf0ca88f51

Explain the How Might Wes using this article by Maria Rosala: https://www.nngroup.com/articles/how-might-we-questions/

Storyboarding 2.0 by Tim Höffer: https://sprintstories.com/storyboarding-2-0-4e282b2da94d

Remote design sprint: learnings from facilitating was originally published in Onfido Product and Tech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Meet the Team: Head of Social Media, Christina Lu

What’s your name and where are you from?

My name is Christina Lu and I joined Ontology in June 2018, shortly before Ontology’s mainnet launch.

Tell us a bit about yourself. What did you study? What are your hobbies?

I have a Masters degree in Conference Interpreting and Translation Studies from the University of Leeds. I felt that this degree was the right fit for me as I’m multilingual, a fast learner, and adaptable. In the same vein, traveling and exploring new places is one of my favorite things to do as I like to experience different cultures and see what various cities have to offer.

What kind of work do you do on a day-to-day basis?

I am Ontology’s Head of Social Media. In this role, my main duties include: developing social media and marketing strategies to drive engagement and increase followers; creating content and campaigns; analyzing social listening and monitoring data; and conducting internal and external communications, including but not limited to media, KOLs, partners, exchanges, and event organizers.

In your opinion, what makes Ontology stand out from other blockchains?

What makes Ontology stand out from other blockchains is the fact that it has remained focused on Decentralized Identities (DID) and building the infrastructure and tools for Web3 for the past five years. We employ an exceptional tech team and a devoted marketing and business development team, who are dedicated to our overall vision.

What is the most exciting part of the project you’re working on for Ontology?

Ontology offers individuals new opportunities, bringing awareness to DID and giving users back control in order to manage their own personal data and online identities.

What has been the most important moment in your career so far?

While I have reached many notable milestones throughout my time at Ontology, the most prominent is building our global community from 3 languages to over 30 languages worldwide. Additionally, the creation of the Harbinger program is an important moment in my career to date. Essentially, it is a network of passionate blockchain and Web3 lovers and contributors. Launched in 2020, it has played a major part in helping us build a strong global community presence.

What are you most excited for in your future at Ontology?

I’m most excited for more applications for DID and users owning and having complete control over their own data and identities.

As we mark Ontology’s five-year anniversary, where do you see Ontology and Web3 going in the next five years?

In the future, I anticipate that Ontology will be leading Web3 when it comes to DID. For Web3 more generally, I expect to see mass adoption of Web3 technologies. Just like we witnessed with Web2 and the rapid growth of smartphones and social media, individuals will realize that it’s essential to adopt this technology and understand what it can offer.

Contact us

- Ontology official website: https://ont.io/

- Email contact: contact@ont.io

- GitHub: https://github.com/ontio/

- Telegram group: https://t.me/OntologyNetwork

Meet the Team: Head of Social Media, Christina Lu was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

Aleph Zero to Integrate with Coinfirm to Enhance Transactional Security, AML with Advanced Analytics


Aleph Zero, an innovative layer 1 blockchain with privacy-preserving features for enterprise use cases, and Coinfirm, a global leader in blockchain analytics and AML (Anti-Money Laundering) solutions, have started collaborating on the implementation of Coinfirm’s advanced AML transaction monitoring and wallet screening capabilities into the Aleph Zero blockchain ecosystem.

The partnership will enable Aleph Zero users to get enhanced security and compliance solutions by leveraging Coinfirm’s powerful analytics tools. Coinfirm’s AML platform will monitor and analyze transactions on the Aleph Zero blockchain, helping to ensure regulatory compliance and protect against illicit activities such as money laundering and fraud.

Antoni Żółciak, Aleph Zero’s co-founder, commented on the partnership, saying, “We are excited to work with Coinfirm, a pioneer in blockchain analytics and AML solutions. Their expertise will be instrumental in strengthening the Aleph Zero ecosystem and ensuring the platform remains secure and compliant in multiple jurisdictions. This partnership demonstrates Aleph Zero’s commitment to providing a secure environment in the realm of private DeFi.”

Coinfirm will analyze transactions on the Aleph Zero blockchain, including transaction amounts, sender and receiver addresses, and other relevant metadata.

Coinfirm is dedicated to ensuring that its analytics tools do not compromise user privacy. By focusing on transaction patterns and risk factors associated with illicit activities, Coinfirm’s Analytics platform effectively maintains regulatory compliance without collecting personal information about individual users. This approach ensures that users can enjoy the benefits of enhanced security and compliance within the Aleph Zero blockchain ecosystem while preserving their privacy. 

Jacek Trzmiel, Head of Business Development at Coinfirm, also shared his thoughts on the collaboration: “We are thrilled to contribute to the Aleph Zero ecosystem which pushes the boundaries of blockchain technology. Our partnership will bring Coinfirm’s robust AML and analytics solutions, further enhancing its security and compliance. We look forward to working closely with Aleph Zero and contributing to their growth and success.“

The integration of the Aleph Zero blockchain native asset AZERO into Coinfirm’s AML transaction monitoring and wallet screening tools is expected to be completed in the coming months.

About Aleph Zero:

Aleph Zero is an enterprise-ready, high-performance blockchain platform with a novel, Directed Acyclic Graph (DAG)-based consensus protocol that enables instant finality and Liminal, a privacy-preserving framework that combines ZK and MPC technology.

For more information visit: www.alephzero.org 

About Coinfirm:

Coinfirm is a global leader in blockchain analytics and AML solutions, providing comprehensive services to various industries, including financial institutions, governments, and blockchain-based businesses. Coinfirm’s advanced AML platform offers real-time transaction monitoring, wallet screening, and risk assessment to ensure compliance with global regulations and safeguard against illicit activities.

The company, using 350+ proprietary risk algorithms while monitoring 25k+ blockchain entities, provides seamless, scalable tools to comply with stringent regulatory requirements for both CeFi and DeFi.

For more information visit: www.coinfirm.com

The post Aleph Zero to Integrate with Coinfirm to Enhance Transactional Security, AML with Advanced Analytics appeared first on Coinfirm.


KYC Chain

Regulation Focus Series | Article 5: Switzerland & FINMA

While it's mostly known for being a picturesque, mountainous country with one of the most advanced financial sectors in the world, Switzerland is also home to a dynamic virtual asset space. And although the country may have a reputation for banking secrecy, Swiss financial institutions and VASPs must adhere to strict KYC regulations designed to prevent money laundering and other illegal activities.

Wednesday, 17. May 2023

Identosphere Identity Highlights

Identosphere 134: Next 3 Weeks: 1)MEF Connects London/virtual May 25: 2) MyData Helsinki: 3) Digital Identity unConference Europe, Zurich. US Strategy includes ID as Critical Infrastructure

Upcoming events, company news, organizational updates, developments in standards and specifications, and everything related to decentralized identity and verifiable credentials. Support us on Patreon!
Identosphere’s Weekly Highlights:
We Gather, You Read! We’ll keep aggregating industry info. Show support via PayPal or Patreon!

Upcoming

[Brussels] Blockchain-Based Identity Management Systems: Opportunities and Challenges 5/24 - CPDP gathers academics, lawyers, practitioners, policy-makers, industry and civil society to discuss the latest emerging issues and trends

Web 3.0: Self Sovereign Identity for Startups 5/24 - Panel at Wacom Europe GmbH

Heroes of Data Privacy – the practise-oriented data privacy conference in Vienna LionsGate 5/24 Vienna, Austria

[Virtual Option] London Event: MEF CONNECTS Personal Data & Identity 5/25 Identity Praxis, Inc. ←-Rebekah CEO of Numeracle presenting

How MyData conference became the most impactful event for personal data sharing 5/31-6/1

did:hack - decentralized identity hackathon 6/6 

Digital Identity unConference Europe 6/7-9 Zurich ← Registration open

[NYC] Velocity Network Foundation® 2023 General Assembly 6/19-20 

[Amsterdam] The Closing Conference of the Blockchain & Society Policy Research Lab [Call for Papers] 7/3-4

[New Zealand] Digital Trust Hui Taumata: Registrations now open! 8/1

How To DID Delegation OwnYourData

DID Delegation involves a few key steps, which are described below using the command line utility oydid:

Explainer

[Webinar] Verifiable Credentials and Digital Wallets: An Overview with Erik Scott CI Compass [video] [slides] -”How do you prove who you are or what you are without giving away the keys to the castle?”

[tweet thread] One of the biggest benefits of self-sovereign identity is that it eliminates the need for usernames and passwords Mahir Şentürk

Privacy in Your Hands: exploring the Power of Prifina with Valto Loikkanen IdentityPraxis

how data can automatically be gathered from sensors and IoT devices. We review the concepts of a PIMS, a personal data store, and demonstrate Prifina’s integration with OpenAI’s Chat GPT and how individuals can seamlessly use a personal artificial intelligent agent to mine and get real-time value from their data

Verifiable Credentials for the Modern Identity Practitioner KuppingerCole

despite knowing your way thru identity, you still can't really tell how they work in practice or how the boldest claims (no more centralized DBs! Apps cannot save PII!) will play out. This session will dive into VCs and separate the hype from their true, remarkable potential.

Company News

[Thread] GateKeeper Beta is LIVE! GateKeeper

GateKeeper is a Digital Identity Aggregator that aims to fix the issue of fragmented #DigitalIdentity Data points. Its primary use case is the ability for users of GateKeeper to Issue #verifiableCredentials across multiple DID methods with ease.

The Secure Middleware KYC/AML Solution for International Businesses exeraID

without holding users’ data, automatic detection of non-compliant users, and ongoing monitoring of user accounts make it an ideal solution for international businesses to comply with KYC/AML

Threshold Signatures for Secure Multi-Party Credential Transactions Dock

For example, let's say a private key is divided into five shares, and the threshold is set at three. This means that any three out of the five parties can come together and combine their shares to create a valid signature.
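The 3-of-5 idea in the quote above can be illustrated with plain Shamir secret sharing (a toy sketch, not Dock's implementation; real threshold signature schemes apply the same interpolation to partial signatures so the private key is never assembled in one place, and the prime and function names here are illustrative):

```python
import random

# Illustrative prime field; real schemes use the signature curve's group order.
P = 2**127 - 1

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with f(0) = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0) = secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

Any three of the five shares interpolate back to the secret, while fewer than three reveal nothing about it.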

[video] Workplace Verification on LinkedIn with Microsoft Entra Verified ID Tech Mind Factory

This video explains  how LinkedIn members can verify their place of work with a Microsoft Entra Verified ID Verifiable Credentials.

Microsoft Entra Verified ID Core Concepts and Use Case

Spruce Developer Update #31 - our latest development efforts: SSX, SIWE, Kepler Storage, SpruceID, DIDKit, TreeLDR, Rebase

Adoption

Idendi3 Podcast: Sarah Clark, Senior Vice President of Digital Identity at Mastercard Dock

Sarah's team is building and leading the development of Mastercard’s new digital identity network which will not be directly related to Mastercard’s payment network at all. The company plans for its digital IDs to be reusable for in-person interactions, online, through the phone, and other channels. 

[KYC, KYB] Deloitte Integrates KILT Identity Blockchain, Creating New Markets with Digital Credentials Deloitte

Deloitte is providing a credential wallet in the form of a browser extension. [...] “By offering reusable digital credentials anchored on the KILT blockchain, Deloitte is transforming verification processes for individuals and entities,”

[Canada, Report] Perspectives on the Adoption of Verifiable Credentials DIACC

examines relative interest levels among industries using verifiable credentials as part of their identity management offerings. This report also provides a brief discussion of critical issues and challenges associated with representative verifiable credentials [...] concerning the general use of verifiable credentials in Canada's digital identity ecosystem.

Hedera Hashgraph Joins World Wide Web Consortium (W3C); New DID Method Published by W3C Credentials Community Group Hedera

The Hedera DID Method leverages the Hedera Consensus Service and can consequently support transaction volume of tens of thousands of identity anchoring operations per second - opening up use cases like the Internet of Things to the Decentralized Identity model.

Organization

[OASIS] OASIS launching new standards effort Lightweight Verifiable Credential Schema and Process (LVCSP) Technical Committee (TC)

TC seeks to define a lightweight identity credential schema that will allow people (VC subjects) to share their verified identity attestations across various platforms and services [...] and we invite you to consider being a part of it.

[OASIS] Lightweight Verifiable Credential Schema and Process Identiverse

we cover the scope, purpose and deliverables of a new OASIS working group that aims to define a lightweight identity credential schema [...] The work will cover KYC, KYB and Financial institutions related credentials.

What is “Good” Digital Identity? DIACC

digital identity is evolving along with the responsible tech movement, itself growing and already impacting Artificial Intelligence (AI) governance, as seen in the principles of the UNESCO Recommendation on the Ethics of Artificial Intelligence and the EU AI ACT. 

The Revolution is Here: An Interview With Daniel Goldscheider, Founder of the OpenWallet Foundation IdentityPraxis

learn about the OpenWallet Foundation, the benefits of open-source SmartWallet capabilities, and how to get involved (Here is a hint: it is open, and it is free to contribute and use code; there is a nominal fee for commercial organizations to participate in the community governance).

Policy

United States Government National Standards Strategy for Critical and Emerging Technology WhiteHouse.gov

The United States will prioritize efforts for standards development for a subset of CET that are essential for U.S. competitiveness and national security, including the following areas:

Digital Identity Infrastructure and Distributed Ledger Technologies, which increasingly affect a range of key economic sectors; 

Democratic Rep Says Self-Custody Wallets Should Have Federal Digital Identities Blockworks

is there any alternative to having both sides of every crypto transaction associated with a traceable digital identity and have that digital identity issued by a government with which we have extradition treaties and a common concept of financial fraud

FTC to Issue Policy Statement On Biometric Data, during May 18 Open Commission Meeting FTC

The proposed amendments would help clarify technologies and entities covered by the Rule, facilitate greater electronic breach notices to consumers, and expand the required content of the notices, among other changes.

Trade Regulation Rule on Commercial Surveillance and Data Security, 16 CFR Part 464 FTC.gov

the Commission invites comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive. 

myGov gets Budget certainty, new money flows to Digital ID InnovationAus

The government has also committed $26.9 million to “develop the stage of the Digital ID program”, which will mostly be used by the Department of Finance and the Digital Transformation Agency to maintain the current Digital ID system and design “policy and legislative foundations”.

European Digital Identity and the Velocity Network: Building a Secure Future for Digital Credentials Velocity Network Foundation

The ARF establishes a framework for integrating various credentials and ecosystems into EUDI Wallets, which combine elements of EIDASv1 with SSI standards from organizations [like W3C, DIF, IETF, OIDF, & ISO]

What is Nigeria’s blockchain policy, and why should you care? Techpoint Africa

the blockchain adoption strategy [..] will guide the adoption of blockchain technology [... including …] Identity data management; identity management system can evolve into a self-sovereign identity approach where everyone can take charge of their data through blockchain.

Business [Video] Analyst Chat #172: Trends and Predictions for 2023 - The Business Case for Decentralized Identity

one of the concerns a lot of organizations have [is that] decentralized identity is a totally different way of doing identity management. No, [...] It's an additional way for receiving context data that can be used for syndication decisions.

Why the FTX Collapse Was an Identity Problem Dark Reading

At a US House Financial Services Committee hearing last December, John J. Ray, the new CEO of FTX, admitted to lawmakers that there had been "no record keeping whatsoever," and confessed that the crypto exchange had essentially engaged in "old-fashioned embezzlement."

Infrastructure: Revolutionizing the Global Economy Finance Magazines

need to prioritize data privacy and security, offering customers transparent control over their financial data. Collaboration with decentralized platforms and integrating Web3 technologies will be crucial for banks to remain competitive. This might involve partnerships with blockchain startups, exploring decentralized lending and borrowing solutions, or implementing [SSI] frameworks.

How Verifiable Credentials Can Empower Small Businesses Indicio

none of this has to be complicated or particularly expensive. Whether you see your business as big or small, verifiable credentials offer a way to streamline several different aspects of your operations and make your life easier.

‘Data wallets’: a trap waiting to be sprung? Alan Mitchell

you might conclude, if you built a skyscraper ten times higher than the current one, then you’d be ten times closer to your goal. [...] Much of today’s fashionable talk about organisations providing their customers with ‘data wallets’ is a classic example of the First Step Fallacy at work. It’s a trap waiting to be sprung.

Understanding Consumer Attitudes Towards Modern Identity Auth0

let’s talk about methodology. Okta’s Customer Identity Cloud Unit (CICPU) conducted a series of 60-minute Zoom research sessions with consumers from around the globe. 

Web 3 Web 3.0 Marketing and Establishing Authentic B2C Connections with Virginie Glaenzer IdentityPraxis

For the first time, the people, not platform providers, will bring and manage the identities and personal data that will flow across the wires and through the air. The discussion provides a fascinating look into Virginie’s journey and offers valuable lessons for aspiring entrepreneurs, marketers, and investors.

“Let MiCA do its job”: Industry experts praise framework but issue stark warning.

“If we are going to have a successful crypto hub – wherever that is, then we need to instil certain rules and regulations that enable a functioning and healthy ecosystem, and KYC will be integral part in facilitating this,” added Rayissa.

Polygon ID and Verida Make Zero-Knowledge Credentials Accessible to All Verida

Verida has integrated Polygon ID wallet SDKs to become the first consumer-ready mobile crypto wallet to support Polygon ID zero-knowledge (ZK) credentials.

Decentralized Web BLUESKY EARLY GROWTH The Stack Report

being an open protocol though, Software Developer Zhuowei Zhang was able to get access to the posts and users data and published a Bluesky data dump on his blog.

[video] Expert Webinar: How to Build Aviation’s Identity-First Future FindBiometrics

By the end of the decade, the travel experience will allow travellers to enroll their biometrics and identity data [...] speed through security without invasive measures, [...] and board their plane in record time without having to dig through their belongings for their physical ID.

Identity not SSI 

ŌURA Acquires Proxy in All-Equity Deal: Acquisition opens the door to payments and authentication through Proxy's identity platform and smart ring patent portfolio

[Identiverse Trends Report] Growth, acceleration, and safety: Identiverse Trends Report (the report) SC Media

Open Source Summit 2023 (LF) Lessons Learned in Building an Interdependent Open Source Team OSS2023

● Alluxio Open Source Intro 
● The Mission(s) for Open Source Team 
● Open Source Team Explained 
● User Types and Community Growth Flywheel
● Building Blocks for an Open Source Team 
● Resource Planning, & Metrics 

Open Source is Winning but we could Still Lose OSS2023

● We ARE in competition with commercial software
● How do we promote and position ourselves? 
● Marketing is not a dirty word.
● Improve overall experience, overall image 
● Better coordination along entire process CI/CD to include SBOM checks 
● Better communication 

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net
Please support our efforts by Patreon or Paypal
Contact \ Submission: newsletter [at] identosphere [dot] net


Shyft Network

The Shyft Perspective: Hong Kong’s New Crypto Climate


The estimated global crypto ownership rate is around 4.2%, with more than 420 million crypto users worldwide. Although Hong Kong falls beyond the global average with an ownership percentage of 2.35%, more than 175,000 people own cryptocurrency there.

With the new regulations set to come into place in June 2023, industry experts, analysts, large investors, individual participants, and crypto enthusiasts are all busy assessing the potential challenges and opportunities that lie on this regulated road ahead.

Understanding the new climate

The new regulations come with the final guidelines for crypto exchanges looking to launch in Hong Kong this May.

According to a regulatory synopsis issued on 27th April 2023 by Ms. Julia Leung, the CEO of Hong Kong’s Securities and Futures Commission, the SFC is “working with global counterparts to set baseline standards to regulate centralized virtual asset exchanges for adoption in major markets.”

The commission held public consultations on crypto exchange regulations last year to determine the best way for retail investors to access cryptocurrencies. It also wanted to test the feasibility of offering crypto exchange-traded funds (ETFs) in the geography. Indicating considerable buzz for the framework to come into place, the SFC received over 150 responses from the public.

The new regulations are also considered a milestone as they would let investors trade in major cryptocurrencies like Bitcoin and Ethereum from June 1st this year.

More on the topic: Hong Kong’s Proposed Asset Regulation Sparks Industry Optimism

Current crypto situation

The new regulations will considerably impact the crypto climate in Hong Kong, presenting a significant market opportunity for the crypto industry. According to Chainalysis, cryptocurrency values received by internet addresses in Hong Kong were around US$70 billion in the first half of 2022 alone.

The New Regulations

The new regulations come under the legislative purview of the "Anti-Money Laundering and Counter-Terrorist Financing (Amendment) Bill 2022."

According to its stated objectives, the bill wants to amend the Anti-Money Laundering and Counter-Terrorist Financing Ordinances (AMLO) to establish a licensing system for Virtual Asset Service Providers and apply the customer due diligence and record-keeping requirements under Schedule 2 to AMLO to VASPs.

Impact on Cryptocurrency Exchanges

Once the new regulations come into effect, crypto-related service providers will require an SFC license to operate in Hong Kong. And to obtain a license, they would have to furnish a detailed plan for combating money laundering risks and protecting their investors from such breaches and intrusions.


In a broader sense, the new regulations would help set standards and benchmarks for crypto regulations in Hong Kong, a region actively pursuing the goal of becoming a crypto hub — not only in Asia but at a global scale — for some time now.

Numbers suggest that over 80 foreign and Mainland China companies have already expressed interest in setting up their Web3 company through this licensing mechanism. The interest indicates positivity towards norms that advocate system regularization and protection of investor rights and interests.

Comparison With Previous Regulatory Frameworks

Many new regulations come at the cost of destabilizing the traditional system, but that is not the case with the Hong Kong crypto licensing rules, as they would subject providers to the same AML and counter-terrorist financing legislation that traditional financial institutions follow. Hence, one can expect a seamless, or at least considerably less frictional, transformative process when the rules become effective from June 1st onwards.

Implications for Cryptocurrency Exchanges

Although the new regime is set to come into effect from June 2023, at least 23 businesses, out of the 80 that expressed interest, according to Hong Kong’s Secretary for Financial Services and Treasury Christopher Hui, are already setting up their presence in Hong Kong. And these 23 not only include crypto exchanges but Web3 security companies, blockchain payment providers, blockchain infrastructure builders, and more.

Adaptation strategies of local exchanges

Adapting to the imminent regulatory shifts and challenges could be straightforward for local exchanges if they fulfill the core demand: demonstrating the capability to perform due diligence on their customers and to meet Hong Kong's AML and anti-terrorist financing standards. Only then does obtaining a license become easier.

Potential challenges and opportunities

The potential challenge may arise from something common to most transformative processes worldwide: arranging provisions for compliance and setting up a robust mechanism for adequate due diligence. It involves upskilling the existing compliance team so that it understands the new requirements and builds systems that satisfy them.

What awaits providers at the end of the compliance and diligence tunnel is enhanced trust from consumers, increased transparency in the system, and the prospect of a larger audience base.

The Hong Kong authorities are also coming out with increased support for virtual asset service providers willing to pass through this regime. For instance, Hong Kong has allocated HK$50 million in its annual budget spending to the growth of the Web3 ecosystem.

Comparative analysis with other jurisdictions

Similar to Hong Kong, many adjacent and global jurisdictions are also proactively taking up the job of regulating virtual assets and the Web3 economy. And crypto entrepreneurs believe that, of Singapore and South Korea, Hong Kong's new crypto regulations are much more similar to Singapore's approach. Both are focused on protecting investors, institutional and retail alike.

Global and Asian Crypto Climate

Like Singapore and South Korea, being part of Asia helps Hong Kong participate in the most thriving crypto market globally. Out of more than 420 million estimated crypto owners worldwide, more than 260 million are from the Asian market itself.

Key trends in global cryptocurrency adoption

It has to be kept in mind that Asia is also the world's most populous continent. However, other continents are also picking up speed in terms of adoption and traction.

The number of owners in North America is more than 54 million, followed by Africa (38 million+), South America (33 million+), Europe (31 million+), and Oceania (15 million+).

In Asia, while regions like Singapore and Hong Kong are keen to encourage growth in the Web3 ecosystem, some countries like South Korea want to tread cautiously. Hong Kong, on the other hand, is moving forward in a structured way. The current exchange licensing system will be followed by stablecoin regulations in 2024.

Influence of international regulations

Along with its Asian counterparts, there are many countries or regions which are considering introducing similar regulatory systems.

For instance, the European Parliament has already approved the first set of comprehensive rules to regulate cryptocurrency markets, known as MiCA or Markets in Crypto Assets Regulation.

Dubai has also set up the Virtual Assets Regulatory Authority as a unique body regulating Virtual assets across Dubai.

All these international regulatory institutions are expected to duly influence the crypto market in Hong Kong.

More on the topic: EU Parliament Greenflags MiCA — Potential Impact on the Crypto Landscape

History of Cryptocurrency in Hong Kong

Hong Kong, for some time now, has been looked at as one of the most preferred crypto destinations in the world. In fact, the Worldwide Crypto Readiness Report termed Hong Kong as the most crypto-ready location in 2022.

In 2022, Hong Kong topped the charts in all categories, including the number of blockchain startups per 100,000 people and the number of crypto ATMs proportional to the population. The region was ahead of some of the most thriving global economies, including the US and Switzerland.

Over the years, multiple local crypto companies have emerged in the region, such as Kikitrade, Saxo Crypto Products, MYETHSHOP, CoinUnited.io, and other cryptocurrency trading and educational platforms, hubs, and more.

Regulatory Framework and Government Initiatives

We have already discussed the form and potential use cases of the upcoming crypto licensing mechanism in Hong Kong. It is also to be noted that the region’s AMLO aligns with the global recommendations of FATF. It requires VASPs across countries to adhere to an anti-money laundering benchmark in countering fraud, breaches, and financing terrorism regulations.

The Securities and Futures Commission is the primary body looking after the execution of the imminent licensing regime in Hong Kong.

Securities and Futures Commission (SFC)

Founded in 1989, the Commission’s job is to work as an independent statutory body to regulate Hong Kong’s securities and futures markets. Its principal responsibilities are to maintain and promote fairness, efficiency, competitiveness, transparency, and orderliness in the securities and futures industry.

Anti-Money Laundering and Counter-Terrorist Financing Ordinance (AMLO)

One of the most crucial pieces of legislation the SFC relies on in regulating the crypto industry is the AMLO. In the Hong Kong special administrative region, the legislation comes with an ordinance, which amends the laws to apply customer due diligence and record-keeping requirements to virtual asset service providers.

Government support and initiatives

While regulations are set in place, the Hong Kong government is also keen to keep its support window open for the holistic growth of the Web3 ecosystem.

Cyberport came up with Web3@Cyberport in the early half of the 2023–24 financial year. According to the official budget document, HK$50 million was allocated to expedite the Web3 ecosystem development by “organizing major international seminars” and promoting “cross-sectoral business cooperation.”

Collaboration with Mainland China

When it comes to calibrating the region’s crypto strategies with that of Mainland China, Xiao Feng, the Chairman of the Hong Kong Crypto exchange HashKey, believes that the Hong Kong government’s emphasis is on practicing different laws from mainland China under the “One Country, Two Systems” framework.

Market Opportunities and Key Players

The SFC list of licensed virtual asset trading platforms in Hong Kong includes OSL Digital Securities Limited and Hash Blockchain Limited. Invariably, the list of such exchanges is set to grow in the coming days, as local exchanges come into compliance with the regime, increase their market share, and accelerate growth.

Brands like Hex Trust, a fully licensed and insured digital asset custodian, are also active in Hong Kong.

NUTS Finance is a blockchain development lab active in Hong Kong that focuses on building secure, composable, open-source technology to enable decentralized finance services. DTTD, another Hong Kong-based service, makes managing NFTs across multiple crypto wallets easy. Xversem, a Hong Kong-located service, aims at building effective Bitcoin wallets for Web3. Hong Kong-based Pictta is a mobile-first, gasless, and social-enabled NFT marketplace.

Such examples are many, and the bouquet of such service & product companies is bound to grow in the coming days.

Challenges and Risks

It is evident that implementing the licensing mechanism will come with its share of challenges and risks.

Any new regulation introduced in an existing market is bound to create friction in the system. Companies will have to exhibit to the regulators and their consumers that they can do due diligence and protect the investor from potential hacks, attacks, breaches, laundering, and other possible mismanagement of funds. At the same time, they will have to comply with global benchmarks for protecting user data.

Necessarily, it would have to be a consistent effort, demanding a fixed cost to be incurred by businesses every quarter. There are chances that occasional volatility in an emergent market like crypto will take a toll on the company’s bottom line. It is essential to have efficient fund management practices in place so that companies do not lack in their compliance and diligence efforts.

The People’s Republic of China’s Digital Yuan promised to achieve a “relatively high degree of financial inclusion” and aid in developing “various large-scale financial platforms” that reduce the cost of providing financial services. Its presence could also play a decisive role in shaping the future of crypto services in Hong Kong.

Future Outlook and Trends

Despite all these potential challenges, Hong Kong is looking toward a brighter future in crypto services. Blockchain and DeFi are high-potential areas with a lot to offer to people who still lack access to traditional finance avenues.

The merging of blockchain and crypto services into the traditional finance functions of lending and borrowing, underwriting, etc., has already started. One only needs to be careful of promises made to promote the new economic order that Web3 has the power to introduce through the decentralization route.

Final Note

The imminent regulations and licensing regimes in Hong Kong present both risks and opportunities to the crypto investor and user communities.

While regulatory compliance may slow the pace of innovation and raise costs, investors stand to gain from a more protective atmosphere in general. There is also scope for businesses to enjoy long-term premiums by offering enhanced trust and transparency.

Every stakeholder must stay updated on the shifts and changes happening in the regime. Staying informed helps one steer ahead in an evolving market with less friction and optimal cost-efficiency.

Mickey and the Hong Kong Crypto Quest

Additional Resources

https://www.legco.gov.hk/yr2022/english/counmtg/papers/cm20221207cb3-876-1-e.pdf

https://www.legco.gov.hk/yr2022/english/bills/b202206241.pdf

https://www.legco.gov.hk/yr2022/english/ord/2022ord015-e.pdf

The Shyft Perspective: Hong Kong’s New Crypto Climate was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

Newsletter Vol 53

The post Newsletter Vol 53 appeared first on Indicio.

Now on LinkedIn! Subscribe here

How Verifiable Credentials Can Empower Small Businesses

What can verifiable credentials do for you if you have a small business? Indicio Deputy CTO Sam Curren explains how this technology isn’t just for large organizations and, in fact, can deliver many benefits to SMEs.  

Read more Identity Insights — Proven Sandbox with James Schulte

Learn how you can use  the sandbox version of the powerful Indicio Proven ecosystem for verifiable credentials to explore, experiment, and test use cases from Indicio VP of Business Development James Schulte. 

Watch the video Seamless but secure: ChatGPT and data protection in travel

Join us Thursday, May 18, at 11am ET for a PhocusWire LinkedIn Audio Event — “Seamless but secure: ChatGPT and data protection in travel,” featuring Indicio VP of Communication and Governance Trevor Butterworth.

Event details Proven Works Demo

Watch a walk through of how you can use verifiable credentials to onboard and verify an employee, and how this technology can simplify your organization’s access management strategies.

Watch the video Want to see more weekly videos? Subscribe to the Indicio YouTube channel! News from around the community:

Monokee customer Gibus Spa takes home a best Enterprise Award in the Identity and Access Management (IAM) category at this year’s KuppingerCole European Identity & Cloud conference in Berlin.

FinClusive announces a partnership to embed essential compliance tools that allow businesses to transact faster and more securely.

IdRamp shares its thoughts on identity orchestration taking center stage at the RSA conference.

Upcoming Events

 

Here are a few events in the decentralized identity space to look out for.

Identity Implementors Working Group 5/18
DIF DIDcomm Working Group 5/22
Aries Bifold User Group 5/23
TOIP Working Group 5/23
Hyperledger Aries Working Group 5/24
Cardea Community Meeting 5/25



Ontology

Ontology Weekly Report (May 9–15, 2023)

Highlights

OKX Wallet now supports the Ontology Bridge! This collaboration makes it even easier to participate in the Ontology #EVM, ensuring a seamless experience. We’re committed to making Web3 as smooth as possible.

Latest Developments Development Progress

● We are 80% done with the high ledger memory usage optimization.

● We are 80% done with the EVM bloom bit index optimization.

● We are 65% done with the optimization of ONT liquid staking.

Product Development

● ONTO has announced the monthly report of April.

● ONTO has listed $BIBI on BNB Chain.

● ONTO has published a blog: Decentralized Identity (DID) in Crypto Wallets: A Game-Changer for Security and Privacy.

On-Chain Activity

● 164 total dApps on MainNet as of May 15th, 2023.

● 7,535,950 total dApp-related transactions on MainNet, an increase of 14,503 from last week.

● 18,683,191 total transactions on MainNet, an increase of 21,580 from last week.

Community Growth

● We started our Weekly Community Call Series and talked about the Bitcoin Miami Conference. Users shared their insights on the topic and actively engaged.

● We held our Telegram weekly Community Discussion led by Ontology Loyal Members, discussing Meme coins. Participants got the chance to win Loyal Member NFTs.

● As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

● Discover the convenience of exchanging $ONT on ChangeNOW! $ONT plays a pivotal role in empowering a decentralized, digital identity framework.

● Celebrating our integration with iZUMi! They now support assets on the Ontology EVM, paving the way for improved liquidity and a smoother experience.

● Loyal members Sasen and Furst joined the AMA with Crypto Wallet on behalf of Ontology, discussing the Ontology ecosystem and how ONT ID empowers the Web3 world.

● Continuing our ‘Meet the Team’ series, we’re very pleased to ask Ontology Harbinger Hamzat a few questions.

● It’s time for our latest OWN101, as part of our OWNInsights series. This week, we bring you “Merkle Tree”.

Contact us

- Ontology official website: https://ont.io/

- Email contact: contact@ont.io

- GitHub: https://github.com/ontio/

- Telegram group: https://t.me/OntologyNetwork

Ontology Weekly Report (May 9–15, 2023) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokeny’s Talent | Lautaro

The post Tokeny’s Talent | Lautaro appeared first on Tokeny.
Lautaro Giambroni is Product Owner at Tokeny.

Who are you?

Hi there! I’m Lautaro Giambroni, a Product Owner at Tokeny. I’m 27 years old and was born and raised in Buenos Aires, Argentina. I earned my engineering degree there and later pursued a master’s degree in financial markets in Spain.

From a young age, I’ve been interested in how technology and digital products shape the world and improve people’s lives. It’s a passion that has driven me to continue my career in this field, and in recent years, I’ve been closely following the exciting developments in the blockchain space.

What were you doing before Tokeny and what inspired you to join the team?

Before joining Tokeny, I worked in various companies related to the financial and banking sector. As a UX designer, product manager, and innovation consultant, I gained experience and insights into the industry. However, I couldn’t help but notice the outdated structures and practices that seemed to hinder innovation in the sector.

Seeking to make a real impact, I came across Tokeny. What really stood out to me was the innovative approach to leveraging blockchain technology to transform financial operations in companies, as well as their commitment to bringing the benefits of blockchain to society.

How would you describe working at Tokeny?

Working here is an awesome experience. The best part is the team I get to work with, who are not only highly skilled but also share the same passion to make finance easy-to-access and efficient for everyone via our leading tokenization platform. 

The work we do is demanding, but it’s also highly rewarding. Our product is growing at a fast pace, and keeping up with the disruptive innovations in the DeFi industry can be a challenge. However, seeing the team’s hard work come to fruition after a release makes it all worth it.

Being part of a company that is driving real change in the industry is something I’m grateful for every day. The tech industry is an exciting place to be, and being part of a team that is leading the way is truly amazing.

What are you most passionate about in life?

There are two things that really ignite my passion: meeting new cultures and learning new skills. I’ve had the opportunity to live in different countries, and that experience has fueled my desire to explore new places, try new foods, and meet people from all walks of life. In addition to that, I’m always looking to learn new skills. Whether it’s taking up a new hobby or improving my professional background, I’m eager to expand my knowledge and challenge myself. Lately, I’ve been getting into surfing, and I started practicing analogic photography.

Overall, I think it’s important to stay curious and keep pushing yourself beyond your comfort zone. Life is all about growth and learning, and I’m passionate about embracing that journey and seeing where it takes me.

 

What is your ultimate dream?

Over the past decade, my home country has faced one of the worst financial crises in its recent history, marked by high unemployment rates, inflation, and a lack of economic stability. This has caused many young people, including myself, to leave in search of a better future. It’s a difficult reality that I would love to see change.

I strongly believe that decentralized finance could have an impact. Especially in a country where the monetary policy has been mismanaged, and lack of transparency has led to a loss of trust in local currency. Ultimately, my dream is to contribute to creating better conditions for the people in my home country.

What advice would you give to future Tokeny employees?

As a future Tokeny employee, you will have the opportunity to learn and grow as a professional in a dynamic and innovative environment. We work hard to push the boundaries of what’s possible in the world of tokenization, and we’re constantly looking for new ways to innovate and make a difference.

So, my advice to any prospective employee is simple: join the team and be prepared to work hard, learn a lot, and make a real difference in the world of tokenization. We will be excited to have you on board!

What gets you excited about Tokeny’s future?

What excites me the most about Tokeny’s future is the tremendous potential for growth in the tokenization market. Over the past few years, we’ve seen significant growth in this space, and it’s estimated that the market will continue to grow at an unprecedented rate.

Being part of a company at the forefront of this market is truly amazing. Our platform is designed to empower firms seeking to digitize their operations and benefit from the latest advancements in technology.

I am proud to be part of a company that is driving real change in the industry, and I’m excited about the endless possibilities that lie ahead.

He prefers: Coffee over Tea, Book over Movie, Work from home over Work from the office, Cats over Dogs, Call over Text, Burger over Salad, Ocean over Mountains, Wine over Beer, Countryside over City, Slack over Emails, Casual over Formal, Crypto over Fiat, and Morning over Night.


The post Tokeny’s Talent | Lautaro first appeared on Tokeny.


KuppingerCole

Microsoft Putting Decentralized Identities Into Practice


by Martin Kuppinger

Decentralized Identities in all their facets have been a hot topic at the European Identity and Cloud Conference 2023 (EIC), which ran in Berlin last week. Just a few days before, Microsoft announced their support for Verified Workplace in LinkedIn using Microsoft Entra Verified ID.

Verifying the workplace

What Microsoft announced is focused on a particular use case, the verification of workplaces of LinkedIn members. It builds on multi-step verification.

The entry level is workplace email verification, which involves little friction. Organizations can verify the work email of their employees with a lightweight web app that builds on the Microsoft Entra Verified ID service, which in turn requires a Microsoft Entra Azure Active Directory (AD) tenant. In effect, this allows issuing verified credentials/IDs based on Azure AD or any other OpenID Connect (OIDC) provider.

In the U.S., Microsoft, together with its partner Clear, will also provide government identity verification bound to the workplace credential. This adds another level of proof.

The third step is a wallet for professionals that, in the first phase, can accept verified credentials. Enterprises can issue credentials to their employees, which can be shared with LinkedIn or with Microsoft Authenticator for verification beyond LinkedIn. This integrates both Microsoft Authenticator and LinkedIn: relying parties that talk to the Authenticator can also talk to LinkedIn, and the Authenticator can be used in extended verification scenarios.

Microsoft is basing this on the (still emerging set of) standards around decentralized identity, making it interoperable.

The first step into practice and towards critical mass

While the announcement is focused on a specific use case, it is an important step forward towards the ubiquitous use of decentralized identities:

- It demonstrates a concrete, practical application of decentralized identities.
- It builds on standards for interoperability.
- It is supported by a growing partner ecosystem.
- It neatly integrates with Microsoft Entra (including Azure Active Directory), LinkedIn, etc., and can be integrated with further solutions.
- It provides concrete value to the users.
- It helps in creating a critical mass of users.

At the EIC, I’ve illustrated my talk about decentralized identity in the enterprise with the employee and partner onboarding use case, where a verified government ID is used for initial onboarding and proofs of employment and job title/role are used for partner onboarding. In the public preview, Microsoft is demonstrating such a use case in combination with Entitlement Management, a feature within Microsoft Entra.

I’m looking forward to seeing broader adoption of the Decentralized Identity model across the globe. Approaches such as the one announced by Microsoft can help it happen earlier.


Ontology

Master the Swap: ONG to ONT via iZUMi Finance

A Step-by-Step Guide!

Embarking on your DeFi journey with Ontology just got easier! With the power of iZUMi Finance, you can now swap your ONG for ONT in a few simple steps. This guide will walk you through the process, helping you navigate the decentralized exchange (DEX) like a pro. So, whether you’re a seasoned DeFi enthusiast or a newcomer to the space, let’s get started and master the art of swapping!

Three steps to turn ONG into ONT via the DEX:

Step 1: Transfer ONG (Ontology) to ONG (ORC-20) with Ontology Bridge

- Go to https://bridge.ont.io and connect your non-EVM wallet and your EVM wallet. You should see your wallet addresses displayed once done.
- Click the ONG Transfer button as shown in the figure below.
- Choose Non-EVM in the Wallet column (your native address will then be recognized automatically) and fill in the EVM address to which the ORC-20 ONG will be sent.
- Click "Transfer" when everything looks good, and make sure you have enough ONG for the gas fee (0.05 ONG per transaction).

Step 2: Use iZiSwap to swap ONG (ORC-20) to WONT (ORC-20)

- Go to https://izumi.finance/trade/swap and click the "Connect Wallet" button in the top right corner to connect your MetaMask wallet. Also switch to "Ontology" in your wallet's network settings.
- Time to swap ONG into WONT. The top or "from" placeholder is the token you own (ONG in this scenario), while the second or "to" placeholder is the token you receive after a successful swap (WONT in this scenario).
- Double-check the amount you want to swap and make sure you have enough ORC-20 ONG for the gas fee (around 0.5 ONG per transaction).
- Click Swap and confirm the transaction. If the transaction succeeds, a message will appear in the bottom right corner of the page.

Step 3: Bridge WONT (ORC-20) to ONT (Ontology) with Ontology Bridge

- Go to https://bridge.ont.io again. This time, use the Token Bridge.
- Choose "WONT" in the assets list and ensure the asset will be bridged from ORC-20 to Ontology native.
- Double-check the amount and make sure you have enough ORC-20 ONG for the gas fee (around 0.15 ONG normally).
- Click Swap and confirm the transaction. Your ONT will be available shortly.

About iZUMi Finance

iZUMi Finance is a DeFi protocol providing one-stop Liquidity as a Service (LaaS), dedicated to building a top DEX and providing liquidity service on ZK-rollups and multi-chains. iZUMi Finance has two products, iZiSwap and LiquidBox. iZiSwap is the first and top on-chain Order Book DEX driven by iZUMi’s innovative DL-AMM and peer-to-pool Order Book design, providing CEX experience to DEX users on Layer 2. LiquidBox has raised and managed over $100M for concentrated liquidity DEXes and protocols, providing sustainable high APR with less impermanent loss.

Master the Swap: ONG to ONT via iZUMi Finance was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 16. May 2023

Entrust

SSL Review: April 2023


The Entrust monthly SSL review covers TLS/SSL discussions — recaps, news, trends, and opinions from the industry.

Entrust

Google’s 90 Day Proposal for TLS Certificates

Bulletproof TLS Newsletter #100

OpenSSL Cookbook Released Under CC BY-NC

TLS News & Notes

- Cloudflare discusses mTLS client certificate revocation vulnerability with TLS Session Resumption
- Announcing Certainly: Fastly's own TLS Certification Authority with 30-day certificates

S/MIME News & Notes

- Hashed Out and the 5 Reasons Email Encryption Should Be Next on Your To-Do List
- S/MIME Baseline Requirements Certificate Factory

The post SSL Review: April 2023 appeared first on Entrust Blog.


Gmail now offers a new checkmark for emails enabled with Verified Mark Certificates and BIMI


Entrust Verified Mark Certificates (VMCs) are digital certificates that enable organizations to create a more immersive and personalized email experience by displaying their registered trademark logo in the avatar slot alongside emails. This helps them elevate their brand and stand out in their prospects’ crowded inbox. Because VMCs are linked with DMARC technology, they also potentially reduce email spoofing.

Entrust started working on VMCs in June 2017. Entrust is proud to have issued the world’s first VMC (in Sept 2019), performed a pilot with Gmail (in July 2020), and issued the world’s first Government Mark Certificate (in March 2023). Entrust has also partnered with Red Sift to offer organizations an integrated BIMI solution to enhance their email security and boost their brand recognition in the inbox.

Now, Gmail has included an additional benefit to email senders and users in the Gmail desktop web browser UI – a new checkmark icon that proactively indicates the sender of the email has verified that it owns the domain and the logo displayed in the avatar slot. This new checkmark is only available when DMARC and VMC are both implemented.
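The "DMARC plus VMC" requirement shows up concretely in DNS. The records below are purely illustrative (example.com and the URLs are placeholders): a DMARC policy at enforcement, plus a BIMI assertion record whose a= tag points to the Verified Mark Certificate:

```
; DMARC policy at enforcement (p=quarantine or p=reject)
_dmarc.example.com.        IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"

; BIMI assertion record: l= points to the SVG logo, a= to the VMC
default._bimi.example.com. IN TXT "v=BIMI1; l=https://example.com/brand/logo.svg; a=https://example.com/brand/vmc.pem"
```

Mailbox providers that support BIMI look up the record at default._bimi under the sending domain after the message passes DMARC.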

This first image shows how the new checkmark icon will appear in the Gmail application “detail message” view.  The checkmark is shown in white against a blue badge, to the right of the VMC logo and the sender’s display name:

Additionally, there’s a new tool tip that displays the following confirmation text when a user hovers over the checkmark.

As you see, the tool tip message confirms that the email sender has verified that it owns its email domain and logo as specified in the VMC.

Gmail is actively rolling out this VMC checkmark icon now, so all desktop web browser Gmail users should see these new icons soon. Gmail has not yet announced how this might appear in mobile environments but Entrust expects there will be coordination of Gmail UIs across their other environments.

This checkmark icon is a good visual cue for recipients to identify verified email senders. Entrust strongly supports this new VMC enhancement and believes it will be embraced by email senders and users.

If you’re interested in getting a VMC to take advantage of this new checkmark icon in Gmail, contact us at sales@entrust.com or visit us at www.entrust.com/vmc.

The post Gmail now offers a new checkmark for emails enabled with Verified Mark Certificates and BIMI appeared first on Entrust Blog.


1Kosmos BlockID

Knowledge-Based Authentication (KBA) Explained


Knowledge-Based Authentication (KBA) is a security measure used to verify a person’s identity by asking them to provide specific information that only they should know.
How Does Knowledge-Based Authentication Work?
The idea is that these questions act as a barrier to other individuals, who should not know the user’s private information, while the user can answer from immediate knowledge without having to memorize anything.

KBA questions can be generated in two primary approaches: static and dynamic.

Static KBA
Static approaches involve selecting a set of predetermined questions that users can choose from when setting up their KBA. These questions are often personal, referring to highly personalized experiences or preferences that would be difficult for an attacker to guess without directly knowing the user.

Examples of static KBA questions include:

- What is your mother’s maiden name?
- What was the name of your first pet?
- In what city were you born?
- What is your favorite book?

These questions are typically selected during user onboarding. Later, when the user accesses the account, they are asked to provide the answers to those same questions.
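The enroll-then-verify flow above can be sketched as follows. This is a minimal illustration with function names of my own (not any vendor's API): answers are normalized so trivial case or spacing differences still match, and only salted hashes are stored, so a leaked answer table does not directly expose the answers.

```python
import hashlib
import hmac
import os
import unicodedata

def _normalize(answer):
    # Case-fold and trim so "  Fluffy " and "fluffy" compare equal.
    return unicodedata.normalize("NFKC", answer).strip().casefold()

def enroll(answer):
    """Store a KBA answer as a salted PBKDF2 hash, never in plaintext."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", _normalize(answer).encode(), salt, 100_000)
    return salt, digest

def verify(answer, salt, stored_digest):
    """Re-derive the hash from the candidate answer and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", _normalize(answer).encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_digest)
```

At onboarding the service calls enroll() for each chosen question and stores the salt and digest; at login it calls verify() against the stored pair.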

Dynamic KBA
Dynamic approaches, as a form of adaptive authentication, will generate questions based on information from various data sources tied to the user, such as public records, credit reports, or social media profiles. The system uses algorithms to select relevant, personalized questions only the user should know. Examples of dynamic KBA questions include:

- Which of the following addresses have you lived at in the past?
- What was the name of your elementary school?
- Which of these phone numbers have you previously used?
- What bank/financial institution/auto lender do you have a loan/bank account/credit account with?

The generation process for dynamic KBA questions typically involves data aggregation and analysis to create a set of potential questions.
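As a rough sketch of that assembly step (function and field names are hypothetical, and the sketch assumes the correct answer and a pool of plausible decoys have already been pulled from the user's records), a multiple-choice question can be built like this:

```python
import random

def build_question(prompt, correct, decoy_pool, n_choices=4, rng=None):
    """Assemble one multiple-choice dynamic-KBA question.

    `correct` comes from the user's own records (e.g., a past address);
    `decoy_pool` holds plausible but wrong options drawn from other data.
    """
    rng = rng or random.Random()
    decoys = rng.sample([d for d in decoy_pool if d != correct], n_choices - 1)
    choices = decoys + [correct]
    rng.shuffle(choices)  # so the correct answer's position is unpredictable
    return {"prompt": prompt, "choices": choices, "answer": correct}
```

A real system would generate several such questions from independent data sources and require most of them to be answered correctly.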

Both static and dynamic KBA have their pros and cons, with static KBA being more straightforward to implement but potentially less secure due to the limited set of questions. Dynamic KBA is more secure since it generates personalized questions based on the user’s unique background. Still, it requires access to reliable data sources and can be more complex to implement.

How Effective Is Knowledge-Based Authentication?

KBA can provide a reasonable level of security in cases where it is used with other authentication methods. As a standalone method, however, it has some limitations.
Some of the critical issues with KBA include:

- Guessable Answers: Answers to static KBA questions may be easy to guess or research using publicly available information, such as social media profiles, online directories, or leaked data from breaches. This makes KBA more susceptible to attacks by determined adversaries.
- Data Breaches: If an organization suffers a data breach, the answers to KBA questions may be compromised, rendering the KBA process ineffective for affected users.
- Knowledge and Memory: Users might forget their answers, leading to account lockouts and frustration. Additionally, users might choose common answers that are easier to guess.
- Social Engineering: Attackers can use techniques like phishing or pretexting to trick users into revealing the answers to their KBA questions.

Due to these limitations, KBA is generally considered less secure than other authentication methods like multi-factor authentication (MFA) or biometrics. While it can still serve as an additional layer of security, organizations are increasingly adopting more advanced and secure authentication methods to protect user accounts and sensitive information better.

Is Knowledge-Based Authentication Used in Multi-Factor Authentication?

MFA requires users to present multiple factors to verify their identity before granting access to a system, application, or resource.

These factors are typically classified into three categories:

- Knowledge: This factor includes information that only the user should know, such as passwords, personal identification numbers (PINs), or answers to knowledge-based authentication (KBA) questions.
- Possession: This factor involves something the user physically possesses, such as a hardware token, a one-time password (OTP) sent to a registered mobile device, or a software token generated by an authenticator app.
- Inherence: This factor refers to biometric characteristics unique to the user, such as fingerprints, facial recognition, voice recognition, or iris/retina scans.

Knowledge-based authentication fits into the knowledge category. Broadly speaking, KBA is not a replacement for passwords or PINs; rather, it can provide another easy-to-implement layer on top of MFA.

So, for example, if a user attempts to access their credit report, the provider can ask for a password and a one-time password sent via email. Then, when the user logs in, the provider can use dynamic KBA to ask questions related to the credit report, adding another layer of security to an incredibly sensitive document.

What Are Some Alternatives to Knowledge-Based Authentication?

Several alternatives to Knowledge-Based Authentication (KBA) provide more secure and reliable methods for verifying a user’s identity. Some popular alternatives include:

- Token-Based Authentication: One-time passwords (OTPs) are unique, time-sensitive codes generated by a dedicated hardware token, mobile device, or software application. The user must enter the OTP in addition to their standard password to authenticate their identity. Since OTPs expire after a short period or upon use, they offer a more secure alternative to KBA.
- Device Authentication: Verification systems can use push notifications or apps to let users authenticate via their device, typically a mobile device. While these devices can be compromised, they at minimum greatly reduce the attack surface that could impact those users.
- Biometrics: Biometric authentication relies on unique physical or behavioral characteristics, such as fingerprints, behavioral biometrics, or facial scans, to verify a user’s identity. Biometric authentication is more secure than KBA because it is based on features unique to the individual, making it difficult for attackers to spoof.
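To make the OTP alternative concrete, here is a minimal sketch of time-based OTP generation in the style of RFC 6238 (TOTP), using only the Python standard library; it is illustrative, not any vendor's implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Generate a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    t = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(t // step))           # 8-byte big-endian time step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and the user's authenticator app share the Base32 secret; both compute the code for the current 30-second window, and the server simply compares the two.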

These alternatives offer varying levels of security, usability, and implementation complexity, depending on the specific use case and the organization’s requirements. Many organizations opt for a combination of these methods to provide a more robust and secure authentication process.

Ground Your Security in Identity-Based Authentication with 1Kosmos

Knowledge-based authentication is a useful companion to other forms of identity verification, but only a small part of a complete approach. Implementing such solutions can add layers of security, but they don’t replace solid techniques like identity verification, biometrics, and passwordless authentication.

1Kosmos provides these critical fundamentals through the following features:

- Identity-Based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.
- Cloud-Native Architecture: Flexible and scalable cloud architecture makes it simple to build applications using our standard API and SDK.
- Identity Proofing: BlockID verifies identity anywhere, anytime, and on any device with over 99% accuracy.
- Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.
- Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, and encrypted digital identities are accessible only by the user. The distributed properties ensure there are no databases to breach or honeypots for hackers to target.
- Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.
- SIM Binding: The BlockID application uses SMS verification, identity proofing, and SIM card authentication to create strong, robust, and secure device authentication from any employee’s phone.

Sign up for our newsletter to learn more about how BlockID can support real security and help mitigate phishing attacks. Also, read our whitepaper on how to Go Beyond Passwordless Solutions.

The post Knowledge-Based Authentication (KBA) Explained appeared first on 1Kosmos.


Fission

Fission's Origin Story


This is the story of how we built an open source company that specializes in developing the identity, data, and compute protocols for the future of the Internet.

Poutine Meetup in Berlin

Fission’s co-founder and CEO, Boris Mann, has been on the Internet for a long time. As one of the first 1,000 bloggers in the world, Boris loved being able to publish content online but recognized the challenges to making self-publishing possible for everyone. Running a server under your desk at home, installing software on the command line, and generally having the necessary technical know-how is neither realistic nor equitable.

Discovering the Drupal open source content management system and its worldwide network of contributors was an eye-opener: here was software and, more importantly, a community working on making it possible for anyone to publish online. Boris became an active contributor, helping to organize events that grew from 40 people to thousands of Drupalcon attendees, and helping to form the Drupal Association.

At the same time, large proprietary software companies, like Microsoft, were working hard to retain their market position. They viewed open source software as a dangerous competitor and adopted internal policies to thwart the advancement of FOSS.

Boris continues to advocate for and share open source code, data, and community principles: building better together, so that people can have more control and agency over technology for better communication and collaboration, online and off.

Fission's co-founder and CTO, Brooklyn Zelenka, started studying music composition at the University of Calgary and doing graphic design projects on the side. She picked up Photoshop and Illustrator and eventually dropped out of music school and moved to Vancouver. About five years later, while doing graphic design professionally at a friend's startup, she was given a book on HTML, CSS, and JavaScript and asked if she could learn a bit of web development. By the end of the year, Brooklyn had taught herself over 20 programming languages.

Brooklyn became deeply involved in the local meetup scene, running multiple communities across many topics. She became involved in international functional programming and distributed systems conferences after she ported the Haskell standard library to Elixir. (It later turned out that functional data structures are a great fit for IPLD.)

Eventually, Brooklyn decided she wanted to return to startups, and was hired by a company based out of San Francisco. She and her Vancouver co-worker rented a desk from Boris's co-working space, which is how they met!

Brooklyn went on to do consulting work for Facebook, Uber, Kickstarter, and many others. After writing the same boilerplate for the nth time and fighting with k8s, she started to feel that there must be a better way. She began early designs of things that would eventually find their way into Fission.

After a couple of years away travelling as a digital nomad, she returned to Vancouver in 2017. She joined Boris as principal engineer at a company working on Ethereum-based, legally compliant infrastructure. Brooklyn then focused on writing a formally verifiable smart contract language that was also human-readable (so that a lawyer could review the code).

April 2019, RUN EVM event in Berlin

Excited by the potential of a backend as a service, aka the "world computer" of Ethereum, Boris and Brooklyn together founded a company called SPADE ("Special Projects and Decentralized Engineering") and received an important grant from the ConsenSys Tachyon program to work on Ethereum Status Codes, which were later renamed FISSION Codes. They were one of only two companies to receive an open-source grant (the other being WalletConnect).

While building FISSION Codes, both Boris and Brooklyn started to recognize that the real challenge for both Web2 and Web3 was the increasing complexity of putting together full stack applications – plus the devops needs of running such apps. Except now, these problems were compounded by complex security needs and a more fervent demand for user control and agency. Users had been burned by Web2 exploitation and were becoming more discerning and knowledgeable about how their privacy was being compromised.

While the blockchain, decentralized web, and p2p components of Web3 were new and everyone was still learning, all the same challenges remained: developers still needed the increasingly complex skills of full stack app development and deployment.

After spending time on core Ethereum development & community building, our cofounders asked themselves:

What if some of the emerging concepts of Web3 (advances in distributed systems, cryptography, and self-verifiable data structures) could make it dramatically easier for front-end developers to build apps, without having to become full stack developers and devops experts?

And so Fission the company was born: building an edge stack with identity, data, and compute components that enables front-end developers to build local-first apps while respecting user privacy and agency.

The 2019 Diffusion Hackathon grand winners, Marco and Eugene, were gifted a 4L jug of maple syrup for using the first version of the Fission service in their winning entry

In a Web2 stack, to put an app online, you need to run a server to hold user data. All of this data is visible to the people that run the service, and the incentives are typically to lock the users' data into the service. Never mind scaling and devops server management.

In prototyping, front end developers use dummy data in the browser without having to worry about setting up a server.

What if that much simpler prototyping experience could also run on production data? It would make development much easier for developers, and data wouldn't have to leave the device. By default, it would respect user agency.

Fission (as it is now known) became a member of the Protocol Labs Network and started to build on top of the IPFS protocol. This direction would lead to the inventions of UCAN (decentralized user auth) and WNFS (user file system on top of IPFS with encrypted private data).

Today our front end developer SDK has evolved its storage solution into an overall data solution. We are working on a far-edge database called Rhizome to be added to the mix. We continue to iterate and collaborate with others on UCAN and other decentralized identity tools and have done an initial release of the final piece of our edge stack: content-addressed computation on IPFS with the InterPlanetary Virtual Machine (IPVM) specification.

Our commitment to community building and open-source development is unwavering, and we look forward to continuing to grow and collaborate within this robust and exciting ecosystem.


Caribou Digital

Persons Living With Disabilities Self-Directing Their Own Stories


This is a guest blog by Abu Majid, Storyteller at Story x Design

As a multimedia producer at Nairobi-based Story x Design, I’ve had the pleasure of collaborating on four Caribou Digital research projects since October 2020. When Caribou Digital reached out in February 2022 asking to expand our participatory video storytelling methodology to meet the needs of digital platform workers living with disabilities, we had the opportunity to rethink our processes, from training, to filming, to the accessibility of our final edits.

An earlier blog by Story x Design Director Miranda Grant outlines our original participatory storytelling methods and offers tips for those interested in this filmmaking process. This blog reflects on our experience of collaborating with persons living with disabilities to create short documentaries about their lives and how we adapted our methods to fit their needs. We recount our design process, learnings, successes, and the contingencies we used to address the barriers we encountered.

This research, co-led by InAble and Caribou Digital and in partnership with Mastercard Foundation, aims to understand the experiences and challenges of young people living with disabilities and engaging on platforms for work. With the support of InAble and Caribou Digital, we designed open-ended questionnaires to elicit authentic reflections on the challenges faced and strategies employed by platform workers living with disabilities. Our agile approach allowed us to stay nimble, learn from participants, and adjust as required throughout the process.

Three key adjustments to our process

1. Adapting the project design to ensure access for all. In order to make our collaboration with persons living with disabilities beneficial to them, we tweaked some of our supporting tools (i.e., film practice manuals, questionnaires, film prompts) for ease of use by all participants.

2. Working with interpreters and aides to accommodate multiple abilities. With the great support of a sign language interpreter and individual aides, we were able to help each participant identify and express the stories most important to them through a mix of in-person and one-on-one narrative and technical skills training. This hybrid mentoring approach helped participants gain confidence in their ability to self-direct their own stories.

3. Diversifying post-production. Our versatile post-production, leveraging WhatsApp video and voice calls, voice notes, and regular carrier calls, but also, importantly, in-person training, enabled us to stay in close touch with the participants who faced different types of challenges creating their content.

Adapting the project design

1. Adapting resources

Our participatory self-shot video diaries approach was developed in 2020 as a countermeasure to the COVID-19 lockdown. The framework requires the deployment of equipment and intensive multimedia skills training to all participants, where success depends on participants’ comprehension and willingness to contribute on their own time. So this process can be time and resource intensive. However, even after the COVID-19 lockdown was lifted, we decided to carry on with the self-shot video approach as participants expressed their eagerness to tell their own stories and willingness to take up the role of self-directing.

As per our usual process, we distributed a small smartphone production kit to each participant, but made a few adjustments to accommodate those who took part.

The phones were preloaded with:

- Six instructional videos on the downloadable VLC phone application. These instructional videos were embedded with a voiceover (VO) feature that defined and detailed basic film terms and techniques. The VO not only navigated participants with visual impairment through the content but also offered relatable examples for all to try. See a sample Story x Design instructional video here.
- A readable softcopy instructional manual for the native TalkBack assistive application, for both the visually and hearing impaired.
- An audio instructional manual for the visually impaired participants and their aides.
- Hardcopy instructional manuals for all.
- A softcopy questionnaire and prompt document readable with TalkBack apps for all participants.

2. Working with aides during skills training and filming

We gathered with five of the six participants and their aides in a conference room in Thika Town, 44 kilometers from Nairobi City, a central location for all in attendance.

Together with participants and aides, we explored the self-shot video diaries format, examining proper camera orientation, sound capture, lighting, framing, and many more good film practices. Ongoing virtual mentorship provided participants with the support necessary to overcome technical and creative challenges over the course of production.

The close collaboration with interpreters and aides was key to enabling transparent communication with participants, in particular about the use of participants’ images and published videos. With clearly drafted agreement and image release forms, participants, with their aides as witnesses, read, understood, and, once satisfied, signed on to contributing to the project.

Participants and aides were compensated for their work on this project. Participants received ongoing payments to purchase internet bundles for uploading large media files throughout production, as well as a one-time payment on project completion. Aides were provided with per diems for their efforts in supporting filming.

3. Diversifying post-production

A key addition to this project was Monica Onyango, a sign language interpreter who helped translate footage submitted by the participants with hearing impairment. This enabled the post-production team to transcribe and edit subtitles to build a story that honored the contributors.

In one case where one of the visually impaired participants could not manage to find help in filming and sending more footage, we successfully sought her consent to use her voice notes as voice overs to missing segments in her story. This was in line with our desire to create a safe, accommodating space that offered an equal platform for expression.

Challenges and contingencies

We recognized that there would be challenges that we had not faced before. Although the six participants all come from one country and one time zone, and had experience navigating digital platforms, none had experience self-directing for video or taking part in a collaborative multimedia project of this scale.

Beyond the usual challenges in supporting the production of self-directed videos, here are some specific challenges people with disabilities faced.

- The need for in-person support. In addition to an initial in-person training, participants required one-on-one mentorship and refresher sessions throughout the duration of the project. When internet connections failed, team members needed to visit participants with visual impairment in person for troubleshooting and automatic upload restoration. Similarly, when capturing B-roll proved too challenging for participants, we traveled to support, while taking great care to follow participants’ self-direction and maintain the participatory project design.
- The availability of aides. Even with per diems on offer, most aides were unavailable on occasion, particularly because most aides selected by participants were fellow students, and others had full-time jobs. Consequently, production was often halted as the visually impaired participants waited for help.
- The authenticity of participants’ stories. We wanted to highlight participants’ authentic techniques of navigating the digital world to find work and understand their challenges. However, we refrained from asking questions beyond what participants revealed, allowing them to authentically share their successes and obstacles, instead of only emphasizing their disabilities.

Self-determination

The most gratifying measure of achievement for all of us is the fact that the expressions of all participants were not dictated, directed, or curtailed. At each stage of the project, participants had full control of their stories and authority on what to feature in their final videos.

All contributors stated that they felt liberated and supported throughout the project, which made even the most difficult moments feel like blips in the project’s timeline.

To those looking to produce user-generated content in collaboration with people from all walks of life, we hope these learnings serve you well. There are many measures to evaluate a film, documentary, or piece of text, but no number can ever be placed on the impact of people feeling seen, listened to, and moved.

Watch The Youth With Disabilities Digital Platform Experience videos here.

Persons Living With Disabilities Self-Directing Their Own Stories was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Radiant Logic

How to Monitor Cybersecurity Threats with Identity Analytics

Learn about using Identity Analytics as a preventative solution for monitoring cybersecurity threats, and how to automate access-related IT checks with prescriptive analytics. The post How to Monitor Cybersecurity Threats with Identity Analytics appeared first on Radiant Logic.

Entrust

The State of Post Quantum Preparedness, from an Analyst Perspective


As part of my hosting duties on the Entrust Engage podcast, I’ve had the pleasure of speaking to some outstanding guests on a variety of topics from the science behind quantum computers themselves, to impacts post-quantum will have on digital security. In the latest episode, I was pleased to get a perspective I hadn’t yet – that of an analyst – when I was joined by guest speaker, Forrester Principal Analyst Sandy Carielli. As we set up the discussion for the episode, here are some of the insights she provided on the state of post-quantum preparedness:

Samantha Mabey: Where do you currently feel like organizations are with looking at PQ and kicking off preparations to prepare against and mitigate the threat?

Sandy Carielli: Organizations are in the early stages of PQ preparation. The financial services and public sectors are the farthest ahead in this area, not surprising given the sensitivity of the data that traverses their systems. The Quantum Computing Cybersecurity Practices Act signed by President Biden at the end of last year speaks to the attention that government is placing on PQ, and the recently unveiled National Cybersecurity Strategy also stresses the importance of planning for the transition to post-quantum cryptography. However, even those industries paying close attention to PQ are at the early stages – aside from a few pilots and proofs of concept, organizations are primarily at the planning stage, with some kicking off cryptographic inventories.

SM: What do you foresee being some of the greatest challenges that organizations will face in preparing for the migration to post-quantum cryptography?

SC: Cryptographic migration is never easy – previous migrations, such as from SHA-1 to SHA-256, have taken years. Even increasing the key size, such as moving from 1024 to 2048-bit RSA, doesn’t happen overnight. The migration from RSA or ECC to a post-quantum algorithm will be even more complicated – given how deeply embedded cryptographic functions are in code and devices, rip and replace is rarely simple. For software and systems that organizations develop themselves, development teams will need to replace existing cryptographic code with new libraries, but standard implementations of the NIST selected algorithms aren’t widespread. Then there’s the supply chain issue – organizations will rely on their partners and vendors to update cryptographic implementations in their own products before the organization can fully migrate to PQ.

SM: While there is a consensus that the threat (of a quantum computer being able to break traditional public key cryptography in use today) is possibly a decade away, there is a more immediate threat known as “harvest now, decrypt later”. Do you feel like there is a lack of awareness of this threat and that it further justifies the need to prepare for PQ now?

SC: The “harvest now, decrypt later” threat is understood in pockets, such as government and financial services, and these are the areas where that threat is critical – customers’ bank account numbers and citizens’ government identification numbers are not likely to change in ten or twenty years. This is why these sectors have started to prepare and must continue to do so – they realize that they will need to have migrated to PQ long before a quantum computer is able to break traditional public key cryptography, and that attackers won’t be able to decrypt any PQ-encrypted data that they harvest. Outside of that, security leaders are not as aware of the “harvest now, decrypt later” threat – leaders must realize that any harvested data protected with RSA or ECC could be vulnerable later, including account information, intellectual property, and personal information (which an attacker could use as blackmail material). Broader awareness of the “harvest now, decrypt later” threat would help organizations support PQ preparation strategies.

To listen to the full Entrust Engage Episode “The Road to PQ Preparedness: The Analysts Perspective”, click here.

The post The State of Post Quantum Preparedness, from an Analyst Perspective appeared first on Entrust Blog.


Forgerock Blog

ForgeRock’s IDLive 2023: Join Us for Customer Stories, Identity Innovations and Training that Deliver Exceptional Security and User Experiences


I'm beyond excited to host IDLive 2023, our 11th annual user conference in May! And this year promises to be among our best! We'll hear from global customers who are using ForgeRock to outcompete with identity; partners sharing how they are helping customers to make this happen; and ForgeRock experts revealing next generation identity innovation, best practices and education to help users optimize our powerful technology platform.

In that spirit, we look forward to bringing customers on stage during IDLive to learn from their successes and innovations across a variety of industries. You'll hear from leaders in financial services, telco, healthcare, education, and more.

I am also extremely excited to be launching a new partner program to provide a comprehensive set of benefits that will drive growth and help ForgeRock partners deliver remarkable customer experiences! The program will take our existing program and scale it to new heights, allowing ForgeRock's partners to evolve and expand their identity capabilities into areas like governance and passwordless for both CIAM and workforce use cases.

All of ForgeRock's innovations and programs are designed specifically for customer and partner needs - solutions and programs to help them grow their business, transform the way they work and maintain simple and safe digital experiences that engage their users online. Highlighted below are a few of the updates we are announcing at IDLive 2023 and a preview of what you can expect to hear from our featured customer presentations.

Our product strategy continues to adhere to our mandate of "serving the needs of the large enterprise." We'll be sharing advancements we're making across our comprehensive platform in areas like passwordless, registration and governance, and providing our view into how decentralized identity will shape the future of online experiences for everyone.

We know these ongoing improvements to our identity platform, alongside the stories you're going to hear from our impressive lineup of customers, are going to help everyone accelerate their digital identity strategies this year. Don't miss insights from:

- JPMorgan Chase & Co. Managing Director, Head of Payments Engineering, Architecture and APIs Jack Gibson, FedEx SVP Anthony Norris, and Navy Federal Credit Union AVP Clint Hardison, who will join our CEO Fran Rosch on stage to discuss industry trends and challenges.
- People Corporation VP of Enterprise Information Security Colin McDonald and Director, Enterprise Architect Julious Nelson, who will discuss how ForgeRock Identity Cloud is helping the organization establish one identity platform after many acquisitions in order to deliver optimal service to millions of customers.
- Rolf Hausammann, Head of Identity and Access Management at Swisscom AG, who will discuss how one of the world's largest telco providers is leveraging identity to reduce costs and improve service to employees, customers and partners.
- Hasan Jafri, Vice President Platform Engineering & Common Services, and Phil Beauchamp, Director of Identity Platforms, both of TELUS, who will share how the power of ForgeRock Identity Cloud is helping the organization establish one identity platform for dozens of brands to better serve its 18 million customers.
- Nasim Hasari, Director of Identity and Access Management at York University, who will reveal how the institution is using identity to improve service to workers and customers using ForgeRock's low-code, no-code Intelligent Access, also called "trees," to customize user experiences.

I guarantee you'll leave knowing more about how to go passwordless, how to make compliance a breeze with our AI-powered governance solution, and how to get a head start on building decentralized identity into your digital strategy.

In addition to this packed agenda of identity inspirations and innovations, we are also planning some fun activities to give attendees a little taste of the incredible city of Austin, TX. Join us next week and register for IDLive today!


IDENTOS

IDENTOS updates digital front door enhancing usability for users and administrators


IDENTOS rolls out Navigator 3.0, a new release of its digital front door product featuring simplified navigation for users and improved content management for administrators

IDENTOS Inc., a leader in digital identity and access management, has launched the latest version of its Community Navigator, Navigator 3.0. This new Navigator includes key updates that improve the end-user experience and offer greater ease in managing content.  It enables organizations to launch modern user-centric digital front door services and integrate digital services seamlessly, including the ability to deliver a patient-centric digital health ecosystem.

The Community Navigator: more than just a digital front door.

Navigator 3.0 (Nav 3.0) builds on the company’s previous Community Navigator version (Nav 2.0), a configurable digital front door, which leverages IDENTOS’ FPX standards-based identity and consent APIs to safely share data across a set of trusted service partners. This facilitates a privacy-respecting, secure, and personalized user experience.

The Community Navigator:

-Enables digital service discovery: an easily configurable app to support simple integrations or a complex marketplace of partners for your users to discover.

-Is built for trusted onboarding and digital identity services: facilitates onboarding of one or multiple digital services without requiring an organization to overhaul their existing/internal workflows.

-Allows for granular data sharing: IDENTOS’ FPX standards-based identity and consent APIs allow for the safe sharing of data across a set of trusted service partners.


Benefits/features of the updated Navigator

-A simplified end-user experience: enabled by a simplified navigation in the Navigator and its Wallet application through the introduction of user-friendly tabs and buttons – improving discoverability of content and integrated services.

-A modern web-based application that retains the benefits of a native iOS and Android application: facilitated by a web-based interface using open, pervasive technology (HTML and CSS) that resides in a native application shell, enabling IDENTOS to offer the benefits of both native and web apps. This allows for instantaneous content updates without needing to re-submit to app stores.

-Increased flexibility around adding, updating or removing content for administrators: resulting in improved efficiency surrounding content management.

These significant enhancements will benefit various institutions, including government, financial and health service providers. This also includes health teams and healthcare systems that operate with several EMRs/EHRs, ACOs, and entities accountable for ensuring seamless care delivery with a collaborative approach to healthcare.

Availability
IDENTOS’ Navigator 3.0 updates are available today to new and existing customers.

Learn more about Navigator 3.0 in our upcoming webinar and demo: 
Register here.

The post IDENTOS updates digital front door enhancing usability for users and administrators appeared first on Identos.


KuppingerCole

Delivering Business Value through Orchestration


by Phillip Messerschmidt

In an ever-changing digital world, enterprises and vendors face new digital challenges. This is driving the adoption of new solution strategies in which delivering satisfactory modern identity services is key. As a result, the market is shifting to a more agile and modular approach that leverages orchestration to integrate solutions, even in complex architectures.

Ocean Protocol

Introducing Ocean Templates: Tools to build Next-Generation Web3 dApps


At Ocean Protocol we believe in democratising access to data and making it accessible for everyone. Today we’re excited to announce the launch of Templates, a new open-source set of tools that make building decentralised applications (dApps) easier than ever before.

Ocean Templates are designed to be flexible and easy to use, allowing developers to build diverse sets of Web3 dApps without needing to have prior smart contract programming knowledge.

By leveraging the core functionalities of Ocean’s smart contracts, such as data NFTs to represent IP ownership, datatokens to define the rules of IP access, and compute-to-data to orchestrate privacy-preserving access to data, users can now build next-generation marketplaces and token-gated applications.
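The gating pattern described above can be modeled in miniature. The sketch below is a simplified, hypothetical Python model of datatoken-gated access, not Ocean Protocol's actual contracts or client API; the class and method names are invented for illustration, and on-chain the access check would be an ERC-20 balance lookup against the datatoken contract.

```python
# Illustrative model of datatoken-gated access (hypothetical names, not
# the real Ocean Protocol API). A data NFT represents ownership of an
# asset; datatokens encode the rules of access to it.

class DataAsset:
    def __init__(self, owner: str, required_tokens: int = 1):
        self.owner = owner                  # data NFT holder (IP owner)
        self.required_tokens = required_tokens
        self.balances: dict[str, int] = {}  # datatoken balances per address

    def issue_datatokens(self, address: str, amount: int) -> None:
        """Owner distributes datatokens granting access to the asset."""
        self.balances[address] = self.balances.get(address, 0) + amount

    def has_access(self, address: str) -> bool:
        """Token-gated check; on-chain this would be an ERC-20 balanceOf."""
        return self.balances.get(address, 0) >= self.required_tokens

asset = DataAsset(owner="alice")
asset.issue_datatokens("bob", 1)
```

Here `has_access` stands in for the gatekeeping a token-gated dApp would perform before serving content or dispatching a compute-to-data job.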

These templates are designed to work seamlessly with large-scale data sets, making it easier to create powerful and flexible applications that can be used in a variety of contexts. Developers can also customize these templates to suit their own requirements, or offer them as tokengated templates to the broader Web3 community — for free or at a cost.

The launch of Ocean Templates marks a significant step forward for the Ocean Protocol ecosystem. By providing advanced tools that make it easier for developers to build next-generation Web3 dApps, we’re making it simpler for everyone to participate in shaping a New Data Economy.

We’re excited to see what the community builds using Ocean Templates, and we’re committed to supporting the wider ecosystem as it continues to grow and evolve.

If you’re keen to get started with Ocean Templates, visit our website and check out the available templates at: oceanprotocol.com/templates

If you are looking to launch your own template, and monetize it to the broader Web3 community, reach out to: andrea@oceanprotocol.com

About Ocean Protocol

Ocean Protocol aims to level the playing field for data and AI by enabling people to privately & securely publish, exchange, and consume data assets.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord or track Ocean progress directly on GitHub.

Introducing Ocean Templates: Tools to build Next-Generation Web3 dApps was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

The Digital Euro: An Exploration of the International Aspects of CBDCs

The world of finance is rapidly evolving and at the heart of this transformation is the emergence of digital currencies. Central Bank Digital Currencies (CBDCs), in particular, have been a topic of increasing interest among global central banks. Coinfirm, as a leader in blockchain analytics and anti-money laundering (AML) solutions, closely follows these developments. Today, we delve into the latest updates from the European Central Bank (ECB) on their ongoing project – the digital euro.

The Digital Euro: Prioritizing the Euro Area

According to a recent presentation from the ECB, the primary focus of the digital euro project is to serve the needs of the euro area. The ECB plans to gradually introduce different use cases for the digital euro, aiming to make it accessible to everyone within the euro area before expanding to users outside this zone. This phased approach is designed to prevent the project from becoming overly complex while ensuring it meets the needs of its primary users.

Exploring International Opportunities

The ECB recognizes the significant potential of an international dimension for the digital euro. It aims to enable access to and use of the digital euro outside the euro area without requiring currency conversion. Furthermore, the bank acknowledges the need for interoperability to facilitate cross-currency payments. The ECB’s foresight in considering these broader opportunities is a promising indicator of the digital euro’s future utility.

A Rising Tide of CBDC Projects

The ECB isn’t alone in this digital currency exploration. In fact, a recent report by the Bank for International Settlements (BIS) shows a substantial rise in CBDC projects around the world. The ECB is actively supporting international CBDC initiatives, collaborating with the BIS, G7, and International Monetary Fund (IMF) among others. This collective effort by central banks worldwide underscores the significance of CBDCs in the future of finance.

The International Use Cases of the Digital Euro

When it comes to the use of the digital euro outside the euro area, the ECB is considering several scenarios. These include cross-border payments in digital euro, payments in digital euro outside the euro area, and use by visitors. The rules for geographical access will be determined by legislation, and if permitted, access could be extended to consumers and merchants in the European Economic Area and selected non-EU countries.

Interoperability of Digital Euro with Other CBDCs

One of the key factors in the success of the digital euro will be its interoperability with other CBDCs. The ECB has proposed two models to facilitate this: the interlinking model, which allows transactions without participation in each system, and the single system model, which involves a single system across multiple currencies.

Wholesale Central Bank Money Settlement: A New Initiative

The ECB is looking to contribute to digital innovation while ensuring that central bank money remains a monetary anchor. To this end, it has announced a new initiative on wholesale central bank money settlement. A new Eurosystem task force has been established to explore new technologies that could facilitate interaction between TARGET services and distributed ledger technology (DLT) platforms.

Final Thoughts

As an organization deeply entrenched in the blockchain and digital currency space, Coinfirm recognizes the potential of CBDCs to revolutionize financial systems globally. We are excited to see how the ECB’s exploration of the digital euro progresses and will continue to provide our expertise and solutions in AML and blockchain analytics to support this evolution.

Presentation: International aspects of CBDCs: update on digital euro

(source: ECB)


Stay tuned to our blog for more updates on CBDCs and other developments in the blockchain and cryptocurrency space!

The post The Digital Euro: An Exploration of the International Aspects of CBDCs appeared first on Coinfirm.


UbiSecure

Identity Platform 2023.1 is released today, including SSO 9.2 and CustomerID 6.2


During this development cycle we have focused our lens on Sweden in particular, specifically with the continued support for Swedish BankID. Additionally, we have performed expected security and life-cycle management development. For full details, please head over to review the Release Notes.

Features Swedish BankID

Throughout the Nordics, the ability to strongly identify oneself is considered to be second nature. The countries have invested substantial time over the last twenty years ensuring that each citizen and resident can perform many of their day-to-day tasks online. To carry this out, one needs to have a strongly authenticated digital identification method.

Within Sweden, this is predominantly BankID. Since our original support of BankID via CIBA (Client Initiated Backchannel Authentication) back in the summer of 2018, things have changed. We have reviewed the BankID Relying Party Guidelines (v3.7) and developed an authentication adaptor to enable you to directly configure BankID within the Identity Platform. This means both the two-device flow and the more commonly used single-device flow are supported. Now your users can strongly authenticate to your web page or mobile app directly on their mobile device.
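As a rough illustration of the single-device flow's hand-off step, the sketch below builds the app-launch URL from an autostart token. This assumes the `bankid:///?autostarttoken=...&redirect=...` scheme described in the BankID Relying Party Guidelines; the token itself would normally come from the RP API's auth call, which is stubbed here with a fixed value.

```python
# Sketch: building the BankID app-launch URL for the single-device flow.
# Assumes the bankid:/// URL scheme from the Relying Party Guidelines;
# a real autostart token is returned by the RP API when the flow starts.
from urllib.parse import quote

def bankid_launch_url(auto_start_token: str, redirect: str = "null") -> str:
    """Return the URL that hands the user over to the BankID app."""
    return (
        "bankid:///?autostarttoken=" + quote(auto_start_token, safe="")
        + "&redirect=" + quote(redirect, safe="")
    )

# Stubbed token for illustration only.
url = bankid_launch_url("7c40b5c9-fa74-49cf-b98c-bfe651f9a7c6")
```

On the same device, opening this URL launches the BankID app; the relying party then polls the RP API until the user completes authentication.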

Focus Sweden

Full support for BankID coupled with our existing support for Freja eID and Svipe means that you can empower your users to authenticate with the most common services in Sweden. It’s well known that Ubisecure has a long history of supporting the Finnish population with a full suite of authentication options, from banking (TUPAS, which is now the FTN), through social logins (O365, Google and Apple), as well as specialised authenticators.  Our latest release demonstrates our commitment towards Sweden, and we continue to develop the Identity Platform with a pan-European focus. We will be carrying out additional work over the next year to develop animated QR codes, required in 2024 by BankID. If there is an authenticator you need, that we are not yet offering, please contact Support or your account team and let us know. We work alongside our customers to ensure our Identity Platform evolves in line with your needs and expectations.

Improved Security

Ongoing security maintenance is, of course, paramount. As we maintain the Identity Platform, we continue to improve both its functionality and security. During this release cycle, we have focused on updating Wildfly (to Wildfly 26). We have also updated OpenLDAP to 2.5.14, and Redis support to 6.2.8. Support of Linux Redhat 8 has also been tested. We always encourage you to upgrade to the latest possible release of our software as it provides the most secure and feature-rich Identity Platform possible. As always, you can find highlights of the changes on the Release Notes along with a link to the detailed example of the logging changes found on our Developer Portal.

Looking Forward

In the short term, we are working on improvements that will be available next month, when we release updates for both SSO (SSO 9.2.1) and CustomerID (CID 6.2.1). Over the summer, the maintenance cycle continues as usual. Into the autumn and winter, we will continue our research into risk-based authentication and other aspects of strong authentication.

Please have a listen on your favourite podcast platform to our Let’s talk about digital identity podcast. You will hear from a wide range of guests covering many digital identity hot topics, including possible development areas. If there are specific features you would like to see developed into the Identity Platform, you are very welcome to let our Operations team, or your account team, know.

As always, if you have any questions over this release or prior releases, just reach out. We are here to help.

The post Identity Platform 2023.1 is released today, including SSO 9.2 and CustomerID 6.2 appeared first on Ubisecure Customer Identity Management.


HYPR

An Intro to PSD2 SCA Requirements


It’s estimated that by 2024, 74% of fraudulent card transactions worldwide will involve card-not-present (CNP) transactions. The PSD2 regulatory framework is designed to protect customers and financial institutions operating in the digital payment ecosystem from fraud, especially CNP fraud.

What is PSD2 SCA?

The revised Payment Services Directive, more commonly known as PSD2, is a regulation that has been implemented in Europe, covering countries in the EEA, Monaco and the UK. Its main goals are to improve customer choice among payment providers, protect customer safety, and combat payment fraud. One key aspect of PSD2 is its obligation for all companies that provide payment services, including banks, financial institutions and other payment service providers (PSPs), to use Strong Customer Authentication (SCA).

The PSD2 SCA requirements oblige relevant companies to enhance customer security around payments by deploying multi-factor authentication (MFA) for relevant transactions. MFA means that the customer must verify their identity by providing two or more factors of authentication. The guidelines on PSD2 SCA also note that authentication must be performed in a manner that protects customer data. 
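The two-or-more-factors rule can be made concrete with a small check: the factors must come from different categories (knowledge, possession, inherence), so two passwords do not count. The sketch below is illustrative Python, not any regulator's reference implementation; the factor names are invented examples mapped onto the three categories defined in the PSD2 RTS.

```python
# Sketch of the SCA rule that verification requires factors from at
# least two *different* categories. Category names follow the PSD2 RTS;
# the factor names and the function itself are illustrative.

FACTOR_CATEGORIES = {
    "password": "knowledge",        # something the customer knows
    "pin": "knowledge",
    "registered_phone": "possession",  # something the customer has
    "hardware_token": "possession",
    "fingerprint": "inherence",     # something the customer is
    "face_scan": "inherence",
}

def satisfies_sca(presented_factors: list[str]) -> bool:
    """True if the presented factors span two or more categories."""
    categories = {FACTOR_CATEGORIES[f] for f in presented_factors}
    return len(categories) >= 2
```

Note that a password plus a PIN fails the check, since both are knowledge factors, while a password plus a fingerprint passes.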

Who Needs to Comply with PSD2 SCA? 

Who falls under the umbrella of PSD2 SCA compliance? Well, it's not just the traditional players in the banking industry. Thanks to PSD2, new players like Payment Initiation Service Providers (PISPs) and Account Information Service Providers (AISPs) have entered the scene. PISPs act as intermediaries between customers' banks (or where their money is) and sellers of goods or services. AISPs, upon user permission, consolidate all user financial information (such as deposits, loans, and direct debits) in one place. 

If your company falls into these categories, you need to adhere to the PSD2 SCA requirements when providing the following services:

- Payment initiation services (these inform a seller that the customer has proceeded through an online purchase)
- Account access (when a user, or a third-party service provider, asks their bank to inform them of their financial situation)
- Card-based instructions (when a customer asks their bank or PSP to pay someone from their account)

If the payment or information instruction is initiated and processed in the regulation’s area, it must follow PSD2 SCA guidelines. For US companies, this could apply, for example, if they have a European-based entity or are looking to provide the specific services mentioned to EU citizens.

Organizations are required to decline transactions that do not meet SCA requirements. There are exemptions for small transactions (less than €30), where the financial institution can prove it is a low-risk transaction or where the customer has whitelisted the service provider, such as for a direct debit of a phone or electricity bill.
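The exemption logic described above can be sketched as a simple decision function. This is an illustrative sketch, not the actual regulatory technical standards (which also involve cumulative limits and transaction counters); the `Transaction` fields here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount_eur: float
    low_risk: bool           # flagged low-risk by the PSP's transaction risk analysis
    payee_whitelisted: bool  # customer has whitelisted this payee (e.g. a direct debit)

def sca_required(tx: Transaction) -> bool:
    """Return True if the transaction must be declined unless SCA is performed."""
    if tx.amount_eur < 30:       # low-value exemption
        return False
    if tx.low_risk:              # low-risk transaction exemption
        return False
    if tx.payee_whitelisted:     # trusted-beneficiary exemption
        return False
    return True                  # otherwise, SCA is mandatory
```

For example, `sca_required(Transaction(250.0, False, False))` returns `True`, while a €10 purchase falls under the low-value exemption.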

How to Implement PSD2 SCA

To comply with the PSD2 SCA requirements, organizations must ensure that their customer authentication meets the regulatory technical standards (RTS) set out for PSD2 SCA. This involves implementing MFA for relevant transactions and adhering to other specifications detailed in the RTS. These include:

- Dynamic linking of remote payments (creating a unique code specifying payee and amount to reduce fraud) should be based on technologies such as digital signatures or cryptographically underpinned validity assertions using keys.
- Special advice should be given about the length or complexity of knowledge factors and the algorithms underpinning possession and inherence factors.
- The application of PSD2 SCA should strike a balance between enhanced security for relevant payments and user-friendliness and accessibility.
- Payments and PSD2 SCA elements must have separation. For example, if a smart device is used to make a payment and also counts as a factor of authentication, there must be a clear process to differentiate them.

Benefits of PSD2 SCA

Beyond regulatory compliance, PSD2 SCA brings other significant benefits to financial institutions and their customers, including improved customer security, fraud reduction, and a better user experience. 

Improved Customer Security

First and foremost, PSD2 SCA enhances customer security by raising the bar for authentication. With MFA in place, it becomes much harder for attackers to impersonate users and carry out fraudulent activities. While there are still issues around allowing knowledge factors (namely passwords) in MFA as they can be easily phished or brute forced, mandating MFA is a major step towards a more secure online world for consumers.

Reduced Fraud

Fraud committed through account takeover, which uses a victim’s card or bank details to purchase items, can create significant distrust between the customer and the firm involved, even if neither party was at fault. Reducing payment fraud fosters greater trust in online transactions, making customers more comfortable with making purchases and boosting revenue opportunities for service providers.

Better User Experience

The PSD2 SCA regulation recognizes the importance of maintaining a user-friendly process. Nobody wants frustrated customers abandoning their shopping carts due to cumbersome authentication requirements or forgotten passwords. The specification for being accessible and user-friendly means companies complying with PSD2 SCA should consider how their authentication stream impacts the user and how it can be as frictionless as possible.

What’s Next?

Earlier this year, the European Commission presented a study on the application and impact of PSD2. The study follows a period of open consultations into revisions of PSD2 legislation. It is widely expected that a PSD3 framework will be announced, but its timing and the extent of the revisions have not yet been made public.

Comply With PSD2 SCA and Improve Customer Experience With HYPR

The PSD2 regulates financial transactions in the EU and other countries such as the UK and Norway. Its strong customer authentication (SCA) requirement obliges financial institutions, banks and payment service providers to maintain strict MFA authentication procedures for operations such as payments over €30 or requests for account information. The PSD2 SCA requirements aim to improve online consumer trust by reducing fraud and account takeover attacks, yet without a significant trade-off of user experience.

HYPR’s True Passwordless solution for customer authentication allows firms to fully comply with the PSD2 SCA and transaction signing directives, including cryptographic signing of every transaction and unique dynamic linking. With HYPR, you can ensure SCA regulatory compliance while removing friction from your customer authentication process. To learn more, read our PSD2 SCA guide or contact our team.
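The dynamic linking requirement mentioned above — an authentication code cryptographically bound to a specific payee and amount — can be illustrated with a minimal sketch. It uses an HMAC over the transaction details as a stand-in for the digital-signature schemes real implementations use; the key and IBANs are placeholders.

```python
import hashlib
import hmac

def dynamic_link_code(key: bytes, payee_iban: str, amount_eur: str) -> str:
    """Derive an authentication code bound to this specific payee and amount."""
    message = f"{payee_iban}|{amount_eur}".encode()
    return hmac.new(key, message, hashlib.sha256).hexdigest()

# Placeholder key; in practice this would be provisioned to the customer's device.
key = b"device-secret"
code = dynamic_link_code(key, "DE89370400440532013000", "49.99")

# Tampering with the payee or amount yields a different code, so the bank can reject it.
assert code != dynamic_link_code(key, "DE89370400440532013000", "4999.00")
```

The point of the construction is that the code authorizes one exact transaction: any man-in-the-middle change to the amount or destination invalidates it.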

Monday, 15. May 2023

Tokeny Solutions

On-demand Webinar: Opening Up Access to Fine Art Investment with Tokenization

The post On-demand Webinar: Opening Up Access to Fine Art Investment with Tokenization appeared first on Tokeny.

Webinar: Expert Talk Series 

Opening Up Access to Fine Art Investment with Tokenization

View on-demand recording below

Daniel Coheur

Chief Commercial Officer

Elizabeth von Habsburg

Managing Director

Nanne Dekking

CEO & Founder

Replay the webinar

Access the exclusive recording of the webinar co-organized by Tokeny, Artory, and Winston Art Group.

Remarkably, the top artists in the fine art market have surpassed the S&P 500’s performance over the past 20 years. However, high barriers to entry have traditionally made participation challenging. Today, thanks to tokenization, we have the opportunity to democratize access to these profitable investments, ensuring global liquidity access for all. 

Topics to be covered:

- The benefits of art tokenization for investors and issuers
- The main challenges: Inaccessibility, untrustworthy valuation, and illiquidity
- How tokenization addresses these challenges
- Ensuring tokenized assets are compatible with securities distributors



auth0

Preparing for Rules and Hooks End of Life

Actions is the next generation of extensibility

HYPR

Using "Approval-as-Code" in Access Management


In today's modern work environment, effective management of user identities and access is crucial for securing enterprise applications, systems, and data. As organizations increasingly deploy cloud-based services and resources, managing access has become complex and necessary. As a helpdesk engineer, it’s my responsibility to manage and track access requests from end users. To streamline this process and ensure secure access, I recently deployed an approval-as-code tool, which automates the approval process for access requests and integrates with our existing tools. This will enhance our ability to manage access and mitigate the risks associated with unauthorized access to critical systems and data.

Access Management Challenges

Traditionally, I provide users with predefined default access based on their roles and responsibilities. Some admin users are provided with full access to core resources on an ongoing basis. Unfortunately, these accounts are prime targets for security breaches. Experian, Uber, Okta and many others were breached by hackers, such as the Lapsus$ group, exploiting admin access.

In some emergency situations, users are provided with elevated access to resolve a critical issue or incident. These requests are largely handled in tickets or Slack threads. In other words, they are done “by hand”.

Some of the challenges I face:

- Inefficiency: Manually managing access can be an inefficient process, and mistakes can be made if access is not removed.
- Compliance challenges: It can be difficult to demonstrate that access controls are being enforced when some users have perpetual admin access.
- Scaling: As the organization grows and the number of users and resources increases, it becomes more difficult to manage access for a large number of users across many systems. Creating access policies can get complex.

What Is Approval-As-Code?

At HYPR, Infrastructure-as-Code (IaC) is an already adopted practice to automate and standardize the process of deploying and managing our IT resources. We recently came across a tool that advertises “approval-as-code,” which extends the same principles as Infrastructure as code. With approval-as-code, we can automate and codify approval workflows for elevated access.

The tool in question is Sym, by a company called SymOps. Sym allows you to build approval workflows, initiated and approved within Slack, for dedicated resources defined within the tool. This provides just-in-time access in a manner that is audited and tracked. The current approval flow we have in place:

1. End users request access within Slack and are asked for:
   - Which resource they need access to
   - Justification (ticket, incident or reason)
   - Duration of access
2. The request is routed to a dedicated Slack channel
3. Designated approvers can either approve or deny the request
4. If approved, the user is provided access to the resource for the duration selected
5. Once the duration expires, access is removed and the user is notified within Slack.
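The flow above can be sketched as a small state machine, independent of any particular tool: a request grants nothing until approved, and expires after the selected duration. This is an illustrative sketch, not Sym's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class AccessRequest:
    user: str
    resource: str
    justification: str       # ticket, incident or reason
    duration: timedelta      # how long access should last once approved
    approved_at: Optional[datetime] = None

    def approve(self) -> None:
        """A designated approver grants time-boxed access."""
        self.approved_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        """Access is active only between approval and expiry."""
        if self.approved_at is None:
            return False  # pending or denied requests grant nothing
        return datetime.now(timezone.utc) < self.approved_at + self.duration

req = AccessRequest("alice", "prod-database", "INC-1234", timedelta(hours=1))
assert not req.is_active()  # not yet approved
req.approve()
assert req.is_active()      # granted for the selected duration, then revoked
```

A real deployment would also notify the user in Slack on expiry and record each transition for auditing.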

Because Sym is approval-as-code, we use our current code review process and I can collaborate with my team when building the workflows. The tool itself is managed with Terraform code, so you can deploy resources and update the workflows. The approval logic is all defined in Python code, so I am able to create custom policies that meet our needs and also help us scale.

Deployment 

Now, I am not a strong or even an average coder, but the examples and support provided by the Sym team have been invaluable in helping my team and me get this tool off the ground. No question was too basic or complex for their team as they helped me craft this tool, which is now critical for our daily operations.

SymOps offers multiple out-of-the-box integrations with their Python libraries. In our custom logic, we take advantage of the PagerDuty integration: on each request, it checks whether the user is on call in PagerDuty, to assist with “break-glass” access after hours in response to an incident or a page. An example of the code that defines this logic is in the examples SymOps provides:

https://github.com/symopsio/examples/tree/main/basic/pagerduty_on_call
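The gist of that on-call check can be sketched as follows. This is not the Sym SDK; `get_on_call_users` is a hypothetical helper standing in for a real PagerDuty API call.

```python
def get_on_call_users() -> set[str]:
    """Hypothetical helper: a real implementation would query the PagerDuty
    on-call schedule via its API and return the current responders."""
    return {"alice@example.com"}

def should_auto_approve(requester_email: str) -> bool:
    """Skip the human approval step for requesters who are currently on call,
    enabling 'break-glass' access after hours during an incident."""
    return requester_email in get_on_call_users()

assert should_auto_approve("alice@example.com") is True   # on call: auto-approved
assert should_auto_approve("bob@example.com") is False    # not on call: needs an approver
```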

Greater Security, Greater Convenience 

By implementing a tool like Sym on top of our HYPR Passwordless stack, we can improve the speed, tracking and security of access to our applications, systems and data. Because the tool is built within Slack, our organization can easily adopt this process without impeding critical work or the velocity of our teams. Gating elevated access behind approval workflows helps us ensure that sufficient guardrails for access controls are in place and enforced.


IDnow

Where is micromobility headed? Industry experts debate polls, Paris, and the future of transportation.

Recap of recent IDnow ‘The State of Micromobility Regulations 2023’ webinar.

Our April webinar, which explored the current state of micromobility regulations, could not have happened at a more interesting time for the industry, following the recent Paris escooter ban.

Moderated by Oliver Bruce, Co-Host of The Micromobility Conference & Podcast, the hour-long session welcomed industry insiders and experts Usman Abid, Senior Product Manager at Tier, Martin Lefrancq, New Mobility Policy Advisor at Brussels Mobility, Felix Aumair, Head of Customer Success at GoUrban, along with our very own Michael Holland-Nell, Senior Sales Manager Mobility at IDnow.

What happens in Paris, stays in Paris?

With the escooter ban fresh in people’s minds, it was natural for the group to start by sharing their opinions on how it would likely impact the wider micromobility landscape in the rest of Europe.

Surprisingly, it was a mixed bag of opinions, with some participants pointing out that it would likely heap additional pressure on mobility operators, who would be forced to play the waiting game while other cities decide whether to follow suit with a ban or not.

Despite the recent happenings in Paris, Michael said that there was no time to waste, and operators should seize the opportunity to enter the growing micromobility industry, “Now is a good chance to invest more and have a bigger focus on these kinds of topics to improve and develop micromobility services in the future.’’

The relationship between technology and regulations.

Later in the webinar, Felix and Michael discussed what measures and technology micromobility providers could implement to improve user compliance with regulations and ensure responsible use of their services.

This is where IDnow supports many companies – not only in mobility – to give more responsibility to the user because there needs to be rules in place, certain behaviors, certain qualifications to make use of certain services, especially in the mobility sector. But since regulations develop and change, it’s important that providers have flexible solutions.

Michael Holland-Nell, Senior Sales Manager Mobility at IDnow
Shaping a safer, more sustainable mobility future.

Later in the webinar, discussions shifted from regulations to the role that micromobility could play in addressing urban congestion, and how this can be better communicated to policymakers and the public.

Martin mentioned how there have been huge changes over the years regarding sustainable urban mobility in Brussels.

“Micromobility is forecasted to help reduce carbon emissions in our cities by 30-50% by 2030,’’ said Usman.

Before the webinar ended with a Q+A session with the audience, the panel covered the key trends and developments that will shape the regulatory landscape in the coming years.

Predictions ranged from the importance of free-floating, dedicated parking zones to why micromobility needs a consistent regulatory framework as part of a broader ecosystem. Michael said that the very future of cities was dependent on the success of micromobility, citing the need for collaboration between key stakeholders (users, city governance, operators and regulators) to ensure regulations offer a safer and balanced mobility experience for everyone.

To watch the full-length webinar on demand, click ‘The State of Micromobility Regulations in 2023’.

By

Kristen Walter
Content Marketing Associate
Connect with Kristen on LinkedIn


PingTalk

Top Benefits of SSO and Why It's Important for Your Business

Single sign-on (SSO) has been prevalent in many organizations for years, but its importance is often overlooked. Many enterprises are moving to the cloud and using third-party services. Business efficiency and a seamless customer experience are essential. To achieve this, access to multiple applications must be available from any location and device. Single sign-on allows users to log