Last Update 6:06 PM September 20, 2020 (UTC)

Identosphere - Company Blog Feeds

Brought to you by Identity Woman and Infominer.
Please do support our collaboration on Patreon!!!

Saturday, 19. September 2020

SELFKEY

Governance Tokens: DeFi Giving Control Back to the Users

SelfKey Weekly Newsletter

Date – 16th September, 2020

In this edition, read about governance tokens and how they are democratizing crypto.

The post Governance Tokens: DeFi Giving Control Back to the Users appeared first on SelfKey.

Friday, 18. September 2020

procivis

Online Session: Enhancing resilience in the Digital State Meeting of Thursday, 3 September 2020

Presentation by Member of the European Parliament Eva A. Kaili, Chair, Panel for the Future of Science and Technology, European Parliament, Belgium

Presentation by Dr Gianluca Misuraca, Senior Scientist, European Commission, Joint Research Centre, Institute for Prospective Technological Studies, Spain

The Procivis Think Tank meeting of 3 September 2020 took place against the backdrop of a corona flare-up and new travel restrictions in Europe. Accordingly, at short notice we switched to our first-ever webinar format.

Our first speaker, MEP Eva Kaili, shared her views against the background of the announcement of the EU’s €750 billion “Next Generation” pandemic recovery plan in addition to its new long-term budget of €1,100 billion. She highlighted the EU’s priorities – “green and digital” and security in North Africa/Middle East.

How did the European parliament operate during the lockdown?

Eva Kaili explained that ballots were submitted via email. This illustrates how one always makes do with the technology that one has during an emergency. Projects were under way to make the system more resilient. During the Q&A with Eva Kaili it was also helpful to be reminded again that the EU only has the competences that were conferred to it by member states in treaties.

Shaping Digital Europe 2040

Gianluca Misuraca presented his research on “Shaping Digital Europe 2040: Artificial intelligence & public sector innovation in a data-driven society”. It was interesting to see that the EU is funding such long-term research. Looking ahead to 2040, he presented scenarios based on a sliding scale of digital participation by citizens vs. the degree to which this participation was regulated.  The digital future might not be knowable, but we can imagine its contours, challenges and opportunities.

The post Online Session: Enhancing resilience in the Digital State Meeting of Thursday, 3 September 2020 appeared first on Procivis.


Meeco

Meeco announces KidTech partnership with Heder to co-develop safe media platform for kids

Meeco has some very happy news to share about a new partnership with Heder VZW and a new KidTech product planned to launch December 2020. But first, we would like to share a little about why we believe this is so important. Back in April 2019, we presented our thoughts at the World Government Summit on the digital future for Kids.
“Every child born in today’s digital world will live a dual physical and digital life. How we enable privacy and security for their digital twin is as important as the rights afforded to them in the physical world. If we don’t get this right, we risk a generation born into digital slavery”

– Katryna Dow, Meeco
World Government Summit, 2019
Sadly, a year on from the Summit, our concerns are evidenced all too often. This week brought the news that the parents of 5 million British children under the age of 13 are mounting a $3 billion legal battle with the tech giant Google. At the heart of their claims is the desire to protect the digital rights of their children. The lawsuit alleges that YouTube’s method of targeting underage audiences constitutes “major breaches” of UK and European privacy and data rules designed to protect citizens, especially kids.

This month Netflix released The Social Dilemma, a must-watch documentary on the social media addiction that is both alluring and alienating for each of us, especially our children. The documentary includes alarming statistics on the increase in childhood depression, self-harm and suicide. Most alarming is that the age group impacted is getting younger and younger, a trend directly linked to early access to social media. These platforms make it all too easy for underage children to gain access and be subjected to unfiltered feedback that contributes to feelings of inadequacy, dissatisfaction and isolation.
“There has to be a better way for us to enable children to participate in the digital world, along with making their first digital experiences safe.”
– Katryna Dow, Meeco
Enter Heder, a not-for-profit organisation based in Antwerp, Belgium. Heder provides care and guidance to improve quality of life for children with different abilities, along with their families and extended networks. Heder’s philosophy is to always start from present strengths: finding small things or looking for bigger solutions that enable people to focus on their unique strengths, fostering pride, and allowing them to experience joy in everyday things. Cultivating this from early childhood, together with support from family and community, really makes a difference. Heder’s multidisciplinary experts work together to give support in different ways, including creative play, adjusted sports, training, and physical and psychological therapy. Heder offers services in the home, at day-care, at school or on campus; these are just some of the ways Heder contributes to a more inclusive society. And now there’s another way: drawing on Heder’s philosophy of building on strengths, the idea for mIKs-it was born. After a number of years of research, the inspiring vision of Gaby Pereira Martins (Masters in Educational Sciences) and Kim Struyf (Director of Early Development), supported by Heder’s Senior General Manager, Erik Van Acker, will now become a reality.
“Together with Meeco we can give mIKs-it a strong foundation”
– Kim Struyf, Heder
Meeco is honoured to announce our partnership with Heder to co-develop a media platform for kids, focussing on the developmental stages of 0-7 years. The platform is called mIKs-it. It has been designed to foster joy and connection to everyday life for kids, with privacy, security and control as the foundation. Early childhood is the most vulnerable time in a child’s development, as children are totally dependent on family and society for nurturing and protection. It is also likely the period when they will first come into contact with technology. Whether it is playing with their parents’ phone, watching YouTube or listening to content, all these media choices present new challenges to navigate in order to keep children safe. In designing mIKs-it we had the privilege to work with Trendwolves, who validated the design, usability and kids’ experience through a series of research projects. The research was led by Maarten Leyts, the founder of Trendwolves. Maarten brought unique insight to the project, given his expertise in KidTech. Many of the issues we wanted to understand are addressed by Maarten in his upcoming book “Generation Alpha in Beta”, due for release later this year.
“Today’s and tomorrow’s kids swipe before they draw. Say hi to Generation Alpha. But we’re facing a challenge here. Fully immersed in technology during their formative years and in a fast-changing world, differentiates Gen Alpha from previous generations.”
– Maarten Leyts, Trendwolves
Working with Trendwolves was the obvious choice, as they specialise in trend research focusing on families and global youth culture. Their insights help shape and validate product design, always focussing on how we can enhance meaningful human relationships. Following a successful proof-of-concept completed earlier in the year, the decision to develop the platform was clear. The results of the research, and especially observing kids with a wide range of abilities interact with personal content, touched our hearts and motivated our action, especially as the study included kids with a diverse range of physical and developmental abilities, focussing on media formats for kids with reduced motor skills or who are visually or hearing impaired.
“Meeco’s guiding principle in the digital world is the same as Heder’s principle for the physical world: children are entitled to a safe and positive climate in which to flourish.”
– Gaby Pereira Martins, Heder
mIKs-it has been designed to provide a safe digital place for children to interact with multimedia: photos, video and audio files. The platform is developed using Meeco’s globally awarded technology, with content secured and encrypted using the same approach Meeco developed to protect bank applications, like the digital vault for KBC Bank in Belgium. mIKs-it is a family platform and not a social network. There are two companion apps: a media app for kids, supported by an app for their trusted grown-ups to manage connections, consent and content. The features of mIKs-it are all the things that it doesn’t do:

no ads
no tracking
no manipulation
no unauthorised access
no contacts without your approval
no content without your consent
no access to your media
no data mining

Most importantly, the control is always with parents and guardians, and personal data is never sold!
“Heder and Meeco is a purpose driven partnership, our shared goal is to enable a more empowering digital experience for all children.”
– Erik Van Acker, Heder
If you would like to know more about our progress over the coming months and the launch you can register your interest at the mIKs-it website.
We realise there’s still a lot to do to help our kids develop healthy and empowering digital habits, but our hope is that mIKs-it is a small, but meaningful step in the right direction. Now we invite you to join us on this most wonderful adventure, into the land of mIKs-it: a playground where children and their grown ups can connect and safely share snippets of everyday life and build lasting memories.

The safe multi-media app for kids

Made with ♥ by Heder & Meeco

The post Meeco announces KidTech partnership with Heder to co-develop safe media platform for kids appeared first on The Meeco Blog.


Smarter with Gartner - IT

Build a Defensible Cybersecurity Program in 3 Steps

As the risks of digitalization evolve and cybersecurity threats grow, there’s only one way for security and risk leaders to effectively protect the organization — institute a continuous, sustainable security program. Yet all too often, organizations prioritize ticking compliance boxes over establishing effective, risk-based controls. 

Executives are more likely to subscribe to a vision when the components and objectives are relevant and laid out in nontechnical terms

The result? Programs lack defensibility at the business level, leading to mistrust and making it harder to gain adequate support and investment.

“Business leaders continue to treat security as a business inhibitor due to the lack of a defensible security program that links into business outcomes,” says Tom Scholtz, Distinguished VP Analyst, Gartner.

Read more: The 15-Minute, 7-Slide Security Presentation for Your Board of Directors

To achieve a defensible information security management program, security and risk management leaders must bring the business along as they establish governance and develop the ability to assess and interpret risk effectively. 

Establish accountability with a security charter

A key aspect of defensibility is having the proper documentation and processes in place to enable risk-based control decisions.

To form the foundation of your security program, create an Enterprise Security Charter. This short, plain-language document establishes clear owner accountability for protecting information resources and provides a mandate for the CISO (or equivalent) to establish and maintain the security program. 

Executive leadership must read, understand, visibly endorse and annually review the charter, ensuring sign-off on roles, scope and responsibilities.

Establish an information security steering committee to ensure decisions aren’t made in a vacuum by the security team. Include direct, decision-making representation across business units and functions.

The steering committee creates a forum for ongoing input into and support for the security program from senior business leaders, enabling them to see the risks not only to their own business units, but across the business.

Set a clear vision for security programs

Business support for the security program hinges on conveying a clear vision that reflects the unique business context of the enterprise. Has there been recent cost cutting? Where’s the organization on its digital journey? What regulatory requirements have shifted? 

Executives are more likely to subscribe to a vision when the components and objectives are relevant and laid out in nontechnical terms. The vision should reflect the mid and long-term business needs for security. 

Read more: How Security and Risk Leaders Can Prepare for Reduced Budgets

Provide a prioritized roadmap that clearly links projects and corrective actions to risks, vulnerabilities and the relevant business, technology and environmental drivers. 

Demonstrate a quick response to changing threats

Security is a moving target, and executives are under pressure to demonstrate that the enterprise can handle changing threats. By gearing programs toward anticipating and reacting to frequent, unexpected changes, security and risk management leaders illustrate their ability to protect the organization — no matter what happens in the business environment.  

To guide agile security planning implementation and operations day to day, develop a set of agreed-upon principles with business partners. Examples of principles include:

Supporting business outcomes rather than solely protecting the infrastructure

Considering the human element when designing and managing security controls

Conducting regular/periodic vulnerability assessments of the enterprise’s environment

Laying out these principles can help you continuously improve the effectiveness and efficiency of security controls while also reacting to change.

The post Build a Defensible Cybersecurity Program in 3 Steps appeared first on Smarter With Gartner.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Migrate User Passwords with Okta's Password Hook

Okta is an identity platform focused on making authentication easy to build with minimal code. Our goal at Okta is to build a solution so flexible and easy to use, that you’ll never have to build authentication again.

And while Okta can provide a lot of new functionality to your application, including multi-factor authentication (MFA) based on contextual policies, self-service password resets, and federation to enterprise identity providers and social media accounts, we’ve found that the biggest hesitation for many customers is password migration.

If your passwords are hashed and salted, you can do a bulk import with the Okta Users API. Supported hashing functions include SHA-1, SHA-256, SHA-512, BCRYPT, and MD5. However, if your passwords are not salted, or if you do not have the option of a bulk export, you can still migrate a user’s password with a just-in-time migration. This guide will show you how to do that.
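For reference, a minimal sketch of such a bulk-import call is shown below. The org URL, API token, and hash values are placeholders, and this example assumes a bcrypt hash; the exact hash fields required differ per algorithm, so check the Users API documentation.

// Sketch: creating a user with an already-hashed password via the Users API.
// Placeholders: your-org.okta.com, OKTA_API_TOKEN, and the bcrypt salt/value.
const fetch = require("node-fetch");

async function importHashedUser() {
  const res = await fetch("https://your-org.okta.com/api/v1/users?activate=true", {
    method: "POST",
    headers: {
      "Accept": "application/json",
      "Content-Type": "application/json",
      "Authorization": `SSWS ${process.env.OKTA_API_TOKEN}`
    },
    body: JSON.stringify({
      profile: {
        firstName: "Example",
        lastName: "User",
        email: "example.user@doesnotexist.com",
        login: "example.user@doesnotexist.com"
      },
      credentials: {
        password: {
          hash: {
            algorithm: "BCRYPT",
            workFactor: 10,
            salt: "rwh3vH166HCH/NT9XV5FYu", // 22-character bcrypt salt (placeholder)
            value: "qaMqvAPULkbiQzkTCWo5XDcvzpk8Tna" // bcrypt hash without the algorithm/salt prefix (placeholder)
          }
        }
      }
    })
  });
  console.log(await res.json());
}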

For the just-in-time migration, you will leverage the Password Import Inline Hook. With the Password Hook, migration happens the first time the user authenticates to Okta: the password is first verified against the current user store. If the password matches, Okta allows the user to log in and simultaneously hashes, salts and stores the password in Okta’s Universal Directory. If the password does not match, the user is blocked from signing in and the password is not stored. This continues until the user is verified successfully, or the user’s password is reset in Okta.

The following guide will show you how to use Okta’s Password Hook to verify a user’s password against a SQLite Database. The database will be initiated for you with credentials for two users, one with a plain text password and the other with a hashed (not salted) password to demonstrate your options. This guide will test against both requirements, so that you can test both or whichever is most relevant for you.

But first I’ll address a few key questions you may be wondering about.

What Are Inline Hooks?

If you’re familiar with webhooks, then you’ll recognize the similarities with inline hooks, but with a tie to a specific Okta process, allowing you to add custom functionality. For example, the Password Import Inline Hook is tied to the Okta authentication process, allowing us to verify the user’s password against a database before determining whether to grant access and migrate the password.

Inline Hooks do require a web service with an internet-accessible endpoint. In this example you’ll be using Glitch, a simple-to-use app-hosting platform, to both host and run the custom process flow. However, generally users host the Inline Hook service within a web server they already have in place.

The call is synchronous, so that Okta pauses the process that triggered the flow (in this example, authentication) until a response from the service or endpoint is received. Every inline hook takes the form of HTTPS REST, and includes a JSON object, but the specifics of the data.context object within will vary by inline hook. The response from the service back to Okta will also typically include a JSON payload with a commands object to specify actions and communicate back to Okta.

For example, with the Import Password Inline Hook, the JSON data.context object includes credentials typed in by the user. The service is expected to evaluate if the credentials match the original user store and send a response. The response is expected to include the command com.okta.action.update, specifying if the credentials are VERIFIED or UNVERIFIED.
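Concretely, based on the fields used later in this guide, the exchange is shaped roughly like this (abbreviated; the real payloads carry additional fields):

// Okta -> your service: the typed-in credentials arrive in data.context.
const exampleHookRequest = {
  data: {
    context: {
      credential: {
        username: "tinatest@doesnotexist.com",
        password: "textPassword"
      }
    }
  }
};

// Your service -> Okta: a commands array reporting the verification result.
const exampleHookResponse = {
  commands: [
    {
      type: "com.okta.action.update",
      value: { credential: "VERIFIED" } // or "UNVERIFIED"
    }
  ]
};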

For more information, visit our Inline Hooks documentation (https://developer.okta.com/docs/concepts/inline-hooks/) and specifically Password Import Inline Hooks.

All the code is already written for you; all you have to do is copy the application I created and create an Inline Hook within your Okta tenant, and you’re ready for testing. I’m using Node.js, but you can convert the steps I’m using to the language of your choice.

Use Import Password Inline Hook

Remix This Glitch App: https://okta-inlinehook-passwordimport.glitch.me/

Remixing will create a name and URL that you’ll be able to use for testing. For example:

Glitch app name: snapdragon-foremost-flat
Glitch app URL: https://snapdragon-foremost-flat.glitch.me/

The code is written to automatically initialize a SQLite database and insert the first two rows of data. To confirm this, click Show (next to Glitch App Name) and select In a New Window.

Add /getUsers to your URL and you should see the following JSON output with user credentials for two users.
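The output is shaped roughly like this (field names and the exact hash are illustrative; the passwords correspond to the test credentials used later in this guide):

[
  { "username": "tinatest@doesnotexist.com", "password": "textPassword" },
  { "username": "timtest@doesnotexist.com", "password": "<SHA-512 hash of 'hashedPassword'>" }
]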

The first user is stored with a plain text password and the second is stored with a hashed password so that you can test both use cases.

Create an Inline Hook in Okta

You will point the inline hook to the /passwordImport endpoint; I’ve created a POST request there to receive the credentials from Okta and send a response of VERIFIED or UNVERIFIED, depending on whether the password was validated against the original user store.

The code below is already in the Glitch app you remixed, but is also provided here for visibility.

// Attempt to verify user password and return result to Okta
app.post("/passwordImport", async (request, response) => {
  console.log(" "); // for separation of logs during testing
  var verify = await comparePassword(request.body.data.context.credential);
  console.log('Password for ' + request.body.data.context.credential['username'] + ": " + verify);
  var returnValue = {
    "commands": [
      {
        "type": "com.okta.action.update",
        "value": { "credential": verify }
      }
    ]
  };
  response.send(JSON.stringify(returnValue));
});

// If the database password is stored as a hash, call hashPassword() to convert the typed-in password
// Compare the typed-in password to the database password and return the result
async function comparePassword(creds) {
  var pwd = "";
  if (Boolean(hashed)) {
    pwd = await hashPassword(creds['password']);
    console.log('Text Password: ' + creds['password']);
    console.log('Hashed Password: ' + pwd);
  } else {
    pwd = creds['password'];
    console.log('Text Password: ' + creds['password']);
  }
  var db_password = await getDbPwd(creds['username'].toLowerCase());
  console.log('Database Password: ' + db_password);
  if (pwd === db_password) {
    return 'VERIFIED';
  } else {
    return 'UNVERIFIED';
  }
}

In your Okta tenant, navigate to Workflow > Inline Hooks > Add Inline Hook in the Admin Portal and select Password Import.

Give the hook a name and enter your Glitch base URL + /passwordImport as the URL. Select Save.

Note: For this test, authentication verification has not been added but would be recommended in a production environment.

Test Your Password Import Hook

That’s it! Since all the code has already been written in your remixed Glitch application, you just need to test.

You will need to use Okta’s Create User with Password Import Inline Hook API request (https://developer.okta.com/docs/reference/api/users/#create-user-with-password-import-inline-hook). This indicates to Okta that the Import Password Inline Hook we created should be called the next time the user authenticates. The hook will continue to be called until the user’s password is verified successfully or until the password is reset in Okta.

The easiest way to use the API is to create and use an Okta Postman environment. We will be working off the Users collection, but will need to duplicate an existing request to create one for the Password Import Hook. For this example, I copied Create User with Password by selecting the ellipsis (…) to the right of the request and selecting Duplicate.

Once duplicated, I renamed the request to Create User with Password Hook and set activate to true. The body of the request should be modified so that the password value is a hook with a default type. If you want to test with the data already inserted into the database, you’ll want to create two users: tinatest@doesnotexist.com and timtest@doesnotexist.com.

Set the request body in Postman to match the first user in the database. You’ll need to send the request twice: once for the Tina Test user and a second time for the Tim Test user.
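A minimal sketch of that request body (the profile values match the first test user; the hook type of default tells Okta to route the import through your active Password Import hook):

{
  "profile": {
    "firstName": "Tina",
    "lastName": "Test",
    "email": "tinatest@doesnotexist.com",
    "login": "tinatest@doesnotexist.com"
  },
  "credentials": {
    "password": {
      "hook": { "type": "default" }
    }
  }
}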

Once the user is added to your Okta tenant, you can attempt to log in. I’ve included message outputs via the console of the Glitch application; if you’d like to confirm the service receives the request and the data retrieved, select Tools > Logs. For the initial tests with tinatest@doesnotexist.com, first type in an incorrect password to confirm the user is blocked from signing in and the password is not stored, requiring the inline hook to be called again during the next login attempt. Once sign-in is blocked, you can attempt to log in with the correct password: textPassword.

The user should be signed in and land on the Okta dashboard. If you check the Glitch console, the logs should show the typed-in password, the database password and a VERIFIED result.

Optionally, you can sign out and log in again to confirm the inline hook is no longer called for this user now that the password has been successfully stored in Okta.

Test Your Password Import Hook with Hashed Password

For the second user, timtest@doesnotexist.com, you’ll be using a hashed password, so you’ll need to make a slight modification to the code of the Glitch application.

On line 44, there is a line of code setting the variable hashed to false. Change the value to true.

// Determines if password needs to be hashed prior to comparing
// This example uses SHA-512 (already imported), but can be modified to include more or alternate hashes
// If true, the password typed in by the user will be hashed before comparison
// If false, the plain text password will be used for comparison
var hashed = true;

This will result in a function being called to convert the password typed in by the user to a hashed value for comparison to the value in the database.

// Function to convert password to hashed password for comparison
// Can be modified for alternate hash functions or to add salt if needed
function hashPassword(pwd) {
  return sha512(pwd);
}

The code above is separate from the rest of the application logic, to allow for easy modification to an alternative hashing function depending on what is used to store passwords in your database or user store.
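The comparison also relies on a getDbPwd helper that isn’t shown in this post. A hypothetical sketch, assuming the sqlite3 package and a users table with username and password columns (adjust to the schema your remixed app actually initializes):

// Hypothetical helper: look up the stored password for a username in SQLite.
const sqlite3 = require("sqlite3");
const db = new sqlite3.Database(".data/sqlite.db"); // Glitch's writable data directory

function getDbPwd(username) {
  return new Promise((resolve, reject) => {
    db.get(
      "SELECT password FROM users WHERE username = ?",
      [username],
      (err, row) => {
        if (err) return reject(err);
        resolve(row ? row.password : undefined); // undefined never matches, so the result is UNVERIFIED
      }
    );
  });
}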

Optionally, clear the log for the app, so that only messages for this new user will be visible.

The user timtest@doesnotexist.com should already be in your Okta tenant from the previous step, but if not, add the user following the steps and formatting described above.

Once the user is created in your Okta tenant, attempt to log in first with an incorrect password and then with the correct password: hashedPassword.

Similar to the first user, the incorrect password should result in the user being blocked from signing in and the password not being stored. The second attempt, with the correct password, should allow the user access to the Okta dashboard, and the password should be stored. This time the Glitch console logs should also show the hashed form of the typed-in password alongside the database value.

As before, you can sign the user out and log back in to confirm the inline hook is not called again now that the password is stored.

It’s as easy as that! You now have a working inline hook that can migrate your users’ passwords at the time of first login.

Continue Developing with Okta and Event Hooks

This post walked through using the Import Password Inline Hook, but there are additional inline hooks to expand functionality for user import, registration, or customizing SAML Assertions and OIDC tokens. If you’re interested in learning more about these, visit our Inline Hooks documentation for resources on how to build and use them.

As mentioned earlier in this post, Inline Hooks are synchronous, pausing the process that triggered the flow until a response is received. However, if you would like an asynchronous option that will allow the process to continue running without disruption, we have options for this as well with Event Hooks. View our Event Hook Eligible documentation to see options for triggering asynchronous hooks.

If you’d like to learn more about Okta Hooks or if you’re interested in learning how you can add Okta authentication to your Node.js application, you may find these posts helpful:

Use Okta Token Hooks to Supercharge OpenID Connect
Build Easy User Sync Webhooks with Okta
Node.js Login with Express and OIDC

If you like this content, be sure to follow us on Twitter and subscribe to our YouTube Channel for updates on new posts and videos.

Thursday, 17. September 2020

MATTR

Using privacy-preserving ZKP credentials on the MATTR Platform

MATTR is proud to announce we’ve added support for privacy-preserving verifiable credentials on our platform using BBS+ signatures. Using a technique to implement selective disclosure, we’ve added the ability to generate credentials that support zero knowledge proofs without revealing any unnecessary information about the end-user, or placing any added burden on issuers, in the process. Since we first introduced and open-sourced JSON-LD BBS+ Signatures at IIW30 in April of this year, we’ve received lots of engagement, feedback and contributions from the broader technical community to further develop the implementations and specifications we presented. You can read more about our approach to privacy-preserving verifiable credentials on our introductory blog post.

One of the benefits of using the BBS+ cryptographic scheme to sign credentials is the ability to derive a zero knowledge proof from the signature, where the party generating the proof can choose to partially disclose statements from the original message. When enabled, this feature allows issuers to create a credential that effectively enforces minimal data disclosure using the MATTR Platform and a compliant digital wallet.

Issuers can create ZKP-enabled credentials that allow the user to selectively disclose data

To support this functionality, we generate the keys required to support these signatures and create a Decentralized Identifier (DID) with the keys referenced in the DID Document. BBS+ signatures require what’s called a pairing-friendly curve; we use BLS12–381. This DID can be referenced in credentials to establish the issuer of the data, a common practice that allows a verifier or relying party to trace the root of trust in a credential.

To issue a ZKP-enabled credential, simply use our API endpoint to create a new DID Key with type set to BLS 12–381. Then, create a Verifiable Credential (VC) using your new DID Key as the issuer DID. Our platform will automatically detect this capability is available in your DID and create a ZKP-enabled BBS+ credential for you. You can use the platform this way to create a privacy-enabled credential, or you can create a regular credential by providing a DID with a different key type — you have the option.
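For illustration only, the flow might look like the sketch below; the tenant URL, endpoint paths and field names here are assumptions rather than MATTR’s documented API, so follow the tutorials on MATTR Learn for the real calls:

// Hypothetical sketch of the two platform calls described above.
const fetch = require("node-fetch");
const TENANT = "https://your-tenant.platform.example"; // placeholder tenant URL
const TOKEN = process.env.MATTR_ACCESS_TOKEN;          // placeholder auth token

async function issueZkpEnabledCredential() {
  // 1. Create a DID whose keys use the pairing-friendly BLS12-381 curve.
  const didRes = await fetch(`${TENANT}/v1/dids`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${TOKEN}` },
    body: JSON.stringify({ method: "key", options: { keyType: "bls12381g2" } })
  });
  const { did } = await didRes.json();

  // 2. Issue a credential with that DID as issuer; the platform detects the
  //    BLS key type and produces a ZKP-enabled BBS+ credential.
  const credRes = await fetch(`${TENANT}/v1/credentials`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${TOKEN}` },
    body: JSON.stringify({
      issuer: { id: did, name: "Example Issuer" },
      type: ["VerifiableCredential"],
      subjectId: "did:example:holder",
      claims: { givenName: "Ada" }
    })
  });
  return credRes.json();
}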

On the user side, you can hold ZKP-enabled credentials in your wallet alongside all of your other credentials. We’ve designed this process in a way that minimizes friction to the user. In future updates, our Mobile Wallet App will be able to detect if BBS+ signatures are being used in a credential. When you get a request to verify some information contained in one of these privacy-enabled credentials, it will derive a new presentation that selectively discloses the required info using a zero-knowledge proof. The platform will then allow verification of the proof using the same interface as any other type of presentation.

Our integrated approach treats zero-knowledge proofs as an extension of VCs, rather than an entirely new framework with a separate set of dependencies. We have built BBS+ Signatures and privacy-enabled credentials into our platform for anybody to experiment with, in what we think is a significant milestone for standards-based credential solutions on the market today.

As a technology, BBS+ digital signatures can be used to sign more than just verifiable credentials. Combining these technologies is quite effective, though they can also be treated as modular or separate components. We’ve open-sourced software for creating and verifying BBS+ signatures in browser environments as well as node.js, and we’ve also published a library for generating BLS 12–381 keypairs for signing and verifying BBS+ Signatures.
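As a rough sketch of how those open-source pieces fit together (function names follow the node-bbs-signatures README at the time of writing and may have changed since):

// Sign two messages with one BBS+ signature, then derive a proof that
// selectively discloses only the first message.
const {
  generateBls12381G2KeyPair,
  blsSign,
  blsVerify,
  blsCreateProof,
  blsVerifyProof
} = require("@mattrglobal/node-bbs-signatures");

async function demo() {
  const keyPair = await generateBls12381G2KeyPair();
  const messages = [
    Uint8Array.from(Buffer.from("name: Ada Lovelace", "utf-8")),
    Uint8Array.from(Buffer.from("dateOfBirth: 1815-12-10", "utf-8"))
  ];

  const signature = await blsSign({ keyPair, messages });
  const { verified } = await blsVerify({ publicKey: keyPair.publicKey, messages, signature });

  // The nonce is normally supplied by the verifier to prevent replay.
  const nonce = Uint8Array.from(Buffer.from("verifier-supplied-nonce", "utf-8"));
  const proof = await blsCreateProof({
    signature,
    publicKey: keyPair.publicKey,
    messages,
    nonce,
    revealed: [0] // disclose only messages[0]
  });
  const proofResult = await blsVerifyProof({
    proof,
    publicKey: keyPair.publicKey,
    messages: [messages[0]], // only the revealed message is shared
    nonce
  });
  console.log(verified, proofResult.verified);
}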

By leveraging pairing-friendly elliptic-curve cryptography in the context of Linked Data Proofs, our approach provides an unprecedented way to perform zero-knowledge proofs using the semantics of JSON-LD. This allows credential issuers to tap into vast data vocabularies that exist on the web today, such as schema.org and Google Knowledge Graph, making user data more context-rich without sacrificing security and privacy of the user in the process. Not only is this approach more interoperable with existing implementations of the VC data model and semantic web technologies, it also doesn’t rely on any external dependencies to operate (like a distributed ledger), meaning it’s far more efficient than other approaches based on CL-signatures and zk-SNARKs. We’ve open-sourced our LD-Proofs suite for VCs including performance benchmarks so you can check it out yourself.

We’re excited to finally make these powerful privacy features easily accessible for everyone, and we can’t wait to see what you build with it. To get started, sign up now on our website and follow our tutorials on MATTR Learn to start creating ZKP-enabled verifiable credentials on the MATTR Platform.

Additional Links

Open-source:

Node JS BBS+ Signatures — BBS+ signatures implementation for node.js environments
WASM JS BBS+ Signatures — BBS+ signatures implementation for browser & node.js environments
BLS 12–381 Key Pair JS — crypto keys for signing/verifying BBS+ signatures
BBS+ JSON-LD Signatures JS — uses BBS+ signatures & BLS 12–381 keypair in a Linked Data Proofs suite (for use in VC implementations)

Specifications:

BBS+ JSON-LD Signatures Spec — specifies linked data suite for BBS+ signatures
BBS+ Signatures Spec — definition of BBS+ signatures scheme

The article Using privacy-preserving ZKP credentials on the MATTR Platform appeared first on MATTR.


KuppingerCole

Matthias Reinwarth: Beyond Static Access - Leveraging Access Policies To Deal With The Increasing Complexity Of Access Governance




Olivier Schraner: Adapting IGA to Your Digital Agenda

As more products become digitally presented and delivered, process agility increases and the requirements on IGA solutions change significantly. Established patterns need to be shed, and new approaches to governing your human and robotic workforce become essential. This talk looks at the evolution of IGA requirements in the face of rapid business transformation, and explores different approaches to solving new challenges while keeping the enterprise safe and compliant.




Interview with James Taylor




Mans Hakansson: Modernizing IAM - Implementing Policy Based Access Management & Governance

In this session PlainID will discuss how organizations can rethink, redesign and modernize their Identity and Access Management (IAM) architecture by implementing PBAC (Policy Based Access Control). This service should be a central service supporting not only one specific set of applications but rather act as a focal point (or a “brain” if you like) for different IAM technologies. This new architecture pattern has evolved to better support more applications and more advanced use cases.




Darran Rolls: Standing on the Beach, Looking at the Sea: Identity Governance & Administration, Today, Tomorrow and Sometime Later

In this session Mr. Darran Rolls will provide a unique perspective on the emergence, growth and future advancement of IGA technology. He provides an assessment of where we stand today with existing solutions and deployment approaches, and highlights where the industry needs to focus regarding program oversight, cross-system orchestration and integration with cloud and DevOps processes.



David Black: The Use of Real World Identities in Support of Identity and Access Management




Alpha Barry: The Value of Identity Governance and Administration in Non-Regulated Companies

While properly defined and tool-supported identity and access governance (IGA) is prevalent in regulated industries to ensure compliance, it is still fairly uncommon in mid-sized or even larger companies in non-regulated industry sectors. This has not been a problem in the past, when classical, data-center based IT infrastructure was dominant. Mr. Barry will point out why a lack of IGA can become a major issue when introducing hybrid or cloud-based IT infrastructure, and will explain why tool-based IGA can even add long term value in automating the administration of a hybrid infrastructure environment.




Nick Groh: Evolving Data-Driven Decision Making Beyond Identity Management

As organizations become increasingly digital, they must continue to evolve their IAM strategy to solve business challenges, support new initiatives, and incorporate data-driven decisions. In this session, Nick Groh will introduce the concept of data-driven decision making, including how artificial intelligence can help reduce the costs of decision-making. The session will also cover mobile trends and other sources of data to leverage, with a focus on applications to identity management, and will look at how IGA has mature use cases that need to be applied more broadly. Finally, there will be a discussion of how these applications extend beyond identity management into other areas of security, and how the business can incorporate identity data.




In an Age of Digital Transformation Managing Vendor and Partner Identity Is Critical

Organizations have been managing the identity and access of employees for many years to protect data and the overall security of the enterprise. However, the onset of digital transformation has driven a need for faster, cost-effective innovation, and with it the increased utilization of third-party resources. Consequently, organizations have a greater need to manage third-party access to data, systems, and facilities. This includes not only contractors and vendors, but also partners, affiliates, volunteers, and even service accounts and bots. Modern organizations are much more collaborative and open structures than those of even just a few years ago, and they continue to change.




Forgerock Blog

Introducing a New Kind of Security Key: ZenKey

It used to be that before you walked out the door, you always made sure you had your keys and your wallet. The “key and wallet check” was essential to leaving your home with a clear head. Now, your smartphone has undoubtedly joined the fray as something equal to, or even more important than, those other vital items. Leaving your house without your smartphone likely seems inconceivable, and whether we want to admit it or not, it is one of the most ubiquitous things in our lives.

Your Smartphone Can Do More

Your phone understands who you are, knows where you are, and is connected to a powerful network that is your portal to the world. So why not use that knowledge? Why do websites continue to ask me to register and log in with usernames and passwords when there is a device in my pocket that can authenticate me? Could the networks that power our smart devices play a role by adding additional, seamless security?

Using ZenKey to Unlock Trust

These questions now have an answer. AT&T, T-Mobile and Verizon have created something meaningful with the launch of ZenKey. ZenKey is a new solution that leverages network and SIM card details to deliver authentication and identity verification features to web and mobile applications. ZenKey is differentiated because it relies on network and device data and can’t be hacked using only a stolen username and password, or even a malicious SIM swap.

What This Means for Your Security

Even more exciting is that ForgeRock is a launch partner with ZenKey, giving customers the ability to leverage the ZenKey Authentication Node in ForgeRock’s Intelligent Access solution. This node provides instant, drop-in support for the ZenKey service. By simply leveraging the ZenKey node, any website or service can offer an alternative to long registration forms and password-based logins with a highly secure, device-based, multifactor authentication.

Now, ForgeRock customers can quickly reduce abandonment during sign-ups while receiving trusted user attributes from MNOs. This capability helps reduce the risk of fraud while creating an identity pre-populated with user attributes. Once a user is enrolled in ZenKey, ForgeRock Intelligent Access can use ZenKey to power a highly secure passwordless authentication experience. This integration delivers a tremendous balance between security and usability, something every ForgeRock customer is trying to accomplish.

As more apps, websites and services take advantage of the combination of ZenKey and ForgeRock for seamless registration and authentication, one of the keys you will never leave home without will be your ZenKey.

Learn more about ForgeRock Intelligent Access here


Smarter with Gartner - IT

Gartner Top 9 Security and Risk Trends for 2020

The shortage of technical security staff, the rapid migration to cloud computing, regulatory compliance requirements and the unrelenting evolution of threats continue to be the most significant ongoing major security challenges. 

However, responding to COVID-19 remains the biggest challenge for most security organizations in 2020. 

“The pandemic, and its resulting changes to the business world, accelerated digitalization of business processes, endpoint mobility and the expansion of cloud computing in most organizations, revealing legacy thinking and technologies,” said Peter Firstbrook, VP Analyst, Gartner, during the virtual Gartner Security and Risk Management Summit, 2020.

COVID-19 refocused security teams on the value of cloud-delivered security and operational tools that don’t require a LAN connection to function, on reviewing remote access policies and tools, on migration to cloud data centers and SaaS applications, and on securing new digitization efforts to minimize person-to-person interactions.

Gartner has identified nine top trends for 2020 that represent leading organizations’ responses to these longer-term external trends. These top trends highlight strategic shifts in the security ecosystem that aren’t yet widely recognized, but are expected to have broad industry impact and significant potential for disruption.

Trend No. 1: Extended detection and response capabilities emerge to improve accuracy and productivity

Extended detection and response (XDR) solutions are emerging that automatically collect and correlate data from multiple security products to improve threat detection and provide an incident response capability. For example, an attack that caused alerts on email, endpoint and network can be combined into a single incident. The primary goals of an XDR solution are to increase detection accuracy and improve security operations efficiency and productivity.

“Centralization and normalization of data also helps improve detection by combining softer signals from more components to detect events that might otherwise be ignored,” said Firstbrook.

Trend No. 2: Security process automation emerges to eliminate repetitive tasks

The shortage of skilled security practitioners and the availability of automation within security tools have driven the use of more security process automation. This technology automates computer-centric security operations tasks based on predefined rules and templates. 

Automated security tasks can be performed much faster, in a scalable way and with fewer errors. However, there are diminishing returns to building and maintaining automation. SRM leaders must invest in automation projects that help to eliminate repetitive tasks that consume a lot of time, leaving more time to focus on more critical security functions.

Trend No. 3: AI creates new security responsibilities for protecting digital business initiatives

AI, and especially machine learning (ML), continues to automate and augment human decision making across a broad set of use cases in security and digital business. However, these technologies require security expertise to address three key challenges: Protect AI-powered digital business systems, leverage AI with packaged security products to enhance security defense and anticipate nefarious use of AI by attackers.

Trend No. 4: Enterprise-level chief security officers (CSOs) emerge to bring together multiple security-oriented silos

In 2019, incidents, threats and vulnerability disclosures outside of traditional enterprise IT systems increased, pushing leading organizations to rethink security across the cyber and physical worlds. Emerging threats such as ransomware attacks on business processes, potential siegeware attacks on building management systems, GPS spoofing and continuing OT/IoT system vulnerabilities straddle the cyber-physical world. Organizations primarily focused on information-security-centric efforts are not equipped to deal with the effect of security failures on physical safety.

As a result, leading organizations that deploy cyber-physical systems are implementing enterprise-level CSOs to bring together multiple security-oriented silos both for defensive purposes and, in some cases, to be a business enabler. The CSO can aggregate IT security, OT security, physical security, supply chain security, product management security, and health, safety and environmental programs into a centralized organization and governance model.

Trend No. 5: Privacy is becoming a discipline of its own

No longer “just a part of” compliance, legal or auditing, privacy is becoming an increasingly influential, defined discipline of its own, affecting almost all aspects of an organization. 

As a rapidly growing stand-alone discipline, privacy needs to be more integrated throughout the organization. Specifically, the privacy discipline co-directs the corporate strategy, and as such needs to closely align with security, IT/OT/IoT, procurement, HR, legal, governance and more.

Trend No. 6: New “digital trust and safety” teams focus on maintaining the integrity of all interactions where consumer meets the brand

Consumers interact with brands through an increasing variety of touchpoints, from social media to retail. How secure the consumer feels within that touchpoint is a business differentiator. Security for these touchpoints is often managed by discrete groups, with specific business units focusing on areas they run. However, companies are increasingly moving toward cross-functional trust and safety teams to oversee all the interactions, ensuring a standard level of safety across each space where consumers interact with the business.  

Trend No. 7: Network security transforms from the focus on LAN-based appliance models to SASE

Cloud-delivered security services are growing increasingly popular with the evolution of remote office technology. Secure access service edge (SASE) technology allows organizations to better protect mobile workers and cloud applications by routing traffic through a cloud-based security stack, versus backhauling the traffic so it flows through a physical security system in a data center. 

Trend No. 8: A full life cycle approach for protection of the dynamic requirements of cloud-native applications

Many organizations use the same security products on end-user-facing endpoints and server workloads, a practice that often carried over during “lift and shift” cloud migrations. But cloud-native applications require different rules and techniques, leading to the development of cloud workload protection (CWPP). As applications grow increasingly dynamic, the security options need to shift as well. Combining CWPP with the emerging cloud security posture management (CSPM) accounts for all evolution in security needs.

Trend No. 9: Zero-trust network access technology begins to replace VPNs

The COVID pandemic has highlighted many of the problems with traditional VPNs. Emerging zero-trust network access (ZTNA) enables enterprises to control remote access to specific applications. This is a more secure option, as it “hides” applications from the internet — ZTNA only communicates to the ZTNA service provider, and can only be accessed via the ZTNA provider’s cloud service.

This reduces the risk of an attacker piggybacking on the VPN connection to attack other applications. Full-scale ZTNA adoption does require enterprises to have an accurate mapping of which users need access to what applications, which will slow adoption.

This article has been updated from the original, created on June 22, 2020, to reflect new events, conditions and research.

The post Gartner Top 9 Security and Risk Trends for 2020 appeared first on Smarter With Gartner.


Global ID

Partner Spotlight: Introducing Developer First, Instant Card Issuance from Apto

Greg Kidd is the co-founder and CEO of GlobaliD and the founding partner of Hard Yaka, an investment firm focused on fintech ecosystems. His commitment to innovation and democratization in banking and payments has led to his involvement in other projects such as Apto, a Y-Combinator alum focused on developer-first card issuance he co-founded with CEO Meg Nakamura.

In 2014, after several years working for a regulatory and compliance advisory company, I co-founded Apto with my colleague and friend Meg Nakamura. It became apparent from my experience that the financial services landscape was in need of a significant upgrade — the industry was dependent on outdated, legacy infrastructure and entangled in a web of opaque regulatory requirements, making it difficult to build and launch innovative and bespoke card programs specifically designed for their users.

The system was set up to serve only the most highly-resourced, established players in the market.

Since those foundational experiences in finance, we’ve been passionate about creating a payments ecosystem that is more fair and equitable, allowing the most aspirational companies to enter the market quickly and responsibly. To this end, we’ve prioritized working with those who are equally enthusiastic about creating user-first experiences in the financial services sector as we built Apto. Since its Y-Combinator days, when Apto was known as Shift Payments, we have grown to support several of the fastest-growing fintech companies in the U.S., and we expanded our business to Europe in 2019. After building cutting-edge card programs for fintech innovators like Venmo and Coinbase, we’re expanding our purview to make card programs truly accessible for all.

I’m thrilled to announce that in the coming weeks, Apto will be taking an exciting and important step forward in democratizing fintech by launching an instant card issuance portal for developers at companies of any size.

Launching a card program is historically a long, painful process. Our goal in designing this program is to remove barriers to entry and shield you from the technical complexities to make it fast and easy to design and launch card programs in minutes.

Read Greg’s full post over at Apto
Sign up for Apto’s waitlist

You might also like:

Meet the Team — Erik Westra, head of GlobaliD Labs
GlobaliD App: Introducing SEPA and crypto transfers to your Wallet
Why “developer-first” matters

Partner Spotlight: Introducing Developer First, Instant Card Issuance from Apto was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

ForgeTalks: Busting Cloud Myths

Welcome back to another episode of ForgeTalks. My guest this week is ForgeRock VP of Cloud Success, Renee Beckloff. Renee's career has been connected to the cloud for the last 15 years, making her uniquely suited to help bust some pretty entrenched myths. She doesn't hold back in our discussion and shares why there has never been a better time for large enterprise customers to embrace the cloud.

In this episode we'll cover: 

- What are some of the most common myths holding people back from cloud adoption?
- What are the benefits of making a move to the cloud?
- Why you should attend ForgeRock Identity Live: Cloud Edition


I hope you enjoy this great episode. If you want to learn more, Renee offers a deeper dive into cloud myths at ForgeRock Identity Live: Cloud Edition next week. Check out the agenda for this virtual event and register for a time that works for you. And if you want to check out any of our previous episodes of ForgeTalks you can do so here.


Mythics Blog

Oracle Expands Government Cloud with National Security Regions for IC and DoD

Oracle has announced the expansion of the Oracle Government Cloud with National Security Regions for US Intelligence…



KuppingerCole

Fudo PAM by Fudo Security


by Paul Fisher

Fudo Security’s PAM solution is the company’s primary product in the expanding PAM market. In the last few years PAM has evolved into a set of targeted technologies that addresses some of the most urgent areas of business security in a period of rapid technological change. Digital transformation, Cloud, and Hybrid IT environments are creating new demands and innovative PAM solutions are emerging to meet these challenges.


MyKey

Crypto Stablecoin Report 18: The market capitalization of stablecoins increased to $18.53 billion

The market capitalization of stablecoins increased to $18.53 billion, The rise of CBDC

Original link: https://bihu.com/article/1793611236

Original publish time: September 15, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We released MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends to help the participants in the crypto market stay updated on the development status of stablecoin. The MYKEY Crypto Stablecoin Report will be published every week, looking forward to maintaining communication with the industry and exploring the development prospects of stablecoin together.

Quick Preview

- At present, the market capitalization of major stablecoins has increased by $987 million to $18.53 billion.
- Last week, Tether additionally issued 500 million USDT on Tron twice.
- MakerDAO added USDT and PAX as collateral for DAI.
- DC/EP may be the most advanced CBDC project currently.
- Sweden is experiencing the largest and fastest decline in cash use, and more and more stores no longer accept cash.
- The Bank of Canada stated that there is currently no convincing reason to issue a CBDC.
- No central bank is willing to take the risk of using permissionless distributed ledger technology.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins in the past week (September 5, 2020 – September 11, 2020; the same period applies below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market capitalization of major stablecoins has increased by $987 million to $18.53 billion.

Source: MYKEY, Coin Metrics

In the past week, Tether additionally issued 500 million USDT on Tron twice. The circulation of USDC, PAX, BUSD, TUSD, HUSD, and GUSD increased by 263 million, 630,000, 103 million, 120 million, 3.6 million, and 700,000. The circulation of DAI decreased by 1.48 million.

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of main stablecoin holding addresses on Ethereum increased by a combined 27,703.

Source: MYKEY, DeBank

The number of holding addresses of USDT, USDC, TUSD, and DAI increased by 24,525, 2,243, 319, and 916. The number of holding addresses of PAX decreased by 300.
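The 27,703 aggregate cited above is simply the net of these per-coin changes; a quick check (illustrative JavaScript, not part of the original report):

// Net change in Ethereum holding addresses across the listed stablecoins.
const changes = { USDT: 24525, USDC: 2243, TUSD: 319, DAI: 916, PAX: -300 };
const net = Object.values(changes).reduce((sum, n) => sum + n, 0);
console.log(net); // 27703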

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week decreased by an average of 3.33% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins increased by an average of 0.17%.

The Number of 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins last week increased by an average of 34.07% from the previous week.

2. The rise of CBDC

CBDC has received unprecedented attention, but the motives, policies, and technical design of each country are different. On August 24, BIS released a working paper entitled ‘The Rise of CBDC: Driving Factors, Methods, and Technologies’. The article introduces the current status of CBDC research by central banks in various countries, calls on central banks to learn from each other, and gives three representative cases of CBDC.

80% of the central banks surveyed are engaged in the research, experimentation, or development of CBDC

For centuries, a series of new payment technologies have met social needs. Coins, banknotes, checks, and credit cards were all innovations of their era. Now, there is more and more discussion of a new payment technology: CBDC. A CBDC represents a digital liability of the central bank. Wholesale CBDC may become a new settlement tool between financial institutions, while retail CBDC would be a central bank liability that everyone can use. Although the concept of CBDC was proposed decades ago, in the past year the attitude of central banks toward issuing CBDC has changed significantly.

During the Covid-19 epidemic, the public worried that the virus would spread through cash, and government-to-individual payment programs are accelerating the transition to digital payment. Over time, the decline in cash usage in some countries has drawn attention, and central banks in many countries have begun to consider issuing CBDC. As of the end of 2019, central banks representing one-fifth of the world’s population stated that they may issue a CBDC soon. Similarly, the proportion of central banks considering the issuance of retail CBDC within 1–6 years doubled in 2019 to 20%. Among the central banks surveyed, 80% are engaged in the research, experimentation, or development of CBDC.

CBDC is mentioned more and more in speeches and reports. In the 12-week moving average of Google searches, interest in Bitcoin peaked in early 2018 and interest in Libra peaked in 2019, while the number of searches for CBDC is still increasing.

Source: Rise of the central bank digital currencies: drivers, approaches and technologies

DC/EP may be the most advanced CBDC project currently

For many years, central banks around the world have been studying the concept and design of CBDC. Examples include the ‘Dinero electrónico’ of the Central Bank of Ecuador, the ‘Dukaton’ of ABN-AMRO Bank, and the ‘Ubin’ project of the Monetary Authority of Singapore.

Currently, the most advanced CBDC may be the Digital Currency Electronic Payment (DC/EP) project of the People’s Bank of China. DC/EP represents a cash-like liability of the People’s Bank of China, which the public and foreign visitors to China can use through an account-based interface.

The Bank of Canada stated that it is advancing retail CBDC work as a contingency for emergencies such as a sudden decline in cash usage or widespread use of private digital currencies. The Central Bank of the Eastern Caribbean launched the ‘DXCD’ pilot, and the Central Bank of the Bahamas launched the ‘Sand Dollar’ pilot.

Source: central banks’ websites

Commonality in CBDC

The development of global business digitization, the rise of private digital currencies, and concerns that cash may spread the COVID-19 virus have all contributed to the increased interest in CBDC. However, the economic and institutional motivations for issuing CBDC vary from country to country. This section looks across countries, explaining CBDC research and development in terms of economic and institutional drivers, and seeks common ground to explain why some countries and regions are stepping up CBDC research and development. This also helps us understand how they design their CBDC projects.

Research has found that CBDC development is closely related to mobile and Internet usage, innovation capability, and government efficiency. It proceeds faster in places with high per capita GDP and high levels of financial development. Retail CBDC develops faster in areas with strong innovation capability, while wholesale CBDC is positively correlated with financial development, reflecting its role in improving the efficiency of wholesale settlement. Many wholesale CBDC projects focus on the cross-border dimension.

Technical design of CBDC

The ‘CBDC pyramid’ is a method for classifying CBDC designs. It starts from the consumer needs addressed by retail CBDC, determines the technology, and then arrives at the final design plan.

The first thing to choose is the technical architecture. According to the roles played by central banks and private intermediaries in CBDC, the technical architecture of CBDC is divided into four types: direct CBDC, hybrid CBDC, intermediary CBDC, and indirect or synthetic CBDC.

The second layer is the infrastructure. It can be based on a traditional centralized database or on distributed ledger technology (DLT). The difference in technology also leads to differences in the efficiency of the CBDC and the degree of protection against single points of failure. However, among all the central banks researching CBDC, none is willing to take the risk of using the same permissionless distributed ledger technology as Bitcoin.

The third layer is access to the CBDC. Account-based CBDC combined with identity verification can provide a basis for payment activities. But for those without a bank account or those who care about privacy, it may be difficult to gain access; if the barriers to entry are lowered, new illegal activities may follow.

The fourth layer is the cross-border payment of CBDC, which involves the interconnection of the retail and wholesale of CBDC.
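Restating the four layers as a compact data structure (a summary sketch of the taxonomy above, in this report’s own terminology):

// The CBDC design space, layer by layer, as described in this report.
const cbdcDesignSpace = {
  architecture: ['direct CBDC', 'hybrid CBDC', 'intermediary CBDC', 'indirect or synthetic CBDC'],
  infrastructure: ['centralized database', 'distributed ledger technology (DLT)'],
  access: ['account-based', 'value-based'],
  crossBorder: ['domestic use only', 'retail and wholesale interconnection'],
};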

In terms of structure, four central banks in the survey sample considered adopting a direct model, and seven central banks considered a hybrid or intermediary model. Many central banks have not yet determined the structure, but no central bank said they would adopt indirect or synthetic CBDC.

In terms of infrastructure, seven central banks are considering running CBDC on DLT, three are considering using traditional technology, and one is considering a combination of the two.

In terms of access, account-based access is the most common.

Most central banks’ CBDC projects focus on domestic use. Only a few projects, such as the CBDC work of the European Central Bank, France, Spain, and the Dutch central bank, focus on cross-border payments.

Three examples of CBDC design method

Although the CBDC schemes of various countries may differ, central banks can also learn from each other. The following describes three distinctive CBDC schemes, from Asia, Europe, and North America respectively: the DC/EP of the People’s Bank of China, the e-krona of the Riksbank, and the Bank of Canada’s work on CBDC.

The People’s Bank of China: DC/EP project (pilot phase)

Among all current CBDC projects, the DC/EP of the People’s Bank of China is at the most advanced stage of development, and China’s efforts on CBDC can be traced back to 2014. At the end of 2019, China announced a pilot study of its retail CBDC, the Digital Currency and Electronic Payment (DC/EP) project. On April 20, 2020, a spokesperson for the People’s Bank of China confirmed that it is currently piloting the project in several cities including Shenzhen, Suzhou, Chengdu, Xi’an, and Beijing.

China’s CBDC has developed in the context of a highly digital economy and widespread use of digital payment services. In addition to facilitating online transactions, the CBDC will also bring diversity to a payments market currently dominated by Alipay and WeChat, which together control 94% of mobile payments. China’s DC/EP will serve as a supplement to M0; it is not intended to completely replace physical cash.

The structure of DC/EP is a ‘hybrid CBDC’. The CBDC represents a direct claim on the People’s Bank of China, but access and real-time payment services are operated by intermediaries (authorized operators), and the central bank regularly receives and stores copies of retail holdings and transactions.

The People’s Bank of China provides core infrastructure, while commercial banks, other payment service providers, telecommunications, and other intermediaries will provide services to the public. This method prevents risks from being concentrated in the central bank and also prevents duplication and waste of resources.

The People’s Bank of China does not require intermediaries to use specific infrastructure or specific technical routes. Financial intermediaries will be responsible for KYC verification obligations and retail services.

In terms of access, the People’s Bank of China decided to use a hybrid of value-based, semi-account-based, and account-based payment tools. Identity will be based on ‘loosely coupled account links’: users can use DC/EP anonymously in daily transactions, while the central bank can still track the necessary data to implement prudential supervision and combat money laundering and other crimes.

Riksbank: Electronic Krona Project

Sweden is a highly digitalized economy experiencing the largest and fastest decline in cash usage, and more and more stores no longer accept cash. Sweden is therefore more likely to issue a CBDC.

The Riksbank, like other central banks, has studied a variety of CBDC technologies and methods. A proof of concept for the e-krona project is currently underway; here too, the purpose of the CBDC is to complement cash.

The Riksbank’s current proof-of-concept structure is a hybrid CBDC. This design requires the Riksbank to provide a fallback if an intermediary fails, so that users are never left unable to use the e-krona.

The architecture and technical implementation of the e-krona are based on DLT.

In terms of access, the CBDC piloted by the Riksbank is account-based, but it also considers low-value prepaid cards. The Riksbank may also develop a CBDC payment card, which can be used directly for micropayments without accessing the wallet.

Bank of Canada: CBDC Emergency Plan

The Bank of Canada has conducted effective research and policy communication on the topic of digital currency. Despite an early start, the Bank of Canada has not yet announced a retail CBDC pilot or proof of concept. Canada’s CBDC work outlines a comprehensive plan, lists potential architectures, and accumulates relevant technology and knowledge through new projects, some in cooperation with other central banks.

However, the Bank of Canada stated that there is currently no compelling reason to issue a CBDC: Canadians will continue to receive good service from the existing payment ecosystem, provided that it is modernized and remains suitable for current goals. Because the world is changing rapidly, the Bank of Canada will prepare for scenarios in which issuing a CBDC becomes necessary, so that it can continue to provide Canadians with a reliable payment method.

The scenarios the Bank of Canada considered are a decline or elimination of the use of physical cash, and significant progress in the use of private cryptocurrencies or stablecoins as a means of payment.

Architecturally, Canada’s CBDC would not adopt an indirect/synthetic model; the Bank is considering direct, hybrid, or intermediary CBDC.

In terms of infrastructure, the Bank of Canada has a lot of DLT-based proof-of-concept experience. DLT can be used as an infrastructure solution, but it is not necessary.

In terms of access, the Bank of Canada is considering solutions based on accounts and coupons. Anonymous coupons would be allowed for micropayments, while larger usage would require account-based access.

Sum up

CBDC is a new type of payment technology that may soon be launched in many countries around the world. Research shows that CBDC is developing faster in countries with higher mobile phone usage and stronger innovation capabilities. Designs will vary according to each country’s economic situation and priorities, but they share some key characteristics: all of the CBDC designs investigated are complements to cash, none adopts an indirect model, and no central bank is willing to take the risk of using the same permissionless distributed ledger technology as Bitcoin.

Tips

To better communicate with industry insiders, we decided to add two sections: reader questions and guest opinions. If you have questions about stablecoins, please contact us, and we will pick meaningful questions to answer in the next issue. We also welcome guests from the industry to share their views on stablecoins. Contact: jianghb@mykey.org.

This is what we’re sharing in this MYKEY Crypto Stablecoin Report. Stay tuned for follow-up reports, in which we will provide more interpretation of the development status of stablecoins and analysis of their trends.

PS: MYKEY Lab retains the final right to interpret the content of this article; please indicate the source when quoting. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

Crypto Stablecoin Report 13: The market capitalization of stablecoins reached $14.387 billion, Stablecoin pool Reserve

Crypto Stablecoin Report 14: The increase of Ethereum Gas Fee makes the transfers of stablecoin transactions on the blockchain

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

Crypto Stablecoin Report 16: The connection between stablecoins and real assets

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 billion, Decentralized payment protocol Celo

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

Crypto Stablecoin Report 18: The market capitalization of stablecoins increased to $18.53 was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Keep Me Safe, Make Me Happy Pt 3


In Part 1 of this series, we started our journey of understanding why customers want both security and great customer experiences by examining how customer expectations have changed in the past few years. Customers want more than just great digital experiences: they also expect companies to protect their privacy and security and insist on true security for their digital identities.


Today we’ll conclude this series by explaining how frictionless and enjoyable customer experiences can only be achieved by addressing security first.



MATTR

Introducing the MATTR Platform


Here at MATTR, we have been hard at work building a suite of products to serve the next generation of digital trust. We’ve designed our products based on a few key principles: extensible data formats, secure authentication protocols, a rigorous semantic data model, industry-standard cryptography, and the use of drivers and extensions to allow modular and configurable use of the platform over time. By combining our core capabilities with extensions and drivers, our platform offers developers convenience without compromising flexibility or choice.

The MATTR Platform delivers digital trust in a scalable manner. Our belief is that a modular security architecture is one which can work across many different contexts. When it comes to trust, context is everything, and we know our users each have their own unique requirements and expectations when it comes to their digital interactions.

We provide flexible and configurable building blocks for trust on the web in order to create a digital ecosystem that can support global scale.

The platform consists of 3 main components:

- Platform Core
- Platform Extensions
- Platform Drivers

Our platform provides the capabilities needed for digital trust through a set of modular and flexible building blocks known as our Platform Core. This includes the ability to establish and use DIDs, sign and encrypt messages, manage the verifiable credentials lifecycle, and share privacy-preserving verifiable presentations. Platform Core is designed as a set of simple APIs that are available for all of our users, with operational tools and documentation.
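As a rough sketch of what such an API call could look like in practice (the tenancy URL, endpoint path, and request body here are hypothetical placeholders, not MATTR’s documented API), establishing a DID might be a single authenticated HTTP request:

// Hypothetical sketch only: the URL and endpoint are illustrative
// stand-ins, not MATTR's documented API.
const TENANT_URL = 'https://your-tenant.example.com';

async function createDid(accessToken) {
  // Ask Platform Core to establish a new DID, one of its core capabilities.
  const res = await fetch(`${TENANT_URL}/core/v1/dids`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ method: 'key' }),
  });
  return res.json();
}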

We’ve designed the platform to have cryptographic agility and flexibility built in at a fundamental level. Platform Drivers are pre-configured integrations that allow our capabilities to be pluggable and extensible over time, preventing vendor lock-in and enabling user choice. They identify key areas where flexibility, choice, and optionality are desirable and surface them to the user to promote more resilient security architectures for the future. They are typically surfaced to the user as pluggable parameters in our Platform Core.

Extensibility is a key component of our platform architecture. Platform Extensions are higher level capabilities that plug in to our platform, providing convenient and easy-to-access application logic, such as service orchestration and workflow. They are built on top of our Platform Core, allowing users to smoothly onboard and extend our platform as well as enabling MATTR’s digital trust infrastructure to integrate with digital services and protocols that exist outside of our products. They are modular components in terms of logic and configuration, operating independently of Platform Core as an extensible set of APIs.

Finally, we offer a growing number of Developer Tools to simplify the user experience by providing additional interfaces and ways to interact with our platform. These tools are free and mostly optional to use, though they do simplify setting up the infrastructure needed to get started experimenting with the platform. Some tools, like some of the features exposed by MATTR’s Mobile Wallet, may be required to use certain features of the platform. Our Developer Tools are designed to work natively with Platform Core as well as our Platform Extensions.

Over the past 6 months, we have been working in close collaboration with a number of preview customers to create a great developer experience and identify features that are important for a wide variety of use cases. We’ve been working with partners from industry and government to make sure we’ve built a solution for the problems that matter to you.

Check out MATTR Learn to find out more about our platform, view our API documentation, and follow our tutorials to start using the platform today.

The article Introducing the MATTR Platform appeared first on MATTR.

Wednesday, 16. September 2020

One World Identity

Socure: Fighting the Uptick in Identity Fraud


KuppingerCole

Protecting Access to Sensitive Data – with Data Access Governance and Identity Governance


To meet the aforementioned regulations in time, you need to identify and classify sensitive data, regardless of where it resides. Before a cloud migration, you must understand the criticality of your data and define which information can be moved to the cloud, which cannot, and how such information must be protected. Security can be increased, among other things, by proactively monitoring unauthorized and potentially malicious access. The results of your work should be presented in an integrated way in an identity management (IDM) system.

In this webinar you will learn:

- How to find out who has direct and indirect access to business-critical applications and data
- How this information gets into the IDM system and how you can use it
- Which methods you can use to track down sensitive and unstructured data in your company
- Which information from the IDM system helps you do this
- How to comply with legal data security requirements at the same time

KuppingerCole Principal Analyst Martin Kuppinger will explain why Data Access Governance and Identity Governance are excellent tools for protecting sensitive data, and why they are also well suited to implementing compliance requirements.

In the second part, Klaus Hild, Senior System Sales Engineer at SailPoint, will demonstrate in a live demo how sensitive data…




Forgerock Blog

IAM 101 Series: What Is Identity Governance and Administration?

What is Identity Governance and Administration (IGA)? 
Identity governance and administration (IGA) enables admins, security teams, and internal auditors to manage and reduce the risk that comes with excessive or unnecessary user access to applications, systems, and data.

As the digital world continues to evolve, IGA is now mission-critical to secure every organization. Yet few know what it is. With new data privacy and security regulations constantly emerging, organizations must now balance risk and customer experience while achieving regulatory compliance. Having the right identity governance and administration solution in place can play a crucial role in achieving this balance, keeping workforces productive, and enterprises secure. To fully understand what IGA is and why it’s become such a priority, we must look at how the need for it emerged in the first place.

The Early Years: User Provisioning and Mounting Regulations

To understand IGA, it’s important to understand what provisioning is and how user data was initially stored. User provisioning is the process that ensures that user accounts are created with the proper permissions. IT administrators use provisioning to monitor and control access to systems and applications. 

In the early years of the digital age (1980s to early 90s), user provisioning was rather straightforward, as it focused solely on users (employees) within an organization. Access for users outside an organization, like customers or citizens, was not common. Additionally, there weren’t as many systems within an organization to manage access to, making the provisioning process relatively manageable.

During this time, servers housed user accounts and identity data centrally on on-premises systems within the enterprise. However, in the mid-to-late 1990s, as the .com market rapidly took off and external user access to systems and applications became ubiquitous, global organizations collected more sensitive user data such as name, address, social security number, country code, email address, and bank account number. The need to protect this personally identifiable information (PII), and the systems and applications that hosted it, quickly became critical. To address these requirements, new regulations were enacted that mandated stricter security protocols for user access permissions and required improved controls and policies to prove to auditors that the protocols had been implemented.

The Rise of Identity Governance Regulations

Introduced in 1996, the Health Insurance Portability and Accountability Act (HIPAA) was created to provide stronger data privacy and security provisions for safeguarding medical information. As physicians later moved to digitized health records, the HIPAA Security Rule was issued as a best practice for securing sensitive digital information and establishing national standards to protect individuals’ electronic personal health information. This rule required appropriate administrative, physical, and technical safeguards to ensure the security of patient data. 

In 2002, the Sarbanes-Oxley Act (SOX) was introduced to build stronger trust and security around the financials of publicly traded companies. SOX imposed even more regulatory protocols regarding electronic records. It mandated the joint responsibility of auditors and management for the detection of fraud and external threats, requiring stringent record keeping, audits, and controls. Noncompliance with SOX can cost organizations up to $25 million in fines, criminal and civil prosecution, and prison sentences of up to 20 years for those found in breach of the mandate.

In 2006, the PCI Council (formed by American Express, Discover Financial Services, JCB International, MasterCard and Visa) created a body of security standards known as the Payment Card Industry Data Security Standard (PCI DSS). Every merchant that accepts credit card payments must be in compliance with PCI DSS. PCI DSS includes requirements for security management, policies, procedures, and other critical protective measures. Failure to comply with PCI mandates leaves businesses vulnerable to the negative impacts of data breaches, such as fines, fees, and lost business.

With these new regulations and stricter protocols, organizations began to feel the strain of ensuring and proving compliance. This pressure only intensified in the mid-2000s as the market saw a massive increase in enterprise user demand for access to cloud-based applications and systems. As a result, this created a larger provisioning problem. Existing user provisioning solutions only supported internal user (employee) populations. They were not equipped to handle the growing numbers of users, accounts, systems and applications while trying to continue to meet regulatory compliance requirements. The need for a solution that supported user provisioning and management for internal and external systems and applications thus emerged. 

Turning to Identity Management as a Possible Fix

As traditional provisioning solutions struggled to keep up with increasing identity demands and regulations, many organizations turned to identity management (IDM) solutions to address these challenges. The digital landscape was evolving at a rapid pace as cloud and software-as-a-service (SaaS) applications and solutions swept through the enterprise. The transition to these technologies meant that internal user identities were now being used to access new external cloud-based applications and systems outside the enterprise network. The result was a tangled web of access to internal and external systems; a disorganized mass of accounts for workforce, consumers, and partners; and varying levels of access across multiple environments.

Because of these new and ever-growing challenges, identity management solutions were unable to meet compliance regulations to ensure user access was reviewed, allowed, and/or revoked periodically. As a result, organizations would manually create and review user access certifications via spreadsheets distributed by email to business line managers annually or biannually for review and approval. Yet, with the exploding number of internal and external user identities, systems, and cloud applications, this process was no longer a scalable or viable option. With pressure mounting on organizations to achieve regulatory compliance, a new approach was needed.

The Emergence of Identity Governance and Administration

In search of that new approach, the existing user provisioning market morphed into identity management. In parallel, identity governance emerged in response to the growing number of compliance regulations. Over time, the identity management and identity governance markets merged into one market: identity governance and administration (IGA). IGA solutions address the needs of regulatory compliance through identity governance, and user provisioning requirements through administration. In addition, identity governance and administration addresses user access privileges for both on-premises and cloud-based systems and applications, bridging the gap where previous solutions fell short.

Today, identity governance and administration helps organizations address common business challenges throughout their network and users. Benefits include better access compliance through certifying the appropriate level of users’ access and enhanced business productivity by providing this access to the right resources at the right time. IGA also benefits security and risk management by allowing organizations to govern user access with policy-based controls and minimizing operational inefficiencies by streamlining business processes.

In addition to helping overcome business challenges, identity governance and administration supports a number of underlying use cases. These include: access requests (users requesting access to systems and applications), access approvals (managers approving user requests), access reviews (managers confirming user approvals or revoking user access), and role optimization (reviewing and updating role definitions).
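As a rough illustration of these use cases (a hypothetical sketch; the names and states are illustrative, not ForgeRock’s API), an access request moving through review might look like:

// Hypothetical sketch of an IGA access-request lifecycle.
const accessRequest = {
  user: 'jdoe',
  resource: 'payroll-app',
  state: 'requested', // a manager moves it to 'approved' or 'revoked'
};

function review(request, decision) {
  // An access review either confirms (approves) or revokes access.
  return { ...request, state: decision === 'approve' ? 'approved' : 'revoked' };
}

console.log(review(accessRequest, 'approve').state); // 'approved'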


ForgeRock Identity Governance and Administration

The ForgeRock Identity Governance and Administration solution is an integral part of ForgeRock’s comprehensive identity platform. It allows you to establish policies for user access rights and continuously monitor their proper implementation from a centralized location. Through a periodic access review process — tied to a powerful workflow engine to ensure closed-loop remediation and built-in risk management and reporting — you can strengthen your security posture and automatically drive regulatory compliance.

Learn more about identity governance and administration and ForgeRock IGA by watching the webinar The Evolution and Modernization of Identity Governance or contact us today.



Nyheder fra WAYF

RMC new user organisation in WAYF


Today the Rhythmic Music Conservatory (Rytmisk Musikkonservatorium, RMC) joined WAYF as a user organisation. Its students and staff can now identify themselves as RMC users to the many web services in WAYF and eduGAIN that are relevant to research and education.


IDnow

IDnow welcomes agreement on a transition period for online gambling law in Germany


Munich / September 16th, 2020 – IDnow, a leading identity verification specialist, welcomes the agreement by Germany’s 16 states on a transition period incorporating the new and upcoming legal framework by October 15th. This transition clarifies existing and new mandates including, importantly, age verification requirements for online gambling. The new regime officially enters into effect in July of next year.

IDnow, whose mission is to make the connected world a safer place, is pleased about this decision among the German states. The company looks forward to supporting operators in providing greater safety for users. For IDnow, this ensures better protection for customers under German law as well as clearer rules in regard to responsible gambling.

“This is a big step for Germany’s online gambling industry, and again, it shows the importance of eKYC methods and their need to evolve. In a world that becomes more and more digital by the day, we need to stay vigilant and constantly adapt our security requirements,” says Rayissa Armata, Head of Regulatory Affairs at IDnow. “IDnow strives to contribute to responsible corporate citizenship, ensuring that the social responsibility for this industry can be achieved effectively through innovative methods. This is the strong desire of the federal and state governments within Germany, and IDnow has been – and will continue to be – an active supporter of those efforts,” she adds.

“We have developed our products together with our clients – the biggest players in the market – to meet their needs precisely. IDnow offers AML-compliant video verification, but also an approved automated verification solution. Gambling operators can choose, depending on their security needs, which of those solutions they want to use. All of them fulfill mandatory age verification requirements,” says Oliver Obitayo, CSO at IDnow. “To us it is essential that our clients can offer safe service platforms to customers so that underage use can be prevented,” he adds.

After a period of uncertainty, each of Germany’s 16 states has agreed to a transition period for online gambling in Germany before the official start date of July 1, 2021. This transition period is marked by a shared agreement not to penalize gambling companies that conform to the new law that will officially go into effect next year.

In January 2020, the German states agreed on an amendment to the State Treaty on Gambling (Glücksspielstaatsvertrag) that also comes into effect July 1, 2021. This date was partly put into jeopardy, as was the process to issue sports betting licenses, in the first half of this year when matters came to a halt due to a court injunction sought by an Austrian gambling operator.

In a welcome development last week, Germany’s 16 states agreed to a transition period for gambling activities. This will allow operators to offer casino-style gaming and poker as long as such activities are fully compliant with the draft of the new Glücksspielstaatsvertrag. Online gambling operators must meet all licensing requirements by an October 15 deadline. This will include an Age Verification (AV) solution for online operators.

Under the new and upcoming regulatory regime, the following changes will be implemented:

- It will be possible for German players to play online casino games and online poker under strict regulations
- A wagering limit of €1 will be placed per spin for slot games
- The number of licenses for table games will be limited to the number of physical locations for casinos in each state
- A €1,000 deposit limit will now be implemented across all online gambling verticals in Germany

With the recent acquisition of Wirecard Communication Services into the IDnow Group, the Munich-based company has created additional capacity and flexibility in order to adequately support its customers in every situation.


IDnow announces Bettina Pauck as new COO


Identity Verification Provider from Munich welcomes Bettina Pauck as Manager of the Operations Division.


Munich – September 4th, 2020, IDnow, a leading provider of Identity Verification-as-a-Service solutions, is welcoming Bettina Pauck as Chief Operations Officer to its management team. She will head the Operations division at the Munich site as well as the Leipzig site, which will be part of the IDnow Group as of September. IDnow took this new subsidiary over as part of the acquisition of Wirecard Communication Services.

Following the successful takeover of Wirecard Communication Services GmbH at the beginning of this week, IDnow announces the appointment of Bettina Pauck as Chief Operations Officer to the management team of the Munich-based identity verification provider. Within the scope of the acquisition, she has already assisted in the valuation of the company. Her main task in the coming months will be to integrate the former Wirecard Communication Services GmbH – now IDnow Services GmbH – into the existing processes and to optimally position the business unit for the foreseeable growth.

For the last 12 years, Bettina Pauck has worked through her own company as a consultant for companies like N26, reBuy, and Axel Springer, optimizing their customer operations. She has been working in customer service since 2004 and now brings her many years of experience in strategic, tactical, and operational customer management, as well as in the design and management of service structures, to IDnow.

“Identity verification is at the heart of many industries, but most of all it ensures the security and sense of security of many customers. Since the customers are the focus of all my activities, I am particularly pleased to be able to make a real difference for the customers – and for IDnow – in this central function,” says Bettina Pauck. “IDnow is a company with a strong vision and my goal is to take the Operations division to a new level together with the outstanding team,” she adds.

“I am pleased that we were able to win Bettina Pauck as COO for IDnow. She will play a central role in the scaling of our operations area, also and especially in the context of the recent strong increase in demand. With the acquisition of Wirecard Communication Services, we have created additional capacity and infrastructure in order to further improve our range and service quality,” says Andreas Bodczek, CEO of IDnow.


KuppingerCole

Oct 29, 2020: What’s Really Going on in Your Microsoft Active Directory and Azure AD Infrastructure

Most small and mid-sized businesses rely on Microsoft technology in their IT infrastructure. For the vast majority of larger organizations, solutions such as Microsoft Active Directory also form a vital part of their IT infrastructure. Understanding what is going on in these infrastructures is thus essential; only then can organizations react quickly and in a focused manner.

Ontology

Ontology Weekly Report (September 8–15)


This week we have exciting news: Wing, the credit-based cross-chain DeFi platform built on the Ontology blockchain, has opened its genesis pool, and ONT and other digital assets can now be deposited in its Flash Pool. Generous incentives in WING tokens will soon be released for mining in the genesis pool. Users can participate in Flash Pool with either a Cyano Wallet or ONTO Wallet; the pool’s collateral rate is significantly lower than that of similar platforms.

Back-end

- Completed 50% of Ontology GraphQL interface development

- The Rust Wasm contract development hub released ontio-std v0.3

Product Development

ONTO

- ONTO v3.3.0 released

- Added support for logging into Wing from ONTO and depositing ONT on Wing; ONT deposited via ONTO accounts for 55% of the total amount

- ONTO new users and daily active users increased by over 500% from last month

dApp

- 80 dApps now live on Ontology

- 6,087,908 dApp-related transactions since genesis block

- 22,221 dApp-related transactions in the past week

Bounty Program

- 1 new application for the Technical Documentation Translation

Community Growth

- We onboarded 230 new members across Ontology’s Spanish, Korean, and German communities.

Newly Released

- Starting from September 8, ONT can be deposited in Wing Flash Pool. Flash Pool will start releasing WING tokens as incentives for mining in the genesis pool. Users can participate in Flash Pool by using Cyano Wallet or ONTO Wallet.

- By September 11, ONT worth USD 15,000,000 had been deposited in Wing, the first credit-based, cross-chain DeFi platform based on the Ontology blockchain. Wing also offers a collateral rate significantly lower than similar platforms. During the first phase (September 15–19) of the Wing Mining Celebration, 10x incentives will be released exclusively for depositing ONT.

Global Events

On September 9, Nick ZHANG, initiator of Wing, appeared at an online AMA co-organized by RealSatoshi, Winkrypto, and Wing, during which he briefed blockchain enthusiasts and Wing users on the project’s vision, mechanisms, and core values. Nick said that within DeFi, Wing’s defining value lies in its credit lending function, enabled by integrating OScore, an on-chain credit evaluation system. He also noted that Wing’s infrastructure demands are met by the credit-based identity and data framework provided exclusively by Ontology. He added, “The power of blockchain lies in consensus. It could be found in something as trivial as the record of a single transaction, or as big as a decentralized autonomous organization (DAO). It’s what enables us to envision a future where blockchain technology facilitates the process of globalization, and is on the way to breaking the barriers between races, national borders, assets and social classes. The debated catchphrases in the blockchain arena, such as decentralization, immutability and equitability, all check the boxes of derivative values born out of consensus built on blockchain.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 8–15) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Smarter with Gartner - IT

5 Strategic Cost Optimization Action Items for Security Leaders


Eighty-two percent of security and risk leaders do not adjust their budgets based on environmental or business impact, which means they operate in a silo and are not aligned with the business.

Without that solid alignment, when a disruption happens, it can be difficult to ensure security is supporting what is important to the organization.

“With any disruption, with any crisis, you’re always going to have uncertainty, and uncertainty has a negative impact on your budget,” said Sam Olyaei, Director Analyst, during the virtual Gartner Security & Risk Management Summit, 2020. “Ultimately there has to be some sort of trigger or change within the budget planning process to navigate past the challenges.”

Gartner suggests five action items to guide security and risk leaders through cost optimization to enable a balanced and valuable outcome.


Action item 1: Identify crisis phase and what actions you will take

Leaders must know which phase of the crisis the organization is currently in to respond accordingly. The first is the respond phase, which focuses on keeping the lights on, maintaining essential services, and making sure cost optimization is in place. In this phase, an organization might make policies more flexible or focus on tech that provides immediate value.

Read more: Gartner Top 10 Security Projects for 2020-2021

The next phase is recovery. This is where the strategic cost optimization begins now that leaders can look beyond the day to day. In the recovery phase, optimize for value and manage risk in comparison to cost. Effective security and risk leaders will use this phase to demonstrate value in the business by stretching staff skills and accelerating automation.

The final stage is the renew phase. Discussions will move past cost cuts to drive innovation and exploit opportunities to create value. This is the phase to scale digital with agile practices and prepare for a new normal. Arguably, this is where most leaders differentiate themselves from the rest of the pack.

Action item 2: Equip yourself with data for decision making

Whether it’s gathered from business reports, benchmarking, current state assessments, or asset inventory, data is critical. Ideally, you’ll want to use a combination of all the sources to get the best picture of the organization. Data can be used to move away from making decisions based on legacy or emotions, and toward efficiency and metrics. It will enable you to show how security and risk makes decisions and why you’ve reached a particular conclusion.

Action item 3: Build adaptable and realistic budget scenarios

Although scenario planning isn’t usually a large part of the security and risk business unit, it’s important to plan, test and design budgets for scenarios that you might face in the near or short term.

Read more: Gartner Keynote: Balance Risk, Trust and Opportunity in an Uncertain World

For example, how would security handle an emergency budget cut, like being asked to cut spending by half through 2Q? What about selective cost reduction of 10% of the budget every quarter? What if security was asked to preserve costs and emphasize growth by maintaining the current budget but focusing aggressively on delivering business value?
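To make the second scenario concrete (an illustrative calculation, not from Gartner), a 10% cut applied every quarter compounds to a cut of roughly a third of the budget by year end:

// Illustrative only: compounding effect of a 10% cut applied each quarter.
const startBudget = 1000000;
for (let q = 1; q <= 4; q++) {
  console.log(`Q${q}: ${Math.round(startBudget * 0.9 ** q)}`);
}
// Q1: 900000, Q2: 810000, Q3: 729000, Q4: 656100 (about 34% lower by year end)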

Action 4: Align based on business unit value, risk and cost

Consider how to allocate resources to particular business units based on the potential value of each unit as it relates to the business. Look for indicators like revenue, business value or number of employees to help balance risk, value and cost.

For example, at an entertainment company, hospitality has high business value, but a bad risk posture. For security and risk, this represents a true opportunity to increase investment in ensuring better security posture for a business unit that is vital to the overall business.

Action 5: Take a portfolio view of cost optimization

Security and risk leaders need a holistic view of cost optimization. It boils down to two pieces: supply and demand. On the supply side, the two main areas within security are contract management and cost savings; here there are opportunities to negotiate strategically and deliver technology value efficiently.

On the demand side, focus on joint business as well as security and business optimization. This includes actions such as accelerating business outcomes by looking a little further out and enabling new digital business models.


The post 5 Strategic Cost Optimization Action Items for Security Leaders appeared first on Smarter With Gartner.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Build a React App with ANT Design Principles


For years the go-to HTML/CSS framework of choice for developers was Bootstrap. A new contender has appeared in the form of Ant Design. Ant should feel familiar to veteran developers but it’s built on new principles. Their site spends a good amount of effort distinguishing between good and bad design. There is an emphasis on clarity and meaning. Ant Design is heavily based on psychological principles to anticipate—and be customized for—user behavior.

Ant Design is built for React. In this tutorial, you will build a small React app that displays transactions to the user based on the Ant Design principles. You will use Okta to secure your web application. Okta is very easy to set up and using the libraries for integrating into React is simple. You will learn how to secure your app with Okta and how to put certain pages under authentication.

What is Ant Design?

Ant Design principles were developed for Alibaba, one of the largest companies in the world, whose B2B e-commerce site is the backbone of a multinational Chinese corporation. Ant Design of React is a set of React components using the Ant Design principles.

It’s virtually impossible to cover all of Ant’s concepts in one article, but I’ll attempt to give you a high-level look. Ant aims to be natural, clear, and concise in its presentation of material. It relies on natural user cognition and behavior to dictate where elements are most likely to be seen and how to enable users to quickly identify what type of information they are being shown. Ant is designed to allow for growth in your website as your company and/or line of business grows with it. Ant focuses on developer certainty: rather than leaving you to guess which component to use, it guides you to the right component for a given task. Ant is also designed around the concept of helping users achieve their mission; to do this, it focuses on providing clear feedback to users that they are on the right path.

Ant provides many common components you will need to develop any website. This includes the basics like tables and forms, down to alerts, calendars, and tabs. You can also upgrade to a professional license and get access to graphs, charts, dashboards, and much more. Ant also provides several templates along with an online editor to rapidly prototype your site’s design.
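To give a feel for the library (a minimal standalone sketch, separate from the app built below; it assumes a Create React App project with antd installed), rendering Ant components takes little more than an import:

// Minimal standalone example of two Ant components.
import React from 'react';
import ReactDOM from 'react-dom';
import 'antd/dist/antd.css'; // antd's bundled stylesheet
import { Button, Alert } from 'antd';

ReactDOM.render(
  <div style={{ padding: 24 }}>
    <Alert message="Welcome back" type="success" />
    <Button type="primary">Get started</Button>
  </div>,
  document.getElementById('root')
);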

Set Up Your Okta Application

Okta aims to make its SSO service as simple as possible for developers, taking on most of the middleware logic with its suite of packages for React. Before you can build your application you will need to set up an application in Okta’s developer console. Navigate to your developer console and log in. Click on Applications and click on Add Application. On the next page select Single-Page App and click Next.

In the next section, you will configure your application settings. Give your application a meaningful name. I named my application Transactions, but you can name yours whatever you like. Next, change the references to localhost:8080 to localhost:3000, as that is the default development server port for React. Click Done and take note of your Client ID on the next page. You will need this in your application.
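For reference, if you keep create-react-app’s default port, the single-page app settings should end up looking roughly like this (the exact fields vary with the console version; the callback path matches the /login/callback route the app defines later):

Base URI:             http://localhost:3000
Login redirect URI:   http://localhost:3000/login/callback
Logout redirect URI:  http://localhost:3000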

Create Your React Application

As this application will be in React, you can use the create-react-app toolchain to quickly scaffold your app. To do this, open the folder where your application will live and run:

npx create-react-app transactions

It takes a moment but after it’s complete you can use cd transactions to switch to your new React app.

You’ll need to get a couple of packages from npm. First is the Ant Design of React package that contains the components based on Ant Design.

npm i antd@4.3.3

Next, you will need to get Okta’s React library to help manage your authentication.

npm i @okta/okta-react@3.0.2

You’ll also need the React Router package for the web from npm.

npm i react-router-dom@5.2.0

Finally, you will want to use dotenv to store your sensitive values in the .env file. This file can be added to your .gitignore.

npm i dotenv@8.2.0

After dotenv is installed, add a new file called .env to your root directory with the following contents. REACT_APP_OKTA_URL_BASE is the same Okta domain you use to log in to the developer console. REACT_APP_OKTA_CLIENTID is the Client ID you obtained after creating your application in Okta’s developer console.

REACT_APP_OKTA_CLIENTID={yourClientId}
REACT_APP_OKTA_URL_BASE={yourOktaDomain}
REACT_APP_OKTA_APP_BASE_URL=http://localhost:3000

Add Your React Components

The basic setup is done and it’s time to start putting Ant Design of React to work. Add a new folder in your src directory called Components. In this folder add a new file called LoginForm.jsx. Add the following code.

import React, { useState } from 'react';
import OktaAuth from '@okta/okta-auth-js';
import { useOktaAuth } from '@okta/okta-react';
import { Form, Input, Button, Alert, Row, Col } from 'antd';

const LoginForm = ({ baseUrl, issuer }) => {
  const { authService } = useOktaAuth();
  const [sessionToken, setSessionToken] = useState();
  const [username, setUsername] = useState();
  const [password, setPassword] = useState();
  const [error, setError] = useState();

  const handleSubmit = (e) => {
    e.preventDefault();
    const oktaAuth = new OktaAuth({ url: baseUrl, issuer: issuer });
    oktaAuth.signIn({ username, password })
      .then(res => setSessionToken(res.sessionToken))
      .catch(err => setError(err));
  };

  const layout = {
    labelCol: { span: 8 },
    wrapperCol: { span: 8 },
  };

  const tailLayout = {
    wrapperCol: { offset: 8, span: 16 },
  };

  const handleUsernameChange = (e) => {
    setUsername(e.target.value);
  };

  const handlePasswordChange = (e) => {
    setPassword(e.target.value);
  };

  if (sessionToken) {
    authService.redirect({ sessionToken });
    return null;
  }

  const errorAlert = error ?
    <Row>
      <Col span="8"></Col>
      <Col span="8">
        <Alert message="Authentication Failed" type="warning"></Alert>
      </Col>
    </Row> : '';

  return (
    <Form {...layout} onSubmit={handleSubmit}>
      <Row>
        <Col span="8"></Col>
        <Col span="8"><p>Please Login with your Okta Account</p></Col>
      </Row>
      <Form.Item
        label="Username"
        name="username"
        value={username}
        onChange={handleUsernameChange}
        rules={[{ required: true, message: 'Please input your username!' }]}
      >
        <Input />
      </Form.Item>
      <Form.Item
        label="Password"
        name="password"
        value={password}
        onChange={handlePasswordChange}
        rules={[{ required: true, message: 'Please input your password!' }]}
      >
        <Input.Password />
      </Form.Item>
      <Form.Item {...tailLayout}>
        <Button type="primary" htmlType="submit" onClick={handleSubmit}>
          Login
        </Button>
      </Form.Item>
      { errorAlert }
    </Form>
  );
};

export default LoginForm;

This component is a basic login form with username and password fields, a button, and some validation. You can see that your components are imported from the Ant library. Rows and Cols are fairly self-explanatory; one note is that Ant uses a 24-cell grid rather than the 12-cell grid you may be used to from Bootstrap. The form component here has some layout options and an override for the submit handler: you hook into form submission to authenticate through Okta’s React package. Ant also provides Form.Item components, which can carry their own validation rules right on the component. Finally, you have an Alert component that will flash a simple validation message if authentication fails.
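If the 24-cell grid is new to you, here is a minimal, standalone sketch (not part of the app) showing the idea: the span values inside a Row should total 24.

import React from 'react';
import { Row, Col } from 'antd';

// Three equal columns: 8 + 8 + 8 = 24 cells.
const GridSketch = () => (
  <Row>
    <Col span={8}>left</Col>
    <Col span={8}>middle</Col>
    <Col span={8}>right</Col>
  </Row>
);

export default GridSketch;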

Next, you can add a new file in Components called SiteHeader.jsx. The code follows.

import React from 'react';
import { useOktaAuth } from '@okta/okta-react';
import { Layout, Menu } from 'antd';

const { Header } = Layout;

const SiteHeader = (props) => {
  const { authState, authService } = useOktaAuth();

  if (authState.isAuthenticated) {
    return (
      <Header>
        <div className="logo"/>
        <Menu theme="dark" mode="horizontal" defaultSelectedKeys={[props.selectedKey]}>
          <Menu.Item key="dashboard">Dashboard</Menu.Item>
          <Menu.Item key="logout" onClick={() => { authService.logout() }}>Logout</Menu.Item>
        </Menu>
      </Header>
    );
  } else {
    return (
      <Header>
        <div className="logo"/>
        <Menu theme="dark" mode="horizontal" defaultSelectedKeys={[props.selectedKey]}>
          <Menu.Item key="home"><a href="/">Home</a></Menu.Item>
          <Menu.Item key="login"><a href="/Login">Login</a></Menu.Item>
        </Menu>
      </Header>
    );
  }
};

export default SiteHeader;

To avoid a naming conflict with the Header component from antd, this component is called SiteHeader. This component will be placed at the top of each of your pages. The component uses Okta to check the authState and chooses which menu to show. Here is your first exposure to the Menu component provided by Ant. You are passing a selectedKey into the props of this component to set the defaultSelectedKeys. This property will highlight the selected menu item to make it clear to the user what page they are on.

Finally, add a new component called SiteFooter.jsx. Add the following code to it:

import React from 'react';
import { Layout } from 'antd';

const { Footer } = Layout;

const SiteFooter = (props) => {
  return (
    <Footer style={{textAlign: 'center'}}>
      Ant Design ©2020 Created with Ant Design of React using Okta by <a target="_blank" href="https://profile.fishbowlllc.com">Nik Fisher</a>
    </Footer>
  );
}

export default SiteFooter;

Create Your Pages

Now you can use these components to help build your pages. Create a new folder in your src directory called Pages. Add a new file for Home.jsx first.

import React from 'react';
import { Redirect } from 'react-router-dom';
import { useOktaAuth } from '@okta/okta-react';
import SiteHeader from '../Components/SiteHeader'
import SiteFooter from '../Components/SiteFooter';
import { Layout, Row, Col, Card } from 'antd';
import { SearchOutlined } from '@ant-design/icons';

const Home = () => {
  const { authState } = useOktaAuth();
  const { Content } = Layout;
  const { Meta } = Card;

  return (authState.isAuthenticated ?
    <Redirect to={{ pathname: '/Dashboard' }} /> :
    <Layout>
      <SiteHeader selectedKey='home'></SiteHeader>
      <Content>
        <Row style={{ padding: 20 }}>
          <Col span="4"></Col>
          <Col span="4"></Col>
        </Row>
        <Row style={{ padding: 20 }}>
          <Col span="4"></Col>
          <Col style={{ padding: 10 }} span="4">
            <Card
              cover={<img alt="Okta" src="https://www.okta.com/sites/all/themes/Okta/images/blog/Logos/Okta_Logo_BrightBlue_Medium.png" height="100px" />}
              actions={[<a href="https://www.okta.com" target="_blank"><SearchOutlined key="ellipsis" /></a>]}
            >
              <Meta title="Okta"/>
            </Card>
          </Col>
          <Col style={{ padding: 10 }} span="4">
            <Card
              cover={<img alt="Ant.Design" src="https://gw.alipayobjects.com/zos/rmsportal/KDpgvguMpGfqaHPjicRK.svg" height="100px" />}
              actions={[<a href="https://ant.design/" target="_blank"><SearchOutlined key="ellipsis" /></a>]}
            >
              <Meta title="Ant.Design"/>
            </Card>
          </Col>
          <Col style={{ padding: 10 }} span="4">
            <Card
              cover={<img alt="React" src="https://upload.wikimedia.org/wikipedia/commons/a/a7/React-icon.svg" height="100px" />}
              actions={[<a href="https://reactjs.org/" target="_blank"><SearchOutlined key="ellipsis" /></a>]}
            >
              <Meta title="React"/>
            </Card>
          </Col>
          <Col style={{ padding: 10 }} span="4">
            <Card
              cover={<img alt="Fishbowl" src="https://fishbowlllc.com/images/logo_web.png" height="100px" width="50px" />}
              actions={[<a href="https://profile.fishbowlllc.com/" target="_blank"><SearchOutlined key="ellipsis" /></a>]}
            >
              <Meta title="Fishbowl Software"/>
            </Card>
          </Col>
        </Row>
      </Content>
      <SiteFooter></SiteFooter>
    </Layout>
  );
};

export default Home;

Here you see the usage of the selectedKey property on the SiteHeader. As you’ll recall this property will be set on the defaultSelectedKeys property of the Menu component. You are also checking the authState here and redirecting the user to their dashboard page if they are already logged in.

You can also see the Layout and Content section. Ant Design provides many well-designed examples for the basic layout of your page. The home page (and the login page) will have a simple Header/Content/Footer layout.

Next, you can add the Login.jsx page which will make use of your LoginForm component.

import React from 'react';
import { Redirect } from 'react-router-dom';
import LoginForm from '../Components/LoginForm'
import { useOktaAuth } from '@okta/okta-react';
import SiteHeader from '../Components/SiteHeader';
import SiteFooter from '../Components/SiteFooter';
import { Layout } from 'antd';

const { Content } = Layout;

const Login = ({ baseUrl, issuer }) => {
  const { authState } = useOktaAuth();

  if (authState.isPending) {
    return <div>Loading...</div>;
  }

  return authState.isAuthenticated ?
    <Redirect to={{ pathname: '/Dashboard' }} /> :
    <Layout>
      <SiteHeader selectedKey="login"></SiteHeader>
      <Content>
        <LoginForm baseUrl={baseUrl} issuer={issuer} />
      </Content>
      <SiteFooter></SiteFooter>
    </Layout>
};

export default Login;

Again you are passing the selectedKey value of login to the SiteHeader component. This page also checks an authenticated user and moves the user to the Dashboard page.

Finally, add Dashboard.jsx to your Pages folder and add the following code.

import React, { Component } from 'react';
import SiteHeader from '../Components/SiteHeader';
import SiteFooter from '../Components/SiteFooter'
import { Layout, Breadcrumb, Menu, Anchor, Table, Tag, Row, Col } from 'antd';
import { UserOutlined } from '@ant-design/icons';

const { Content, Header, Sider } = Layout;
const { SubMenu } = Menu;
const { Link } = Anchor;

class Dashboard extends Component {
  constructor(props, context) {
    super(props, context);

    const accounts = [
      {
        id: 1,
        name: 'Checking',
        transactions: [
          { id: 1, amount: -100.00, type: 'debit', tags: ['groceries'] },
          { id: 2, amount: 2000.00, type: 'credit', tags: ['payroll'] },
          { id: 3, amount: -50.00, type: 'debit', tags: ['credit card', 'bills'] },
          { id: 4, amount: -300.00, type: 'debit', tags: ['car', 'bills'] },
          { id: 5, amount: -1000.00, type: 'transfer out', tags: ['savings'] },
          { id: 6, amount: -1000.00, type: 'transfer out', tags: ['mm account'] }
        ]
      },
      {
        id: 2,
        name: 'Savings',
        transactions: [
          { id: 1, amount: 1000.00, type: 'transfer in', tags: ['savings'] }
        ]
      },
      {
        id: 3,
        name: 'Mutual Market',
        transactions: [
          { id: 1, amount: 1000.00, type: 'transfer in', tags: ['groceries'] }
        ]
      },
    ];

    // Select the account passed in through props, or default to the first one.
    // Note: use the local accounts array here, because this.state is not set yet.
    let selectedAccount = {};
    if (this.props.account) {
      selectedAccount = accounts.filter(account => account.id == this.props.account)[0];
    } else {
      selectedAccount = accounts[0];
    }

    this.state = {
      selectedAccount: selectedAccount,
      accounts: accounts,
      viewingTransactions: false
    };
  }

  changeDashboard = (e) => {
    const key = e.key;
    this.setState({
      selectedAccount: this.state.accounts.filter(account => account.id == key)[0]
    });
  }

  render() {
    const columns = [
      {
        title: 'Type',
        dataIndex: 'type',
        key: 'type',
      },
      {
        title: 'Amount',
        dataIndex: 'amount',
        key: 'amount',
        sorter: { compare: (a, b) => a.amount - b.amount },
      },
      {
        title: 'Tags',
        dataIndex: 'tags',
        key: 'tags',
        render: tags => {
          return tags.map(tag => {
            return (<Tag color="blue" key={tag}> {tag} </Tag>);
          })
        }
      },
    ];

    const table = this.state.selectedAccount.transactions ?
      <Table dataSource={this.state.selectedAccount.transactions} columns={columns}></Table> : '';

    return (
      <Layout style={{ minHeight: "100vh" }}>
        <Sider collapsible>
          <div style={{ height: "32px", margin: "16px" }}></div>
          <Menu
            defaultOpenKeys={['accounts']}
            defaultSelectedKeys={[this.state.selectedAccount.id ? this.state.selectedAccount.id.toString() : '']}
            theme="dark"
            mode="inline">
            <SubMenu key="accounts" icon={<UserOutlined />} title="Accounts">
              {
                this.state.accounts.map((account, i) => {
                  return <Menu.Item onClick={(e) => this.changeDashboard(e)} key={account.id}>{account.name}</Menu.Item>
                })
              }
            </SubMenu>
          </Menu>
        </Sider>
        <Layout className="site-layout">
          <SiteHeader />
          <Content style={{ margin: '0 16px' }}>
            <Breadcrumb style={{ margin: '16px 0' }}>
              <Breadcrumb.Item>
                <a href="Dashboard">Dashboard</a>
              </Breadcrumb.Item>
            </Breadcrumb>
            <Row>
              <Col>
                <h2>{this.state.selectedAccount.name}</h2>
              </Col>
            </Row>
            <Row>
              <Col span="4"></Col>
              <Col span="16">{table}</Col>
              <Col span="4"></Col>
            </Row>
          </Content>
          <SiteFooter></SiteFooter>
        </Layout>
      </Layout>
    );
  }
};

export default Dashboard;

The layout here is a little more exotic. You are adding a sidebar menu, called a Sider in Ant Design, that will contain the user’s accounts. You can see at the top of this file you added some sample data to display on this page. There are three accounts. Of course, the Sider menu also accepts defaultSelectedKeys which you are setting to the first account unless a specific account is passed into this page. The Sider is collapsible, providing a collapse button on the bottom of the menu. The Breadcrumb navigation can help users navigate on more complex web structures. This app is simple enough that the breadcrumbs are mostly for show, but they can be instrumental if your users are going to navigate down multiple paths.

You’ll see here you are also using the Row/Col paradigm that you are likely familiar with. In keeping with the 24-grid system, your column spans add up to 24, although the last column isn’t necessary. Finally, you are making use of the Table component provided by Ant Design of React. The table accepts a data source and some column definitions and uses these to generate a table for you. No more loops or maps in your code. The column definitions are very robust. In this example, you are presenting the user with some tags that describe the transactions. Because these tags are in an array you need to use the render function on the column definition to tell Ant what to do with this. In this case, you are creating a Tag for each tag in your array. A sorter on your Amount column allows the user to sort by the transaction amount.

Set up your App.js

Finally, you will need to define your routes and set up your App.js file. First, add a new file to the src folder called AppWithRouterAccess.jsx and add the following code.

import React from 'react';
import { Route, useHistory } from 'react-router-dom';
import { LoginCallback, SecureRoute, Security } from '@okta/okta-react';
import Home from './Pages/Home'
import Dashboard from './Pages/Dashboard'
import Login from './Pages/Login'

const AppWithRouterAccess = () => {
  const history = useHistory();
  const onAuthRequired = () => {
    history.push('/login');
  };

  const baseDomain = process.env.REACT_APP_OKTA_URL_BASE;
  const issuer = baseDomain + '/oauth2/default'
  const clientId = process.env.REACT_APP_OKTA_CLIENTID;
  const redirect = process.env.REACT_APP_OKTA_APP_BASE_URL + '/login/callback';

  return (
    <Security issuer={issuer}
              clientId={clientId}
              redirectUri={redirect}
              onAuthRequired={onAuthRequired}
              pkce={true}>
      <Route path='/' exact={true} component={Home}/>
      <Route path='/login' render={() => <Login baseUrl={baseDomain} issuer={issuer}/>}/>
      <SecureRoute path='/Dashboard' exact={true} component={Dashboard}/>
      <Route path='/login/callback' component={LoginCallback}/>
    </Security>
  );
};

export default AppWithRouterAccess;

This is where you tie together the magic of Okta. Defining Dashboard as a SecureRoute ensures that the application checks for authentication before allowing the user to proceed. The onAuthRequired() function is passed into the Security component and moves the user to the login page if they aren’t authenticated. The rest of the routes are defined here as well, including one for /login/callback, which Okta will use when returning user information to your application.

You need to make your App.js show the application rather than the boilerplate React page.

import React from 'react';
import { BrowserRouter as Router } from 'react-router-dom';
import AppWithRouterAccess from './AppWithRouterAccess';
import './App.css'

const App = () => {
  return (
    <Router>
      <AppWithRouterAccess/>
    </Router>
  );
}

export default App;

And finally, you will need to import the Ant Design CSS. To do this, open your App.css file and add the line @import '~antd/dist/antd.css'; to the top of the file.
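The top of your App.css should then start something like this:

/* App.css */
/* Ant Design base styles; load these before your own rules. */
@import '~antd/dist/antd.css';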

Run and Test

Your application is now complete. In the terminal run the command npm start and see the results. You should be presented with the Home page. From here you can click on Login and use your Okta credentials to log in. Afterward, you will be directed to the dashboard page.

Honestly, there is far more to Ant Design than what you have read here; one could take an entire course on the subject. But Ant can help developers understand which components they should be using and why. The principles, developed from studies of human behavior, can streamline the design process and make developers certain of which tools to use. I encourage you to try a couple of projects in Ant Design and take the time to learn the principles. In the long run, the knowledge and experience will make you that much better.

Learn More About JavaScript and React

If you’d like the final version of this code, you can clone it from GitHub. If you have questions about this post, please post them to the comments below. If you have questions about Okta, please submit them to the Okta Developer Forum. If you enjoyed this post, you might also enjoy these related posts:

Build Your First Deno App with Authentication
Build a Simple React Application Using Hooks
How to Use CSS Grid to Build a Responsive React App

Also, don’t forget to follow us on Twitter and subscribe to our YouTube Channel!

Tuesday, 15. September 2020

Global ID

The GiD Report #126 — Forget Zoom, the video revolution begins now


Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

The GiD Report is back after a brief hiatus. Today’s newsletter will be a bit of a quickie as I’m still getting up to speed.

What we have for you this week:

The beginning of a video revolution
Book Club preview: Marc Andreessen
The need for positive tech narratives
The Economist on digital identity
What’s going on with Epic v. Apple
The NYTimes on WeChat
Stuff happens

1. The pandemic has changed the way we work, play, and connect, accelerating trends that were already brewing. What was once a ten-year horizon has transpired overnight; the domain of nerds and early adopters is suddenly mainstream. And Zoom is one of the hottest stocks around because it’s all about video. The thing is, this is just the beginning.

Here’s a great overview from the Telegraph via /gregkidd (pdf attached):

If Wu and other start-ups get their way, Zoom, Google Meet and Microsoft Teams will one day seem like the Model T of video calling services: functional and pioneering, but primitive. As the stubborn coronavirus continues, and the likelihood of increasing reliance on screens even after it passes, investors and entrepreneurs are now going through an explosion of dedicated video apps intended to replace today’s one-size-fits-all services.

In a sense, it’s not so different from how Ayo thinks about fintech (re: vertical neobanks). One-size-fits-all is a reasonable way to start. But it’s only a matter of time before fine-tuned UX for specific target groups will rule the roost.

Cannon says Zoom fulfils the specific and valuable task it was designed for: virtual workplace meetings. But it was not designed for classrooms, doctors’ appointments, live concerts or hen parties, all of which it has been co-opted for in the last six months.
The most recent batch of companies at Y Combinator, the leading Silicon Valley start-up school that is seen as a barometer of new tech trends, had at least a dozen video-related services. They ranged from Together, a video chat app for grandparents and grandchildren that includes games and bedtime stories, to Zuddl, for holding and streaming large business conferences online.
Another, Rally, aims to let people hold parties online without everyone shouting over one another: individuals can “take the stage”, for example when delivering a speech, or groups can split off into “tables”, where the muffled noise of the rest of the party can still be heard in the background.
Former yoga teacher Rachel Lea Fishman founded Sutra, which lets fitness instructors host classes online, earlier this year. The service was originally designed to let instructors rent out space and host classes, but was forced to shift to video as coronavirus closed gyms.

Related:

Facebook introduces a co-viewing experience in Messenger, ‘Watch Together’ — TechCrunch
KNOW Identity Digital Forum Recap: Identity’s Role in Re-Opening the Economy — One World Identity
Via /mg — The Social Dilemma | Netflix Official Site
Zoom is killin’ it

2. Book Club preview: Marc Andreessen

On Productivity, Scheduling, Reading Habits, Work, and More

It’s an interview with a16z founder Marc Andreessen (via /gregkidd). We’ll dive into this later in the week with the Book Club.

3. The last decade of tech has naturally pushed us towards a more dystopian outlook on technology, further emphasized by popular culture with TV shows like Black Mirror and books like The Circle. Given the outcomes, challenges, and unintended consequences of this last cycle of tech, the reaction feels natural. But maybe it’s time for more positive narratives to take hold and balance out all the negativity.

Here’s the NYTimes tech newsletter:

Sriram Krishnan, a technology executive whom I respect, tweeted a few days ago asking for more optimistic descriptions in movies and television of people building technology. He didn’t put it quite this way, but I imagined he wanted less fiction like “The Circle,” about a surveillance-state corporate cult, and more like “Iron Man,” in which a tech nerd cobbles together a suit that saves his life and gives him superhero powers.
I get what Krishnan is saying, and there’s a bigger meaning behind it. Right now, there’s a lot of pessimism about the harm of social media, the creepiness of digital surveillance of our smartphones and our faces and the nefarious power of tech giants.
Those downers sometimes drown out the ways that we know technology has made many of our lives immeasurably better. Both “The Circle” and “Iron Man” encompass some form of reality, but it’s easy to see technology as either one or the other.

Which I believe is very apt for the work we’re trying to do here at GlobaliD. Technology shouldn’t force you to undermine your values or concede your individuality. There’s a hugely positive story to tell here and we’re writing it every day.

Speaking of which — why telling that story in the right way from day one is so important:

Zuckerberg says he regrets not explaining Facebook’s free-speech ideals from the start.
“I just wish that I’d spent more time earlier on communicating about what our principles are and what we stand for — you know, things like free expression and voice and that we’re going to defend those.”
“Now a lot of people look at us and they see this as a successful company. With a lot of money. And it’s a little hard now, I think, for us to go back and talk about our principles and have people see the principles for anything but, you know, some talking points.”
4. Along with video, the pandemic has also put a spotlight on digital identity.

Here’s The Economist:

In countries without a system of secure digital identities, the closure of bricks-and-mortar government offices and the shift of public services online have caused havoc (see article). Divorces and adoptions have run into a virtual brick wall. Italy’s system for doling out emergency payments crashed and then demanded paperwork that applicants could not obtain because government offices were shut. In America, Washington state paid $650m in unemployment insurance to fraudsters who made applications using stolen identities.
No such havoc occurred in Estonia, a tiny Baltic state where every citizen has an electronic identity. More than just an identity card, it links every Estonian’s records together. So when the government created a furlough system for workers affected by the pandemic, it already knew where they worked and how to pay them. Nobody in Estonia had to join a queue on a pavement to claim benefits, as people in other places did.

See also: Apple Pay Was Not Disruptive But Apple ID Will Be (via Luka)

Related:

U.K. digital ID tool passes big test — SecureIDNews
Phil Windley — Authentic Digital Relationships
LG CNS to Take Lead in Developing Next-generation Digital Identification Technology

5. What’s going on with Epic v. Apple:

The Information:

Apple on Tuesday countersued Epic Games, the developer of “Fortnite,” asking a judge for monetary damages in the escalating legal battle between the two companies.
Apple made its claims in a filing with a federal court in the Northern District of California on Tuesday, in which the company also rejected the legal claims Epic made in an antitrust lawsuit it filed against Apple last month.
While Apple’s counter-suit was anticipated and didn’t contain any big surprises, the company tucked a noteworthy detail into the filing, saying that Epic has earned over $600 million in revenue through iOS apps in the past. That’s a substantial figure that likely represents the success of “Fortnite,” Epic’s hit battle game.
Apple justified its demand for financial damages in part because it said Epic has deprived it of App Store commissions by allowing “Fortnite” users to bypass an Apple payment system. The company also said Epic has hurt Apple’s image through an “extensive smear campaign” and requested punitive damages for its conduct.

See also: Tim Sweeney on open platforms

6. The NYTimes talks WeChat.

Because WeChat is a really, really, really big deal:

Still, to be free she would have to delete WeChat, and she can’t do that. As the coronavirus crisis struck China, her family used it to coordinate food orders during lockdowns. She also needs a local government health code featured on the app to use public transport or enter stores.
“I want to switch to other chat apps, but there’s no way,” she said.
“If there were a real alternative I would change, but WeChat is terrible because there is no alternative. It’s too closely tied to life. For shopping, paying, for work, you have to use it,” she said. “If you jump to another app, then you are alone.”
7. Stuff happens:

Via /jvs — Digital Bank Neon Pagamentos Raises $300M Series C
India Bans 118 Chinese Apps as Indian Soldier Is Killed on Disputed Border
Inside China’s unexpected quest to protect data privacy
Cash App’s Surge During Covid-19 Pandemic Fuels Square Stock
ING and Albert Heijn to pilot online payments service that tokenizes customers’ bank account details • NFCW
PayPal has a fraud problem
Alejandro Machado: Venezuelans Look to Crypto-Dollars — CoinDesk
Top regulator pushes ahead with plan to reshape banking, sparking clash with states
Chinese Bank Disables Digital Yuan Wallet After Soft Launch Draws Wide Attention
Via /laura — Particl.io • Privacy-focused Decentralized Applications
Justice Dept. Plans to File Antitrust Charges Against Google in Coming Weeks
Kenosha militia group, not Facebook, took down its event page
Apple to Delay iOS Change Roiling Mobile Ad Market
Tech platforms hold dry runs to game out election night chaos scenarios
Briefing: Oracle Chosen as ByteDance’s Partner for TikTok
Briefing: TikTok Submits Oracle Deal Proposal to U.S. Treasury
Via /mg — Why Amazon Has A Fake Review Problem
Square Forms Group to Stop Patent Hoarding From Stifling Crypto Innovation — CoinDesk
Pandemic Will Speed Bitcoin Adoption, Says DBS Bank Economist — CoinDesk
Druckenmiller is worried about inflation, says it could hit 10% in coming years
CBDCs: Geopolitical Ramifications of a Major Digital Currency
Facebook’s Sandberg Says TikTok Ban ‘Very Unusual’ in Business History
Facebook just invented… Facebook

The GiD Report #126 — Forget Zoom, the video revolution begins now was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KEYLESS

Why legacy security models don’t work

And how zero-trust biometric solutions can help you transition to a more secure digital workplace

We’re rapidly approaching a future fueled by digital technology — one where users and employees frequently access private systems outside the security perimeters of a workplace.

Traditional security systems, which are flawed to begin with, fail to offer adequate protection from cyber threats outside of the security perimeters of a workplace.

To properly secure remote working environments, organizations must start thinking about implementing radically different security models.

In this article, we’ll cover:

Traditional security models and their failings
Zero-trust security models
A biometric multi-factor authentication alternative

What are traditional security models?

Traditional security models refer mainly to perimeter network firewalls.

Firewalls were designed to protect corporate systems from malicious attacks by erecting security defenses that ward off threats from outside of the perimeters of the network.

Security experts often refer to this as the “castle and moat” approach. The idea is based on the assumption that anyone outside the “moat” (firewall) cannot be trusted, and that anyone inside the “castle” (the network) can be.

The issue with this approach is that it assumes anyone within the network can be trusted. This is problematic for two reasons:

The assumption that everyone inside a network is trusted leaves organizations vulnerable to insider-orchestrated attacks.
This approach fails to protect organizations when employees access systems from outside the security perimeter of a firewall.

With the dramatic increase in remote work, firewalls are not just weak, they’re redundant.

Remote-work security is traditionally tackled with VPNs, but with the rising adoption of smart technology comes the proliferation of security backdoors — greatly increasing the likelihood of a hacker successfully breaching a corporate network via poorly protected remote devices.

If a hacker compromises one smart-device within a home network, then the potential to launch further attacks on other devices (used to access company systems) greatly increases. In theory, hackers could potentially hijack a user’s device, steal their login credentials, and use them to bypass a VPN.

Hackers only need to find the weakest entry point to a network in order to infiltrate it.

Once a hacker has gained access to a network, they’re then free to move laterally within the network until reaching a target — usually a highly sensitive database that contains the private data of users, employees or clients.

According to IBM, the average breach goes unnoticed for 206 days (nearly seven months), giving malicious parties plenty of time to steal such data.

Zero-trust security: assume no one is trusted

Zero-trust authentication systems take a radically different approach to network security.

They assume that no one can be trusted, whether inside a network or not. Anyone attempting to access systems must therefore be authenticated, no matter where, when, or how they connect.

Since no trust is assumed, zero-trust models make a point of frequently authenticating users as they move through a network.
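As a rough illustration of that principle, here is a minimal Express sketch (my own example, not a Keyless implementation; isTokenValid is a hypothetical stand-in for whatever verification your identity provider performs). The point is that every request is re-checked, regardless of where it originates:

const express = require('express');
const app = express();

// Hypothetical stand-in for real verification (JWT validation, session
// lookup, a biometric assertion, and so on).
function isTokenValid(token) {
  return token === 'demo-token'; // placeholder logic only
}

// Zero trust: authenticate every request, not just the initial login.
app.use((req, res, next) => {
  const header = req.headers.authorization || '';
  const token = header.replace('Bearer ', '');
  if (!isTokenValid(token)) {
    return res.status(401).send('Authentication required');
  }
  next();
});

app.get('/reports', (req, res) => res.send('sensitive data'));
app.listen(3000);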

Zero-trust solutions protect against insider threats

Insider threats and cybersecurity incidents doubled between 2018 and 2019, confirming that not everyone with permissions to a network can be trusted.

Since zero-trust models assume that no one is trusted, they protect organizations from such threats. Insider-orchestrated attacks can otherwise have large-scale financial and reputational consequences for organizations.

Zero-trust multi-factor authentication

Zero-trust multi-factor authentication solutions combine the principles of zero-trust security, with the principles of strong authentication.

Strong authentication, or MFA, improves security by presenting users with at least two different kinds of security challenges.

For example, an MFA solution may require a user to enter their login credentials, followed by a one-time-password sent to the user’s registered phone number.
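As a sketch of that flow (illustrative only; sendSms stands in for a real SMS gateway, and a production system would store the code server-side with an expiry):

const crypto = require('crypto');

// Placeholder for a real SMS gateway call.
function sendSms(phoneNumber, message) {
  console.log(`[sms to ${phoneNumber}] ${message}`);
}

// Second factor: generate and deliver a six-digit one-time password.
function startSecondFactor(phoneNumber) {
  const code = crypto.randomInt(100000, 1000000).toString();
  sendSms(phoneNumber, `Your verification code is ${code}`);
  return code;
}

// After the first factor (credentials) succeeds, challenge the second.
const expected = startSecondFactor('+15555550123');
const userEntered = expected; // what the user would type in
console.log(userEntered === expected ? 'MFA passed' : 'MFA failed');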

Traditional MFA solutions create a range of user-experience problems when combined with the idea of zero-trust security: having to authenticate at every level of access can be disruptive when the authentication experience is cumbersome.

Plus, legacy MFA solutions are not immune to threats — since 2015, customers in the UK have lost £9.1 million to SIM swapping attacks. (SIM swapping is where a bad actor bypasses 2FA by diverting the one-time code sent to the victim’s phone number to a SIM they control.)

Read more about security issues with 2FA solutions like YubiKey in this article.

The Keyless solution: Zero-Knowledge Biometrics

At Keyless, we combine multi-modal biometrics with privacy-enhancing secure multiparty computation to provide a passwordless, secure and privacy-first way to authenticate users that is minimally disruptive to the user-experience.

Our privacy-enhancing biometric authentication solutions offer multi-factor (MFA) security by design:

Keyless verifies users are authenticating from their trusted device. If a device is not registered, the user won’t be able to authenticate.
Keyless uses facial biometrics to verify users across every touchpoint — a universal inherence factor as an added level of security.
Keyless will soon also leverage behavioral biometrics, which serves as another, transparent third authentication factor.

Once a user is registered, all they need to do to access their accounts is look into the camera of their device.

This simple, user-friendly solution can be implemented at multiple access points, ensuring that only the right users have the right access at the right time, while offering cutting-edge security.

To protect end-users and organizations against fraudulent takeovers, Keyless leverages advanced liveness detection and anti-spoofing techniques, in addition to the built-in multi-factor security. This allows Keyless to ensure that the user is in fact, real.

Simple, secure, and above all, private authentication

To recap that’s:

No passwords
No one-time codes
No secret questions

Or in other words that’s nothing to remember, nothing to type, nothing to lose, nothing to forget, nothing to phish and nothing to copy and paste.

To find out how Keyless’ ZKB™ biometrics work, watch the video!

How we protect biometric data

Keyless never stores sensitive information on a user’s device or on centralized servers. Instead, encrypted shares of data are stored on the Keyless Network, a distributed cloud network.

This is possible thanks to our patent-pending zero-knowledge biometric (ZKB™) authentication technology, enabled by the unique combination of state-of-the-art biometrics with zero-knowledge cryptography and privacy-enhancing multi-party computation.

This breakthrough technology allows Keyless to authenticate users, without needing or being able to access the raw contents of someone’s sensitive information. In other words, we don’t trust anyone, not even ourselves.

Request a Free Trial of Keyless

Keyless™ authentication can help deliver secure and seamless digital experiences for your end-users and for your increasingly remote workforce.

Head to our website to learn more about our biometric authentication and identity management solutions.

www.keyless.io

Alternatively, you can email us directly at info@keyless.io

Why legacy security models don’t work was originally published in KeylessTech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Smarter with Gartner - IT

Gartner Top 10 Security Projects for 2020-2021


“Are you trying to ensure security for your remote workforce but don’t want to hinder business productivity?” “Are you struggling with identifying risks and gaps in security capabilities?” “Where should CISOs focus time and resources?” 

Security and risk management experts constantly ask these questions, but the real question should be what projects will drive the most business value and reduce risk for the organization in a constantly shifting security landscape. 

“We can spend too much precious time overanalyzing choices we make about security, striving for this notion of perfect protection that just simply does not exist,” said Brian Reed, Sr. Director Analyst, during the virtual Gartner Security & Risk Management Summit, 2020. “We must look beyond basic protection decisions and improve organizational resilience through innovative approaches to detection and response, and ultimately, recovery from security incidents.“


The key is to prioritize business enablement and reduce risk — and communicate those priorities effectively to the business. 

This year’s top 10 security projects, based on Gartner forecasts and adjusted for the impact of COVID-19, feature eight new projects, focused heavily on risk management and understanding process breakdowns. These projects, which aren’t listed in order of importance, can be executed independently. 

No. 1: Securing your remote workforce

Focus on business requirements and understand how users and groups access data and applications. Now that a few months have passed since the initial remote push, it’s time for a needs assessment and review of what has changed to determine if access levels are correct and whether any security measures are actually impeding work.

No. 2: Risk-based vulnerability management

Don’t try to patch everything; focus on vulnerabilities that are actually exploitable. Go beyond a bulk assessment of threats and use threat intelligence, attacker activity and internal asset criticality to provide a better view of real organizational risk.  

No. 3: Extended detection and response (XDR)

XDR is a unified security and incident response platform that collects and correlates data from multiple proprietary components. The platform-level integration occurs at the point of deployment rather than being added in later. This consolidates multiple security products into one and may help provide better overall security outcomes. Organizations should consider using this technology to simplify and streamline security. 

No. 4: Cloud security posture management

Organizations need to ensure common controls across IaaS and PaaS, as well as support automated assessment and remediation. Cloud applications are extremely dynamic and need an automated DevSecOps style of security. It can be challenging to secure the public cloud without a means to ensure policy uniformity across cloud security approaches. 

Read more: Top Actions From Gartner Hype Cycle for Cloud Security, 2020

No. 5: Simplify cloud access controls

Cloud access controls are typically enforced through a cloud access security broker (CASB). CASBs offer real-time enforcement through an in-line proxy that can provide policy enforcement and active blocking. They also offer flexibility by, for example, starting out in monitoring mode to better ensure fidelity of traffic and understand security access. 

No. 6: DMARC

Organizations use email as the single source of verification, and users struggle to tell real messages from fakes. DMARC, or domain-based message authentication, reporting and conformance, is an email authentication policy. DMARC is not a total solution for email security and should be one piece of a holistic security approach, but it offers an additional layer of trust and verification with the sender’s domain. DMARC can help prevent domain spoofing, though it will not address all email security issues. 
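For illustration, a DMARC policy lives in a DNS TXT record on the _dmarc subdomain. A typical record (example.com and the report address are placeholders) might look like:

_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"

Here p=quarantine tells receiving servers to treat failing mail as suspicious, and rua designates where aggregate reports are sent.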

No. 7: Passwordless authentication

While employees may not think twice about using the same password for their work computer as for their personal email, doing so can cause major security headaches. Passwordless authentication, which can work in a few different ways, offers a better solution. The goal should be to increase trust and improve the user experience. 

No. 8: Data classification and protection

All data is not the same. A one-size-fits-all security approach will create areas of too much security and others of too little, increasing the risk for the organization. Start with policies and definitions to get the process right before beginning to layer in the security technologies. 

No. 9: Workforce competencies assessment

Install the right people with the right skills in the right roles. It’s critical but challenging to combine hard technical skills with softer leadership expertise. There are no perfect candidates, but you can identify five or six must-have competencies for each project. Assess competencies in a range of ways, including cyber ranges, cyber simulations, and soft-skill assessments. 

No. 10: Automating security risk assessments 

This is one way to help security teams understand risks related to security operations, new projects, and program-level initiatives. Risk assessment tends to be either skipped entirely or done on a limited basis. These assessments will allow for limited risk automation and visibility into where risk gaps exist.  


The post Gartner Top 10 Security Projects for 2020-2021 appeared first on Smarter With Gartner.


KuppingerCole

Nov 05, 2020: Choosing the Right Fraud Tool in a Digital Evolving Economy

Fraud is a major cost to businesses worldwide and it is on the rise. In its 2019 annual report, the Internet Crime Complaint Center (IC3) put a specific focus on methods such as elder fraud, credit card fraud and confidence/romance fraud, emphasizing the growing occurrence of intentional deception in cybercrime. Companies of all industries face the same challenge: the complex and constantly changing world of fraud detection.

Working for the Business, not the Auditors


by Matthias Reinwarth

Forward-thinking companies no longer see cybersecurity, governance, compliance and privacy as something they are just forced to do. Instead, a comprehensive alignment with applicable policies, regulations and laws is the basis for consistently and competitively enabling and sustainably operating digital business.


Atos DirX Directory


by Martin Kuppinger

Atos DirX Directory is one of the few enterprise-grade directory services in the market, delivering the high-performance, high-scalability, high-availability, and highly secure implementation required by many of today’s large-scale use cases in the Digital Transformation. Atos DirX Directory builds on a purpose-built and optimized data layer and delivers comprehensive support for X.500 and LDAPv3 protocols and specifications.


R&S®Trusted Gate - Secure Glocalization by Rohde & Schwarz Cybersecurity


by Matthias Reinwarth

Rohde & Schwarz Cybersecurity offers reliable management of regulated and sensitive information to promote compliant collaboration and file sharing in unified SharePoint platforms for organizations spanning countries and regions with different laws and regulatory requirements. Rohde & Schwarz Cybersecurity utilizes globally distributed, efficient and secure infrastructure with central and consolidated administration while maintaining compliance and privacy.


Forgerock Blog

ForgeRock Identity Cloud Gets Even Better


Since launching ForgeRock Identity Cloud earlier this year, we have seen strong interest and innovative usage from a variety of customers. The COVID-19 crisis has contributed to the surge in our momentum. Financial services and retail customers have seen their foot traffic to physical locations drop by 80 percent or more. At the same time, online traffic is skyrocketing. For many companies, these spikes are resulting in massive increases in costs because cloud vendors are doubling their overage fees. We can help with this.

We built our cloud platform as a scalable service for cost-effectively modernizing large, complex, and diverse application portfolios at companies navigating their cloud migration journey. ForgeRock Identity Cloud has become even more flexible with identity platform as a service functionality, delivering on our commitment to provide the most comprehensive cloud solution possible.

Today, we’re happy to share exciting enhancements now available that make our cloud service even more powerful.

Seamless Orchestration: The one overwhelming request we get from ForgeRock Identity Cloud customers is: “Don’t dumb it down.” We listened. This release builds on our aim of extreme configurability. You can continue to deliver omnichannel experiences and security for all identities using the power of ForgeRock Intelligent Access to seamlessly orchestrate self-service and authentication journeys for your users.

One Subscription for Maximum Flexibility: With one subscription to ForgeRock Identity Cloud, we give you complete flexibility to not only consume as a service from us, but also deploy the ForgeRock Identity Platform anywhere (in your datacenter, private cloud, or public cloud) in a hybrid configuration. That one subscription also means you enjoy predictable pricing that includes unlimited annual usage per user, with surplus user coverage that protects you even if your business grows in unexpected ways. 

Full Tenant Isolation: We take security very seriously at ForgeRock. Our approach ensures your data is never commingled with other customer data. This not only prevents accidental data spillage issues, but also prevents the noisy and nosey neighbor issue. 

Getting Started 

All of the features discussed today are now available. Download the ForgeRock Identity Cloud white paper to learn more. 

Coming Up! 

If you’re unsure how to start planning your future in the cloud, don’t miss ForgeRock Identity Live: Cloud Edition. I’ll be hosting this virtual event and can guarantee you’ll walk away with useful tips on how to transform your organization. I hope to see you there next week! 

Thanks, from the entire ForgeRock Identity Cloud team!  

 


KuppingerCole

Dec 02, 2020: Managing Azure AD – Regardless of How You Use It

Microsoft Azure Active Directory (Azure AD) has gained widespread adoption. Coming with Microsoft Azure Cloud as well as Microsoft 365 (i.e. Office 365), it appears in many organizations just because of decisions made outside of the IAM team.

Monday, 14. September 2020

KuppingerCole

The Fast Track to Optimized Operations With IAM-as-a-Service


IAM-as-a-Service provides operational agility by bringing in the skills and expertise to implement a precise strategy, technology deployment, process automation, and service delivery and support model, thereby gaining operational efficiency and streamlining budgets. IAM-as-a-Service, with its pre-built integrations, is therefore the ideal solution for short-staffed IAM teams.

Join this webinar and learn

Why IAM projects often stall
How to implement initiatives faster with IAM-as-a-Service
How to make use of automation to take the burden off your IAM teams
How to optimize IAM budgets by outsourcing repetitive, low-value tasks with an IAM managed service

Matthias Reinwarth, Lead Advisor at KuppingerCole, will explain why IAM projects typically stall when companies don’t have the necessary human or financial resources or expertise and why this hampers the overall digital transformation of a company.

He will be joined by Swapnil Mehta, General Manager, Identity, Access, and Privacy at Persistent Systems, who will address IAM requirements for new digital transformation efforts, process automation with IAM deployments, and cost optimization with a 24x7 hosted IAM service.




Smarter with Gartner - IT

Gartner Keynote: Balance Risk, Trust and Opportunity in an Uncertain World


2020 has been a year of immense challenge for organizations, and especially for those tasked with securing them. When COVID-19 swept the world, security and risk leaders were suddenly responsible for securing remote employees on an extremely aggressive timeline. While some organizations had an established infrastructure for a seamless transition, others were scrambling to supply employees with hardware, often resorting to older machines with less than ideal security parameters. 


Distracted employees in less-than-ideal work-from-home environments became potential sources of risk, and hackers began exploiting newly presented opportunities. After the initial response, CISOs needed to continue to recommend and guide their businesses, which were operating under unprecedented and daunting conditions.

“Over the past six months, defining risk appetite has become even more of a challenge for security leaders. Enterprise project portfolios are changing on the fly, and of course, increasing our risk landscape,” said Jeffrey Wheatman, VP Analyst, during the opening keynote of the virtual Gartner Security & Risk Management Summit, 2020.


“Security leaders are focusing on reprioritizing projects and initiatives, which involves dropping some, adding and accelerating others, all while trying to hit the moving target of risk appetite.”

The security risks of 2020

Even before COVID-19, new and different security challenges were present in 2020. More digital services were being delivered on a global scale, and tensions were high in international communities; trade disputes, for example, often spilled over into cyberwarfare. Further, more digitalization of physical objects means more cyber-physical risks that security and risk leaders must address. And a general increase in digital services with more customer touchpoints, from banks to utility companies, creates even more vulnerabilities ripe for exploitation by nefarious parties.

And then came COVID-19. 

Some industries — travel, retail, entertainment — experienced catastrophic impact, while others — online shopping, telemedicine — saw huge growth. Each of these situations comes with unique security and risk challenges. 

The initial scramble to keep businesses operating, people working and money moving afforded an opportunity to identify new risks, reassign resources and shift investments to meet outcomes. Now that organizations have moved past the initial response phase, security and risk leaders can review what was done and identify new risks on the horizon — as well as new opportunities. 

The drive to digital business

New opportunities bring with them new risks, and CISOs worry the business will fail to prepare for some of them. The Gartner CEO survey revealed that 82% of CEOs have a digital transformation or management initiative, up from 62% in 2018. However, a Gartner survey of CISOs found that while 90% of respondents believe digital business will create new types and new levels of risk, 70% felt investment in risk management was not keeping up with these new, higher levels of risk. 

The good news is that the business continues to value cybersecurity as an essential function and executives will look to CISOs to secure the business and limit risk, while also enabling opportunities for technology to transform operating models. This is especially important as CEOs look to accelerate digital business to survive and thrive in a post-COVID-19 environment. 

The goal should be to balance risk, trust and opportunity as businesses and organizations enter the “renew” stage of pandemic planning. New technologies will help accelerate and guide this process, whether it’s XDR or the rapidly increasing push to cloud, or what role automation and artificial intelligence (AI) will play as the world reacclimates.  

For example, at the beginning of 2020, a large government agency responsible for gathering, analyzing and storing immense amounts of personal data started the year with a slate of 19 active security projects. Once COVID started to ramp up, the agency realized it would be unable to sustain the work effort. But when the organization reframed its approach to security projects with a renewed focus on balancing risk, trust and opportunity, the team was able to narrow the list to just nine active projects — including three new opportunities. 

With a renewed mindset and focus on balance, security and risk leaders must guide and drive the business to remain secure and limit risk as they accelerate digital and move into a new phase. 

“As risk and security experts, you cannot change the course of the major disruptions impacting your enterprises,” said Wheatman. “But your disciplines are fundamental to addressing the risks inherent in the technology solutions your enterprises are embracing to help them recover and renew in the post-pandemic era — and with your expertise in identifying, assessing and managing the new risks inherent in these technologies, the opportunity to succeed is endless.”


The post Gartner Keynote: Balance Risk, Trust and Opportunity in an Uncertain World appeared first on Smarter With Gartner.


KuppingerCole

Data-Driven Decision Making for Identity Security

Symantec Enterprise: With more informed decisions comes more automated security.

In today’s Zero Trust world, where the principle of least privilege is ubiquitous, enterprises are struggling to balance security with the demands of a highly agile business environment. There has always been friction between security and agility, and the need to make highly specific security decisions quickly and efficiently contributes to it. Moreover, decision-making in enterprises exists on a spectrum from completely manual to completely automated, and regardless of where your organization sits on that scale, it is moving more and more toward automation, whether you know it or not. The real question is: as your business becomes more agile, how can you keep your security posture from falling off a cliff?

Traditional security tools can help with decision-making, but oftentimes there isn’t enough data to automate the process, or there isn’t enough confidence in the data that does exist. The result is a lot of manual effort, which overworked and understaffed security teams cannot sustain. The solution to this challenge is more data: the more data you have, the more informed any automated decision to grant access (and which type of access) will be. Thus, from a Zero Trust standpoint, more data is almost always better, and as we move towards a more secure enterprise environment, it will be essential to draw on data from numerous sources, including access requests, authentication, authorization, session activity, and user behavior. The more data, the clearer the picture; the clearer the picture, the safer and faster the decision.

But where do we collect this data? The answer is that you already have some of it; you just aren’t leveraging it. The data your existing identity and access management solutions collect while continuously monitoring user access and activity is invaluable, including things like login times, login locations, and associated roles and access. But the marginal benefit of additional data, which might initially seem irrelevant, should not be underestimated. For example, suppose we gather average session data for users, which on its own might not be predictive of risk. It could be that longer average session time combined with administrator access is the single most predictive measure of malicious behavior, but if we don’t bother collecting the data, we’d never know.
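To make this concrete, here is a minimal TypeScript sketch of the idea: a few identity signals feed a score, and the score drives an automated access decision. All names, weights, and thresholds here are hypothetical illustrations, not part of any Symantec or Broadcom API.

// Hypothetical identity signals gathered by IAM monitoring (illustrative only).
interface IdentityEvent {
  userId: string;
  loginHour: number;        // 0-23, hour of login
  knownLocation: boolean;   // has this location been seen before?
  isAdmin: boolean;         // does the session carry administrator access?
  sessionMinutes: number;   // observed average session length
}

function riskScore(e: IdentityEvent): number {
  let score = 0;
  if (!e.knownLocation) score += 0.4;                     // unfamiliar location
  if (e.loginHour < 6 || e.loginHour > 22) score += 0.2;  // off-hours login
  // Interaction term from the example above: a long session alone may be
  // benign, but a long session combined with admin access is weighted heavily.
  if (e.isAdmin && e.sessionMinutes > 120) score += 0.4;
  return Math.min(score, 1);
}

// Automating the decision: allow, step up authentication, or deny.
function decide(e: IdentityEvent): 'allow' | 'step-up' | 'deny' {
  const s = riskScore(e);
  return s < 0.3 ? 'allow' : s < 0.7 ? 'step-up' : 'deny';
}

In practice, a machine learning model would learn these weights and interactions from the consolidated data rather than hard-coding them.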

The upside of collecting and consolidating identity and other data is that machine learning tools can continually analyze this data to search for new patterns that enable more informed decisions. These tools are capable of learning not only which data is valuable for risk assessment at any given time, but also how to apply that risk assessment to make informed security decisions. And through more informed decisions comes more automated security, which enables business agility.

At Broadcom, we are focused on a future state where all data inputs (from authorization, access requests, authentication, session monitoring, and contextual user behavior) are used at the business level. Symantec’s Global Intelligence Network (GIN) is one of the largest civilian data repositories in existence, providing an extraordinary amount of data to inform security decisions. As the mindset of security has evolved, we are capable of making much more granular decisions with a greater level of context, creating and enforcing policies that are highly contextual. We are evolving beyond identity, because the data we gather allows businesses to function at the next level.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Build Your First Deno App with Authentication

The creator of Node.js, Ryan Dahl, has authored a new framework for designing web applications. He went back and fixed some mistakes he had made, with the benefit of hindsight and of new technologies that were not available when he originally wrote Node. The result is Deno (pronounced DEH-no), a framework for writing “Node-like” web applications in TypeScript. Here, I will walk you through creating a basic web application with authentication.

You can find almost all the information you need at the Deno website—along with information on all the third-party libraries that are currently available for Deno. That is really the biggest drawback to the framework right now: it only hit version 1.0 on May 13, 2020, so even though there are quite a few essential libraries, there are not nearly as many as there are for Node. For those proficient in Node, however, the transition to Deno should be pretty easy.

You can find the installation instructions at https://deno.land/#installation.

Create Your Deno Application

There aren’t any basic scaffolding libraries that I could find, so I just started with an empty folder. In the application’s root folder, create a file called index.ts that will be the starting point of your Deno application. You’ll use Opine, an Express clone for Deno, to make building and routing easier.

One thing that is different about Deno is that there is no package manager for bringing in third-party libraries; instead, you import each library by its full URL. Do that at the top of the index.ts file, then set up a basic web application.

import { opine } from 'https://deno.land/x/opine@0.12.0/mod.ts';

const app = opine();

app.get('/', (req, res) => {
  res.send('Deno Sample');
});

app.listen(3000);
console.log('running on port 3000');

You can then run this very basic application by going to the terminal in the application’s folder and entering:

deno run -A index.ts

The -A is a shortcut for development purposes. Deno is completely locked down by default, so you’ll need to pass arguments to the run command to allow access like --allow-net to allow networking, and --allow-read to allow the application to read from the file system. The -A used here allows everything, effectively disabling all security. When you run this application and then go to http://localhost:3000 you should be greeted with Deno Sample on a blank page.
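Once you move past quick experiments, it is better to grant only the permissions the app actually needs. For the finished app in this post, which serves HTTP and reads the view templates and the .env file from disk, something like the following should suffice:

deno run --allow-net --allow-read index.ts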

Build a Real Web Application with Deno

While this is a good first step, it’s not very useful. You’ll want to add some real functionality that’s a little more “real-world”, so change the index.ts file so that the contents are:

import { opine, serveStatic } from 'https://deno.land/x/opine@0.12.0/mod.ts';
import { renderFileToString } from 'https://deno.land/x/dejs@0.7.0/mod.ts';
import { join, dirname } from 'https://deno.land/x/opine@main/deps.ts';
import { ensureAuthenticated } from './middleware/authmiddleware.ts';
import users from './controllers/usercontroller.ts';
import auth from './controllers/authcontroller.ts';

const app = opine();
const __dirname = dirname(import.meta.url);

app.engine('.html', renderFileToString);
app.use(serveStatic(join(__dirname, 'public')));
app.set('view engine', 'html');

app.get('/', (req, res) => {
  res.render('index', { title: 'Deno Sample' });
});

app.use('/users', ensureAuthenticated, users);
app.use('/auth', auth);

app.listen(3000);
console.log('running on port 3000');

You’ll notice some more import statements, which bring in some third-party libraries. Here, I am using dejs, an EJS port for Deno. I’ve also included some utility functions from the Opine library for manipulating directory names. I will explain the three locally imported files in a moment. For now, just know that you’re importing them.

The line below the instantiation of the opine() app creates a reference to the local directory. The three lines below use this to set the view engine to DEJS for processing the HTML-like files, similar to the way EJS does for Node. The next section has been changed slightly to render one of those HTML template files, and the last two lines bring in some external routes. One thing of note is that the /users route has an ensureAuthenticated() middleware function. This will force users to log in before being allowed to visit the page. You’ll create that middleware shortly.

Fill In Your Deno Application

Now, you’ll want to create some of the missing pieces that you imported above. Start with the routes. Create a folder called controllers in the root of the application. Then add a usercontroller.ts file inside that folder with the following contents:

import { Router } from 'https://deno.land/x/opine@0.12.0/mod.ts';

const users = new Router();

// users routes
users.get('/me', (req, res) => {
  res.render('users/me', { title: 'My Profile', user: res.app.locals.user });
});

export default users;

This is a simple routing file. It gets the router from Opine and creates a new instance to hang routes from. Then there is code to add a route for /me to render the HTML view in users/me. The render() call also passes a title and the logged-in user to the page. This page will be protected so that there will always be a user to pass to the page.

Next, create some views to show when the routes are hit. In the root folder, add a views folder. Inside that, create a shared folder and a users folder. In the shared folder create a header.html and footer.html file. In the users folder add a me.html file. Finally, in the views folder itself create an index.html file.

These are pretty bare-bones, but they demonstrate how to create views that can be reused by other views. In the shared/header.html file, add the following:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,initial-scale=1">
  <title><%= title %></title>
</head>
<body>

This outputs the top of an HTML page and injects the title into the page. Next, add the following to the shared/footer.html file:

</body>
</html>

Now you can use those partials in the index.html file:

<%- await include('views/shared/header.html', { title }); %>

<a href="/users/me">My Profile</a>

<%- await include('views/shared/footer.html'); %>

This includes the footer and header partials and adds a link to the profile page. The contents of the users/me.html file are:

<%- await include('views/shared/header.html', { title }); %>

<h1>My Profile</h1>
<ul>
  <% for(var p in user){ %>
    <li><strong><%= p %>: </strong><%= user[p] %></li>
  <% } %>
</ul>

<%- await include('views/shared/footer.html'); %>

Again, this page includes the header and footer, and loops through the properties of the user object. Granted, it’s not a super-sexy profile page, but it will let you know that the authentication steps all worked.

Add Authentication with Okta

If you don’t already have an Okta account, you can get a free developer account here. Once you’ve signed into Okta, you’ll land on the dashboard. You’ll need to create an Okta application to take advantage of Okta as an Identity Provider for your project.

Click on Applications in the menu, then Add Application. This will take you to the application wizard. Choose Web for your platform, then click Next. The next page is the Application Settings page. Give your application a name (I named mine DenoExample). Change all the URLs to use port 3000 instead of 8080, then change the Login Redirect URIs to http://localhost:3000/auth/callback. This is a route you’ll be implementing shortly. Finally, click Done to finish creating the application in Okta.

Once you’re on the page for your newly-created application, make sure you’re on the General Settings tab and scroll to the bottom until you see a Client Credentials section. You’ll be using these values momentarily, so keep this window open.

Back in your application, create a new file in the root of the application called .env. The contents of the file will be:

issuer=https://{yourOktaOrgUrl}/oauth2/default
clientId={yourClientID}
clientSecret={yourClientSecret}
redirectUrl=http://localhost:3000/auth/callback
state=SuPeR-lOnG-sEcReT

Copy the Client ID and Client Secret from the Client Credentials section of your Okta application. Then go back to the dashboard and copy your Okta org URL from the right-hand side just below the menu.

Now you’re ready to start talking to Okta for authentication. Unfortunately, I couldn’t find any OpenID Connect (OIDC) libraries to make authentication with OAuth 2.0 and OIDC easier, so you’ll have to create it by hand. However, this can be an awesome exercise to help understand how OAuth and OIDC work. In the root folder of your application, create a new folder called middleware and add a file called authmiddleware.ts. Then add this content:

import { config } from 'https://deno.land/x/dotenv/mod.ts';

export const ensureAuthenticated = async (req: any, res: any, next: any) => {
  const user = req.app.locals.user;
  if (!user) {
    const reqUrl = req.originalUrl;
    const { issuer, clientId, redirectUrl, state } = config();
    const authUrl = `${issuer}/v1/authorize?client_id=${clientId}&response_type=code&scope=openid%20email%20profile&redirect_uri=${encodeURIComponent(redirectUrl)}&state=${state}:${reqUrl}`;
    res.location(authUrl).sendStatus(302);
    // The redirect has been sent; return so the protected route doesn't run.
    return;
  }
  next();
};

First, bring in a library for reading the .env file; dotenv does this beautifully. Then you’ll implement the ensureAuthenticated() middleware that starts the first step of the authentication process. It begins by checking whether the user is already logged in; if they are, it just calls next() because there is nothing to do.

If there isn’t a currently logged-in user, it builds a URL from the issuer, clientId, redirectUrl, and state properties in the .env file, targeting the /v1/authorize endpoint of the issuer’s URL, and then redirects there. That is a login page hosted by Okta, much like being redirected to Google to log in with Google as the identity provider. When login is done, Okta will call the http://localhost:3000/auth/callback URL that’s in the .env file. I’ve also appended the URL the user was originally going to onto the state query parameter, which will make it easy to direct them back there once they’ve logged in.

Next, you’ll need to implement the /auth/callback route to handle the result from the login page and exchange the authorization code you receive from Okta for an ID token. Create a file called authcontroller.ts in the controllers folder with the contents:

import { Router } from 'https://deno.land/x/opine@0.12.0/mod.ts';
import { config } from "https://deno.land/x/dotenv/mod.ts";

const auth = new Router();

// auth routes
auth.get('/callback', async (req, res) => {
  const { issuer, clientId, clientSecret, redirectUrl, state } = config();

  if (req.query.state.split(':')[0] !== state) {
    res.send('State code does not match.').sendStatus(400);
    // Bail out so the token exchange never runs with a bad state value.
    return;
  }

  const tokenUrl: string = `${issuer}/v1/token`;
  const code: string = req.query.code;

  const headers = new Headers();
  headers.append('Accept', 'application/json');
  headers.append('Authorization', `Basic ${btoa(clientId + ':' + clientSecret)}`);
  headers.append('Content-Type', 'application/x-www-form-urlencoded');

  const response = await fetch(tokenUrl, {
    method: 'POST',
    headers: headers,
    body: `grant_type=authorization_code&redirect_uri=${encodeURIComponent(redirectUrl)}&code=${code}`
  });

  const data = await response.json();
  if (response.status !== 200) {
    res.send(data);
    return;
  }

  const user = parseJwt(data.id_token);
  req.app.locals.user = user;
  req.app.locals.isAuthenticated = true;
  res.location(req.query.state.split(':')[1] || '/').sendStatus(302);
});

function parseJwt(token: string) {
  const base64Url = token.split('.')[1];
  const base64 = base64Url.replace(/-/g, '+').replace(/_/g, '/');
  const jsonPayload = decodeURIComponent(atob(base64).split('').map(function (c) {
    return '%' + ('00' + c.charCodeAt(0).toString(16)).slice(-2);
  }).join(''));
  return JSON.parse(jsonPayload);
}

export default auth;

There is actually a lot less going on here than you might think. First, the imports bring in the Router from Opine and read in the .env file again. Then the router is instantiated, as in the usercontroller.ts file. The next thing I did was destructure the config object to make the values easier to use. Next, I checked the state query parameter to make sure it matches, which helps ensure that it was Okta who sent the authorization code. Then the authorization code gets pulled off the query string with req.query.code.

What happens next is a call to the token endpoint. You’ll send the authorization code in a POST request to Okta to exchange for an ID token, so I’ve built some headers for the request. The most important is the Authorization header, which has a value of Basic {yourClientId}:{yourClientSecret}, with the client ID and secret base64-encoded. Then the POST call is finally made to the token endpoint with those headers and a body containing a grant_type of authorization_code, the same redirect URL as before, and the authorization code just received from Okta.

The fetch() call returns a promise, which is awaited. I get the response object’s JSON value, make sure the call was successful, parse the id_token value with the parseJwt() function below, and stick it into a local variable called user. Finally, the user is sent on to the URL they originally requested before being redirected for authentication.

Run the Deno Application

You can now run the application from the terminal again with:

deno run -A index.ts

Once it’s running, you will be able to click on the profile link on the home page and be redirected to Okta’s hosted login page. Once you’ve logged in, you’ll be directed back to the profile page and see your ID token’s properties displayed in a list.

Learn More About Deno and Node.js

The completed code for this post is on GitHub. If you want to learn more about Deno and Node, check out the links below.

Painless Node.js Authentication
Build a Node.js API with TypeScript
Build and Understand Express Middleware Through Examples

If you liked this post, follow us on Twitter and subscribe to our YouTube Channel so you never miss any of our awesome content!


MyKey

MYKEY Weekly Report (September 7th~September 13th)

MYKEY Weekly Report 16 (September 7th~September 13th)

Today is Monday, September 14, 2020. The following is the 16th issue of the MYKEY Weekly Report. Last week (September 7th to September 13th), there were three main updates:

1. HashKey Hub & MYKEY launched a new round of BTC financial products

MYKEY and third-party partner HashKey Hub launched a new round of 5% 30-day fixed-term BTC financial products on September 8, 2020. Both parties will further deepen their cooperation and jointly explore the development of digital currency financial products.

2. MYKEY officially supports TRON and starts a public beta, airdropping 80,000 TRX to users

From September 8, 2020, 12:00 to September 15, 2020, 12:00 (UTC+8), users who enable a MYKEY TRON account during the activity will get 10 TRX. For details, please click: https://bit.ly/33c7O8s

3. The seventeenth MYKEY Crypto Stablecoin Report was published

We release the MYKEY Crypto Stablecoin Report every week, sharing our interpretation of the development status of stablecoins and analysis of their development trends to help participants in the crypto market stay up to date. The seventeenth Crypto Stablecoin Report was published on September 10th; click to read: https://bit.ly/3if2lUE

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall the MYKEY app; please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to properly back up the 12-word recovery phrase via [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report (September 7th~September 13th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 13. September 2020

KuppingerCole

KuppingerCole Analyst Chat: Meet the Citizen Developer

Alexei Balaganski and Matthias Reinwarth look at the citizen development movement and discuss the potential risks of letting business users create their applications without proper governance and security.



Friday, 11. September 2020

Evernym

Building the Future of Staff Access: Supporting INTEROPen’s Staff Passport Hackathon

We’re proud to announce that we’ll be supporting the world’s first Digital Staff Access Hackathon, which will be held virtually September 21-22. The hackathon is organized by INTEROPen with support from the UK’s National Health Service (NHS), the world’s fifth-largest employer with 1.5 million employees. The NHS has been an early champion of verifiable credentials […]

The post Building the Future of Staff Access: Supporting INTEROPen’s Staff Passport Hackathon appeared first on Evernym.


SELFKEY

Liquidity Pools: The Foundation of DeFi

Liquidity pools are the foundation of the DeFi ecosystem. Moreover, they try to resolve some specific issues that have been bugging the crypto community for a long time.

The post Liquidity Pools: The Foundation of DeFi appeared first on SelfKey.


Nyheder fra WAYF

Ilisimatusarfik now a member of WAYF

Greenland’s university, Ilisimatusarfik, has today joined WAYF as a user organisation. Its students and staff can now identify themselves as Ilisimatusarfik users to the many web services in WAYF and eduGAIN that are relevant to research and education.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Welcome Nick Gamb

My name is Nick Gamb and I am excited to be joining the Okta Developer Advocacy team for the .NET community after a year and a half as a developer specialist for enterprise sales. The crazy pace and high pressure of enterprise sales was an interesting experience that allowed me to rapidly become intimately close to the Okta platform. As great as drinking water from a fire hose was, I longed to be back doing what I love most… providing a great experience for developers… and breaking stuff to see how it works.

Who Am I

At heart, I am just an inquisitive nerd who has had a very fortunate career getting to do many different things. I love video games, anime, boating, computers, programming, hacking, security, dev ops, data, animals, nature, film, photography, sound design, editing… it’s a long list. For the brave and interested, the longer, but still condensed, version of my professional story is below. Outside of my profession, my life is my family, and I would not have it any other way. I have an amazing wife who is my best friend. I have two beautiful daughters who fill my life with love, joy, and chaos. I have three big dogs who constantly keep us moving, cuddled, and slobbery. I have a full, happy life.

Doing All the Things

I have a personality trait that has been both good and bad for me. I am interested in everything and can’t settle on one thing. This has been the case since before young Nick decided to go to film school, when he spent all his spare time taking every electronic apart (to see how it worked) and hacking up video game engine code (Never-ending thanks to John Carmack for id Tech!). While tech might have been the obvious path for me, my MO seems to be to try to do all of the things. Hang on tight…whiplash ahead.

Invent, Transform, Create, and Destroy

I have always loved technology… especially taking it apart. As a kid, nothing was safe from my screwdriver. This destructive inquisitiveness paved the way to a healthy and productive love for building computers and programming them… and exploiting them. I built my first computer with a Pentium 200 MHz processor, 16 MB RAM, and Windows 95. This gaming rig was a beast and ran Quake like a champ. As great an achievement as this was, especially without Google to guide me, it was later that my interest in hardware turned to software, sparked by the release of the id Tech 1 engine source code. That was the beginning of a snowball effect that inevitably ended with me learning the foundations of programming, then expanding into DirectX/OpenGL and physics. I was no longer limited to just playing games like everyone else. I now had the power to invent, transform, create, and destroy. But even more exciting to me was when I discovered how video games can be used to influence people.

The Detour

So you will understand everyone’s surprise and confusion when young Nick started applying to film school instead of pursuing tech. Because of this detour I have worked on the set of major TV sitcom productions, I have been a director for a Fox News affiliate, I have been a chief editor/photojournalist for an NBC news affiliate, and I have hung suspended out of the side of a helicopter with a camera that cost more than I made in 2 years. Really cool right? I hated it. From a young age, I was really drawn to the way that a story can affect people’s emotions and perceptions. It is why I was so drawn to video games and game engines. It’s also why I am so interested in psychology. Film to me was a draw because of its ability to reach more people than any other medium. Sure, I also love photography, sound design, and editing just as much as everyone else - but the drive for me was in using the art and the technology to make people feel… anything… everything. Unfortunately, my romanticized vision of the film and broadcasting industry was not what I had hoped it would be, and ultimately that led me to decide to leave it.

The Reboot

I rebooted my career in tech climbing a common ladder… in support. I quickly automated my job with my programming skills and ended up on a team coding new apps for the support people to use. From there I moved on to sales engineering, programming data transformations, coding HP printer drivers and installers, professional services, and many other things. Every role I took I spent the majority of my time reverse-engineering everything and programming whatever interested me. I particularly liked finding ways to break or manipulate software. It was developing data transformations that got me into C# and .Net, and I have been faithful ever since. Yes, I have had a tryst or two with the likes of Python and Java, but C# is and will always be my one true love.

Finding Focus

That brings me to identity security and the part where I finally found a focus. In early 2014 I was brought on to build professional services for a new cloud identity solution that was getting rave reviews from Gartner. It did not take long for me to reverse engineer the platform and start creating against it. When my code started to be sold, it grabbed the attention of our leadership. You would think it was positive attention, since I had just identified a new market for revenue generation, but apparently there are other things to consider, like support, code quality, and APIs built to scale. It’s not only about breaking stuff and putting it back together in different ways. The storm that followed was my entry into corporate politics and all of the fun that brings.

Burnout

Fast forward to 2016. I had managed to convince the powers that be that we needed to be building a platform, not just a product. A platform that could be adopted by developers, not used by just an IT Admin. One that could power identity security across the internet. The product was rapidly changing in this direction and I began building a new developer experience/relation organization. I was the head of developer relations but I was also solo for the majority of it… which was a fast track to burnout. The struggle became real in 2017 when Okta, my competitor at the time, acquired Stormpath. How can one person compete with that?

Breaking Stuff and Telling People How It Works

Now it’s 2020. With one quick step into sales engineering again, I found my footing at Okta. Learning the platform was quick and easy considering my experience as an identity security expert and developer advocate. Sales, however, is not where my passion lies. In joining Okta Developer Advocacy, I am finally back to doing what I love most: breaking stuff and telling people how it works! My focus as an Okta Developer Advocate is .NET/C#, but you can expect to see a variety of things from me that are as wide-ranging as my never-ending list of interests, passions, and skills. .NET, security, DevOps, even video games and XR are on the list, along with much more.

Stay tuned and buckle up. This is going to be an exciting ride!

Let’s Connect

Let’s connect on: Twitter, Twitch, YouTube, GitHub

Or email me nick.gamb@okta.com!

Check out my recent blog post here on Identity Security for Games in C# with Unity!

We are always posting new content. If you like this content, be sure to follow us on Twitter and subscribe to our YouTube Channel.

Thursday, 10. September 2020

IdRamp

World’s first Verifiable Credential Service Delivery Platform – Video


KuppingerCole

The Democratization of Cybersecurity

Over the past few decades, companies have gone to great lengths to improve their IT security and thereby protect their data and networks. One consequence of this is becoming ever more visible: CISOs and their teams have to manage a (too) large number of products and solutions that are meant to serve enterprise security. Often the sheer mass of solutions alone creates high complexity and annually rising total cost of ownership (TCO), while the added security value of an ever-growing number of tools is questionable.




Forgerock Blog

ForgeTalks: What is Single Sign-On?

At ForgeRock we help people access the connected world. How do we achieve it? In part, with an important digital identity tool called Single Sign-On (SSO). What is SSO? How does it work? What is the purpose of it? I was joined this week by ForgeRock's VP of Product & Solution Marketing, Ashley Stevenson, who took me through the ins and outs of Single Sign-On, using an incredibly helpful (and slightly nostalgic) analogy.

We take a look at:

What is SSO and what are its benefits?
How do privacy and security tie into SSO?
What is federated SSO?

If you enjoy this episode, make sure you check out our previous episodes here. Next week I’ll be meeting with ForgeRock’s VP of Cloud Success, Renee Beckloff, to debunk common myths around moving to the cloud.


One World Identity

IDology: Multi-layered Identity Verification

IDology Chief Operating Officer Christina Luttrell joins State of Identity to discuss the ongoing impact of COVID-19 on the customer experience across industries, how fraudsters’ targets have changed, and the technologies helping to keep friction low without increasing risk.


Trinsic (was streetcred)

Combining Verifiable Credentials and Smart Contracts for COVID-19 Data Management

Over the past few months, a number of apps have been developed that use verifiable credentials to streamline the transfer of data in the COVID-19 testing process. However, Bart Cant, Founder and Managing Partner of Rethink Ledgers, decided to use a unique approach in developing his prototype app. Not only did he use verifiable credentials for the secure sharing of COVID-19 test results and vaccine information, but he also based his app on smart contracts. 

Bart’s app uses the Trinsic platform for the verifiable credentials portion, so we interviewed him further about his app, how it works, and future plans.

Let's dive right in. What is the app called, and what does it do?

The app is called “State Surveillance System for Covid19 Testing and Vaccine Distribution Management”. It is a prototype app developed using DAML (Digital Assets Modeling Language) and W3C’s verifiable credentials. The app showcases a prototype solution that provides a digital, secure experience for citizens, health clinic providers, and state agencies to share COVID-19 test results, “proof of vaccine” administration, and other “immunity proofs” using a centralized ledger.

What inspired you to build this app? What problems are you trying to solve?

My background has been in digital transformation, and I spent the last few years building blockchain-based solutions for large enterprises at Capgemini and IBM. While blockchain and distributed ledger technology provide many opportunities, I realized in the last few years that for many use cases and processes, there are some components of blockchain that are really valuable (e.g., cryptography, tamper-proof evidence), but some processes could be better served in a more centralized way. 

When the COVID-19 pandemic hit, I wanted to apply my knowledge and talent, so I decided to join the COVID-19 Credentials Initiative (CCI). By contributing to the use case and development of a prototype application, I hope to contribute to building solutions that can help keep people safe and bring back some normalcy in our daily lives.

The problems I am trying to solve include the secure and timely sharing of COVID-19 testing data. Specifically in the case of COVID-19 testing, it is essential that all test results are communicated in a secure, privacy-enhancing, and safe environment. Secondly, accurate and timely data is critical for policy setting, so the app also shares data with state or local agencies in an automated way.

How does the app use both smart contracts and verifiable credentials?

As I mentioned previously, the app was developed using a combination of the DAML privacy-enhancing smart contract language and W3C’s verifiable credentials. DAML guarantees privacy by coding the solution as rights and obligations: information is stored such that it cannot be accessed unless the accessing party has been explicitly configured in the smart contracts to access it. DAML is also one of the few solutions that allows interoperability across networks. The solution is built on a centralized-ledger-based platform (https://projectdabl.com/) but could easily be deployed on other blockchain networks (e.g., Hyperledger Fabric, Sawtooth, Besu or R3 Corda), centralized ledgers (e.g., AWS QLDB), or traditional databases (e.g., Postgres).
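As a rough conceptual sketch of the “rights and obligations” idea, here is how the access rule might look in TypeScript. This is an illustration only, not DAML and not the app’s actual contract code; all names are invented.

// A record names its parties explicitly; reads are checked against that list.
interface TestResultContract {
  patient: string;       // the party who owns the result
  clinic: string;        // the issuing party
  observers: string[];   // parties the patient has explicitly authorized
  result: 'positive' | 'negative';
}

function canRead(contract: TestResultContract, party: string): boolean {
  return party === contract.patient
      || party === contract.clinic
      || contract.observers.includes(party);
}

const record: TestResultContract = {
  patient: 'alice',
  clinic: 'clinic-01',
  observers: ['state-agency'],   // configured up front, like a DAML observer
  result: 'negative',
};
console.log(canRead(record, 'state-agency')); // true: explicitly authorized
console.log(canRead(record, 'mallory'));      // false: never granted access

In DAML this check is enforced by the ledger itself rather than by application code.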

The second major technical component of the solution is how it interacts with verifiable credentials. Verifiable credentials are used to securely transmit COVID-19 test information from the health clinics to the patient. The application provides a digital workflow for this experience (including a mobile experience) while still being HIPAA compliant and providing a safe and secure mechanism for sharing COVID-19 test data that allows the recipient to be in control of his or her data and decide on what to share and with whom.

The application is already looking forward to the future situation where COVID-19 vaccines will become available. When vaccines are available, at least a portion of the population will receive the COVID-19 vaccine and therefore will be better protected from contracting the virus or potentially infecting others in their communities. By providing verifiable credentials for vaccine proofs, we enhance the trust factor of sharing and disclosing medical information.

What role does Trinsic play in the app? How was Trinsic's platform helpful in the development of your app?

Trinsic is a critical component of the overall architecture of the solution. It is the glue for connecting the smart contracts to the distribution of verifiable credentials. 

It uses the Trinsic Studio for setting up DIDs (Decentralized Identifiers) and schemas and can be used as a monitoring tool for connections and distribution of the verifiable credentials. The Trinsic Studio is an easy-to-use web interface for managing verifiable credential exchange with no code. It’s so easy to use that it makes it possible to issue a verifiable credential in less than five minutes! The Trinsic Wallet is also used for accepting and presenting the verifiable credentials.

How does this solution balance the need to track, analyze, and share COVID-19 related data and the privacy concerns that come with that?

With COVID-19, there is a great emphasis on privacy and regulation. Verifiable credentials allow you to share medical data directly from the issuer to the holder without the need to store the data, and smart contracts can be leveraged to orchestrate this process. Smart contracts can also be used to anonymize data, for instance by using DIDs. Further, the critical medical data can be blinded and encrypted by the DAML smart contract, so the solution can be both HIPAA and GDPR compliant.

What is the next step from here?

Since the release of my prototype and associated video, I have received a lot of feedback from healthcare professionals and self-sovereign identity advocates. I am also exploring further enhancements for a variety of use cases. There are many similar opportunities to leverage smart contracts and verifiable credentials for COVID-19 tests or vaccines in the entertainment, travel, and education industries.

For example, the solution could be very helpful in international travel. By using the app, tourists from the US that are planning to go to the Caribbean islands can easily share their pre-boarding screening information about COVID-19 tests or vaccines and then provide additional test information upon returning back to the US, avoiding long lines upon entry or mandatory quarantine requirements.

(end of interview)

Recently, Trinsic CEO Riley Hughes joined Bart and DAML representatives in a webinar to discuss and demo the prototype app. To view the webinar, click here. If you are building a COVID-19 related solution that uses verifiable credentials, we are offering the Trinsic platform free of charge. Feel free to reach out to us for more information.

The post Combining Verifiable Credentials and Smart Contracts for COVID-19 Data Management appeared first on Trinsic.


Nuggets

We’re now members of The Open Identity Exchange

We’re now members of The Open Identity Exchange

We’re delighted to announce that we are now members of The Open Identity Exchange (OIX). Formed in 2010 to address the increasing challenges of building trust in online identity, OIX is a membership organisation uniquely dedicated to ID Trust.

Fellow members include buyers of ID services, solution providers, and regulators, among them companies such as Barclays, Microsoft, LexisNexis, HSBC and IAG. All members collaborate and work together to drive adoption of ID Trust.

As a member we can now collaborate on thought-leadership initiatives and sector specific working groups. We’ll be able to support pilot projects and work with the other members, a full list of whom you can find here.

With Nuggets, we’ve built a platform where verified self-sovereign digital identity is a key component in giving consumers the chance to take back control of their data. We’re excited to become the latest member and share OIX’s bigger vision of a world where anyone can prove their identity using a simple, universally trusted ID.

We look forward to connecting with other members and participating in the conversations around building trust in online identities.

On joining, Nick Mothershaw, OIX CEO, said: “I am very excited to welcome Nuggets as OIX’s latest new member. It’s great to see more entrants into the Identity Provider space. Relying parties need more than just trusted ID services: payments capability and access to eligibility information really helps round off the ID offering — a connection that Nuggets have made and designed for from day one.”

To learn more about OIX, visit their site — https://openidentityexchange.org/

We’re now members of The Open Identity Exchange was originally published in Nuggets on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

OScore: Everything You Need to Know

OScore is a self-sovereign credit evaluation system based on user data stored on the Ontology blockchain. With full integration of ONT ID, Ontology’s decentralized identity framework, OScore supports cross-chain interaction and verifiable credentials, connecting user identities with personal accounts on the Ontology blockchain, eliminating third parties from the process. Once a user authorizes their financial data, Ontology’s OScore system generates a quantifiable credit score, while ensuring the user’s privacy is fully protected.

OScore gives users full control over their digital assets and contract addresses, with full autonomy over who has access to their data. Users can bind their digital assets or contract addresses to their ONT ID, and unbind them whenever they wish. It is entirely up to the user to authorize third-party access to their data. Data used in OScore calculations includes the user’s digital asset balance, historical holding records, transaction records, and records of dApp usage.

OScore is an independent on-chain reputation system, generated from specific on-chain user data that is not associated with the user’s off-chain identity. In this way, OScore protects the user’s privacy and supports full anonymity.

In the financial sector, a huge chasm remains in data collaboration between the off-chain and on-chain worlds. For most blockchain users, on-chain assets and actions constitute a significant aspect of their financial profile. However, most credit-related scenarios in the world today still rely on the user’s off-chain information, without taking into account that their on-chain data could also give insight into their credit history.

OScore seeks to bridge the gap between users and DeFi services, allowing users to enjoy the DeFi projects, dApps, and blockchain platforms suited to their needs. Users can manage their on-chain credit data at will, and if they have a positive credit history, they can put their credit data to use in more rewarding financial products, all the while knowing that their privacy is safe from intrusion. In addition, users can bind their OScore and off-chain real-name data to their ONT ID, pooling their off-chain and on-chain data to form a more complete credit profile that can be applied in both worlds.

OScore finds an optimal application in Wing, a DeFi project based on the Ontology blockchain. Wing integrates OScore into its credit evaluation module to support its DeFi services. Each Wing product pool can use OScore to increase the credibility of its DeFi services. Users can benefit from OScore in the following ways:

OScore helps users avoid over-collateralization when looking to borrow a variety of digital assets

An OScore is generated for the user based on a hybrid calculation of their digital identity and assets. Users with a high OScore may enjoy under-collateralization when borrowing digital assets, such as collateralizing 8 bitcoins to borrow 10.
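The arithmetic behind that example can be sketched as follows; the function and both ratios are illustrative assumptions, not Wing’s actual parameters:

// Amount borrowable for a given collateral and required collateral ratio.
function maxBorrow(collateralBtc: number, collateralRatio: number): number {
  return collateralBtc / collateralRatio;
}

// Conventional over-collateralized lending, e.g. a 150% requirement:
console.log(maxBorrow(8, 1.5)); // ~5.33 BTC against 8 BTC of collateral

// A high OScore lowers the requirement below 100% (under-collateralization):
console.log(maxBorrow(8, 0.8)); // 10 BTC against 8 BTC, as in the text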

OScore users can enjoy WING incentives

Based on the user’s level of participation, the product pool will distribute extra WINGs in every transaction as incentives for crediting participants with an OScore. Users who return their credited assets on time will receive extra OScore points as a reward. Other DeFi projects or credit-evaluating products can also utilize OScore for similar functions.

OScore protects user privacy, even when they have not met contractual terms

Users who breach terms will be listed on a privacy-protecting OScore list. Even so, they retain control over which entities have access to this list, having completed the authorization process before joining the crediting activities.

OScore offers a framework where users can find credit evaluation tools that serve different purposes. Every user can write their own credit evaluation algorithm or reuse a credit score issued by a trusted third party. From the user’s perspective, OScore can be used to de-identify their credit data and provide verifiable credentials, with scoring algorithms and credit data serving as trust endorsements for credit-related business.

Currently, OScore’s primary credit evaluation mechanism calculates the user’s balance and historical holding records of digital assets, and it supports blockchain platforms based on BTC, ETH, and ONT. In the future, updated OScore systems will continue to incorporate more types of on-chain data and evaluation mechanisms. In addition, Ontology plans to integrate OScore into the ONTO wallet, and relevant SDKs and APIs will be released to allow integration and deployment by third-party apps.
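As a purely hypothetical illustration of how balance and holding history might combine into a bounded score, consider the sketch below; none of the names, weights, or caps come from Ontology’s actual OScore algorithm:

// Invented inputs standing in for the on-chain data the text describes.
interface OnChainProfile {
  balanceUsd: number;      // current digital asset balance
  avgHoldingDays: number;  // average historical holding period
  txCount: number;         // number of on-chain transactions
}

function oScore(p: OnChainProfile): number {
  // Log-scale the balance so large holders don't dominate, reward long
  // holding periods, give mild credit for activity, and clamp to 0-1000.
  const balancePart = Math.log10(1 + p.balanceUsd) * 100;
  const holdingPart = Math.min(p.avgHoldingDays, 365);
  const activityPart = Math.min(p.txCount, 100);
  return Math.min(Math.round(balancePart + holdingPart + activityPart), 1000);
}

console.log(oScore({ balanceUsd: 25000, avgHoldingDays: 200, txCount: 40 }));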

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

OScore: Everything You Need to Know was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


SELFKEY

The August Progress Report is here!🔔

SelfKey Weekly Newsletter

Date – 09th September, 2020

In this edition, read all about our monthly progress report and more.

The post The August Progress Report is here!🔔 appeared first on SelfKey.


MyKey

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 billion

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 billion, Decentralized payment protocol Celo

Original link: https://bihu.com/article/1131020362

Original publish time: September 8, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We release the MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay up to date. The MYKEY Crypto Stablecoin Report will be published every week; we look forward to maintaining communication with the industry and exploring the development prospects of stablecoins together.

Quick Preview

Last week, the market capitalization of major stablecoins increased by $981 million to $17.544 billion.
Last week, Tether additionally issued 270 million USDT on Ethereum and 500 million USDT on Tron.
DeFi on EOS continued to heat up. On September 4, Tether additionally issued 25 million USDT on EOS, followed by another 50 million USDT on September 5. The circulation of USDT on EOS reached 90.25 million.
Celo has special features in its address-based encryption scheme, reserve assets, and shared reserve. The market capitalization of CELO has reached $423 million, while the circulation of cUSD is only 7.33 million.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins in the past week (August 29, 2020 ~ September 4, 2020, same below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market circulation of major stablecoins has increased by $981 million to $17.544 billion.

Source: MYKEY, Coin Metrics

In the past week, Tether additionally issued 270 million USDT on Ethereum and 500 million USDT on Tron. The circulation of USDC, PAX, HUSD, DAI, and GUSD increased by 230 million, 10.91 million, 5.85 million, 19.16 million, and 650,000, respectively, while the circulation of BUSD and TUSD decreased by 2.69 million and 52.21 million.

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of holding addresses of major stablecoins on Ethereum increased by a total of 142,707.

Source: MYKEY, DeBank

The number of holding addresses of USDT, USDC, PAX, TUSD, and DAI increased by 129,348, 8,210, 216, 582, and 4,351, respectively.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week decreased by an average of 1.15% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins increased by an average of 12.01%.

The 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins last week increased by an average of 12.05% from the previous week.

2. Decentralized payment protocol Celo

Stablecoins on the blockchain combine the advantages of digital currency with price stability. The payments market has very broad prospects and is an area many people are optimistic about. The decentralized payment protocol Celo includes stablecoins that can be used for payment, aimed mainly at mobile payment systems. Celo is based on the two core values of ‘special purpose’ and ‘connection’; its pronunciation, /ˈtselo/, means ‘purpose’ in Esperanto. In this report, we will introduce what makes Celo different.

Address-based Encryption

With the development of stablecoins, USDT and USDC, with their stable value, gradually took over Bitcoin’s original role of ‘point-to-point payment’. When using these stablecoins for payment or transfer, we must first know the address of the other party, which raises the barrier to using stablecoins.

Celo proposed an address-based encryption scheme. The user generates a public/private key pair in the traditional way and then registers the public key in a public database, which stores [address -> public key] tuples. Users can attach any character string as the address under which keys are stored, and each address can store multiple public keys. Through this protocol, users can use e-mail addresses, mobile phone numbers, etc. as addresses, provided they can receive secure messages through them.
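A minimal sketch of such a registry, with everyday identifiers acting as lookup keys; the names are illustrative, not Celo’s actual API:

// The public database of [address -> public key] tuples described above.
const registry = new Map<string, string[]>();

function registerKey(address: string, publicKey: string): void {
  const keys = registry.get(address) ?? [];
  keys.push(publicKey);               // each address may store multiple keys
  registry.set(address, keys);
}

function lookupKeys(address: string): string[] {
  return registry.get(address) ?? []; // a payer resolves keys before encrypting
}

registerKey('+1-555-0100', 'pubkey-A'); // a phone number used as an address
registerKey('+1-555-0100', 'pubkey-B'); // a second device, same identifier
console.log(lookupKeys('+1-555-0100')); // ['pubkey-A', 'pubkey-B']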

The digital currency wallet disclosed by China Construction Bank last month has a similar feature: the digital RMB can be transferred either to the other party’s wallet address or to their bank account.

Price Stabilization Mechanism

The Celo protocol contains two tokens: the reserve token CELO (formerly known as Celo Gold) and the stablecoin Celo Dollar (cUSD), used for payment. Although the supply of cUSD is also elastic, unlike Ampleforth, Celo maintains the value of cUSD by adjusting its total supply without changing the number of stablecoins held by users. When the price of cUSD is above $1, arbitrageurs can buy $1 of CELO, exchange it with the protocol for 1 cUSD, and sell the cUSD on the market. When cUSD is below $1, arbitrageurs can buy cUSD on the market and exchange it for $1 of CELO.
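
As a minimal sketch, that arbitrage rule can be written out as below, assuming a hypothetical price feed; the names are illustrative and this is not the Celo protocol’s API.

using System;

public static class CUsdArbitrage
{
    // Decide which arbitrage trade (if any) pushes cUSD back toward its $1 peg.
    public static string Decide(decimal cUsdMarketPrice)
    {
        if (cUsdMarketPrice > 1.00m)
        {
            // cUSD trades above the peg: mint at face value, sell at the premium.
            return "Buy $1 of CELO, exchange it with the protocol for 1 cUSD, sell the cUSD on the market.";
        }
        if (cUsdMarketPrice < 1.00m)
        {
            // cUSD trades below the peg: buy at the discount, redeem at face value.
            return "Buy 1 cUSD on the market, exchange it with the protocol for $1 of CELO.";
        }
        return "cUSD is at the peg: no arbitrage available.";
    }
}

Each trade expands or contracts the cUSD supply until the market price converges back to $1.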

Diversity of Reserve Assets

In MakerDAO, the collateral consists of tokens selected by MKR holders through voting, such as ETH and WBTC, while Synthetix uses its native token SNX as collateral. Celo’s reserve assets include not only the native token CELO but also commonly used cryptocurrencies such as BTC, ETH, and DAI. This diversifies the collateral and makes it easier to expand the scale of cUSD. Initially, the target composition of the Celo reserve was 50% CELO, 30% BTC, 15% ETH, and 5% stablecoins.

Source: celo.org

Shared Reserve System

At present, cUSD is the only stablecoin in Celo, but Celo positions itself as a multi-asset crypto protocol for decentralized payments and has introduced a shared reserve system. For example, if the protocol introduces a Celo Euro pegged to the euro, the Celo Euro will use the same reserve as the Celo Dollar. When the supply of a stablecoin expands, the protocol issues new stablecoins and uses them to buy a basket of crypto assets; when demand shrinks, the protocol uses the same mechanism to sell reserve assets for stablecoins. In this way, the protocol plays the role of a central bank in traditional finance, maintaining stablecoin prices through open market operations.

The shared reserve system can also make this process more efficient. For example, when demand for Celo Euros decreases, the protocol first checks whether the supply of Celo Dollars needs to expand before selling reserves. If so, it issues Celo Dollars and exchanges them directly for Celo Euros at the current exchange rate.
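
A rough sketch of that shortcut, assuming hypothetical names, amounts, and a given EUR/USD rate (not the protocol’s actual code):

using System;

public static class SharedReserve
{
    // excessCeloEur: cEUR supply that must be removed; celoUsdShortfallUsd: new cUSD demand in dollars.
    public static void ContractCeloEuro(decimal excessCeloEur, decimal celoUsdShortfallUsd, decimal eurUsdRate)
    {
        // First absorb as much of the excess cEUR as new cUSD demand allows,
        // swapping directly at the current exchange rate instead of selling reserves.
        decimal absorbedEur = Math.Min(excessCeloEur, celoUsdShortfallUsd / eurUsdRate);
        if (absorbedEur > 0)
            Console.WriteLine($"Issue {absorbedEur * eurUsdRate} new cUSD and exchange them directly for {absorbedEur} cEUR.");

        // Only the remainder requires selling reserve assets on the open market.
        decimal remainderEur = excessCeloEur - absorbedEur;
        if (remainderEur > 0)
            Console.WriteLine($"Sell reserve assets to buy back {remainderEur} cEUR.");
    }
}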

The shared reserve system does not require every new stablecoin to use it; the Celo protocol also allows new tokens to be issued against their own reserves. Introducing a new stablecoin into the shared reserve requires thorough consideration: a stablecoin that harms the ecosystem could have a marginally negative impact on the stability of the other currencies. A new stablecoin should only be introduced when people generally expect it to increase long-term demand for the monetary system.

The Value of CELO

CELO is also the governance and incentive token of the Celo protocol, and holders benefit from the growth of the network.

As usage of Celo increases and new stablecoins are introduced, more CELO needs to be locked as collateral, so demand for CELO will increase.

To maintain the BFT consensus of the Celo network, validator nodes are elected, and validators must stake a certain amount of CELO.

The block rewards of Celo are distributed to those who contribute to the system: those who maintain its stability by electing validators, verifying transactions, verifying users, and participating in the Schelling-point price discovery mechanism, as well as those who bear the risk of contraction, use the protocol as a payment method, invite others to use it, or improve the protocol.

CELO holders govern the protocol by voting. For example, a proposal to introduce a new stablecoin can be made, and once the vote reaches a certain threshold, the new stablecoin is introduced into the shared reserve.

Holding CELO also carries risk: if demand for Celo stablecoins shrinks over a long period and supply tightens, the value of CELO may decline.

Celo Usage

The market capitalization of CELO has reached $423 million, and it is listed on Coinbase, the largest exchange in the United States. Celo has formed alliances with payment institutions, digital wallets, payment solution companies, digital asset lending companies, educational institutions, investment institutions, and others to promote the use of the Celo protocol.

But there are still few stablecoins in the Celo protocol. According to the latest data on the official website, cUSD remains the only stablecoin in Celo, and its circulation is only 7.33 million. The reserve holds CELO, BTC, ETH, and DAI, with each asset stored at a specific blockchain address.

Summary

Celo has distinctive features in its address-based encryption scheme, reserve assets, and shared reserve, which improve the usability of stablecoins, diversify the reserve assets, and make the issuance process more efficient. The market capitalization of CELO is already high, but the circulation of stablecoins in Celo remains low, and the range of stablecoins needs to grow.

3. Questions of Readers

1. Is DeFi disrupting the entire blockchain ecosystem?

Answer: DeFi is indeed reshaping the entire blockchain ecosystem on a large scale. For example, demand from DeFi has driven greater use of the oracle Chainlink. Because liquidity mining requires LP tokens, the liquidity and transaction volume of Uniswap have multiplied. The popularity of DeFi has made the Ethereum network very congested and pushed up miners’ fees, which may make it impossible for some other types of DApps to develop. DeFi is also changing the balance of power between public blockchains: EOS, which has an insufficient supply of USDT, lags behind Tron in this wave of the DeFi boom.

2. Why was SUSHI listed on three major exchanges in one day? Is it just because of the money-making effect?

Answer: It’s not just about making money. What is certain is that the token-listing strategy of centralized exchanges has changed; even the largest exchanges have developed FOMO about listing tokens. At the end of August, the daily trading volume of Uniswap surpassed that of Coinbase, the largest exchange in the United States, and centralized exchanges could no longer ignore decentralized ones.

There are also reasons specific to the SushiSwap project itself. Uniswap is currently the largest decentralized exchange, and the LP tokens received for deposits into Uniswap are transferable, which gives SushiSwap an opening. By offering SUSHI token incentives, SushiSwap encourages users to provide liquidity on Uniswap and stake the resulting LP tokens with SushiSwap, which SushiSwap can ultimately convert into its own liquidity. So far the approach has worked: in the past week, liquidity on Uniswap increased from $308 million to $1.9 billion, most of it LP tokens staked with SushiSwap for liquidity mining. If a large share of that liquidity ultimately stays with SushiSwap, Uniswap’s position will face a serious challenge.

Tips

To communicate better with industry insiders, we have added two sections: readers’ questions and guests’ opinions. If you have questions about stablecoins, please contact us; we will pick meaningful questions to answer in the next issue. We also welcome guests from the industry to share their views on stablecoins. Contact: jianghb@mykey.org.

That is all for this MYKEY Crypto Stablecoin Report. Stay tuned for follow-up reports, in which we will continue to interpret the state of stablecoin development and analyze its trends.

PS: MYKEY Lab retains the final right of interpretation of this article; please credit the source when quoting. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

MYKEY Crypto Stablecoin Report 01: USDT continues to gain momentum as market capitalization exceeding $10 billion

MYKEY Crypto Stablecoin Report 02: USDT suspended additional issuance and the usage scenario of USDT in Tron is single

MYKEY Crypto Stablecoin Report 03: Where are the users of DAI?

Crypto Stablecoin Report 04: Tether additional issued 300 million USDT, commenting on various decentralized stablecoins

Crypto Stablecoin Report 05: DAI Maintains Steady Growth, Exploring Use of DAI by Users of Centralized Exchanges

Crypto Stablecoin Report 06: The latest 13 additional issuances of USDT all occurred on Tron, driving the increase use of Tron

Crypto Stablecoin Report 07: Security Analysis of Stablecoins

Crypto Stablecoin Report 08: Interpretation of Digital Dollar Project

Crypto Stablecoin Report 09: Analyze the lending leverage of Compound

Crypto Stablecoin Report 10: Introduce the Algorithmic Stablecoin Project Terra (Luna)

Crypto Stablecoin Report 11: The circulation of stablecoins has overall increased, Holding AMPL a month for 51 times incomes

Crypto Stablecoin Report 12: USDT is additionally issued 690 million The use of stablecoins outside the cryptocurrency market

Crypto Stablecoin Report 13: The market capitalization of stablecoins reached $14.387 billion, Stablecoin pool Reserve

Crypto Stablecoin Report 14: The increase of Ethereum Gas Fee makes the transfers of stablecoin transactions on the blockchain

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

Crypto Stablecoin Report 16: The connection between stablecoins and real assets

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Smarter with Gartner - IT

Why Now Is the Time to Accelerate Digital

As organizations begin to shift from the recovery phase of COVID-19 to the renewal, many are focused on what comes next and capitalizing on the changes made to the business during COVID-19. The Gartner Board of Directors survey highlighted that seven out of 10 boards have accelerated digital business initiatives in the wake of COVID-19 disruption.

One question we hear frequently when discussing digital business strategy now is, “Why is this digitalization different from digital prior to COVID-19?” And the answer is pace — the pace of adoption. With a potentially short window of time before a resurgence of the virus or other related disruption, organizations need to act quickly.

The digital part isn’t new, but the acceleration part is crucial. This is about the urgent adoption of established patterns, many of them digital, but also patterns of psychology (how do we survive?), organization (are we equipping people with the best skills and cohorts?) and governance (do we need so much ritual, or can we move faster?).


Organizations have to change, now. There is no option to continue as they have; change is necessary to preserve the business and its future. While the economy was going well, more or less, for the past decade, organizations didn’t have to make hard choices or tackle underlying faults. Pre-COVID business and operating models could probably have been improved with certain digitalization efforts, but why change what’s working well enough? Now, the path forward is clear: Accelerate digital business or risk the survival of the organization.

Organizations need to be making better-informed and faster decisions, focusing on automation, real-time risk assessment and mitigation, continuous value delivery and agile strategy making. And they need to be doing it now.

The acceleration we are experiencing is the search for new solutions in the face of broken assumptions

Disruption is useful: We need it to make progress. The acceleration we are experiencing is the search for new solutions in the face of broken assumptions. We have an opportunity to make things better, rather than falling back on what worked before. Over time, we will build a new set of business practices, forever changed by COVID-19.

What does digital business acceleration look like?

The answer to this question will depend on your organization's strategies and business. Consider the larger goals and initiatives that drive the business — how can those be improved through effective use of known technology and information design patterns?

Consider automation in the call center. Many companies have been increasing this specific automation over the years, but now implementation is accelerating. COVID-19 meant people couldn’t physically be in call centers, and at the same time, many call centers experienced a massive uptick in demand.

Chatbots have become an efficient and, more importantly now, a necessary digital tool for the organization. It’s not a new technology; it’s just that organizations were pushed to find solutions, and now they have to be adopted at an accelerated rate.

How will we speed up?

Acceleration has some specific requirements, both in the digital and the physical worlds. First, you must banish drags: Remove those things that will slow you down. This could mean modernizing a piece of your legacy technology or removing unnecessary ritual from change management processes.

Second, you need fuel to boost you: These patterns, like call center automation or social marketing (proven implementations), can get you moving quickly. Third, you must adjust resources: Shift funding from less valuable initiatives to those focused on new channels, products or services that will have greater, faster returns.

Finally, seize value: Sweat the assets you own, like cloud platforms, that are likely underutilized and that could be used for rapid innovation and resilience.

What should we stop doing?

In the spirit of banishing drags, a number of Gartner analysts had a brainstorming thread with a simple question: What is the one thing that you should stop doing to go faster? Here are a few of their suggestions:

Stop trying to centralize everything: Provide autonomy and accountability throughout the organization.
Stop believing (and working toward the idea) that digital will transform your company into Google or Amazon or Alibaba. Digital transformation doesn’t alter the purpose of your company’s existence.
Stop staying in your lane. Change lanes to follow new demand.
Stop pining for the “good old days” of January 2020 and get on to January 2021.
Stop innovation initiatives that were conceived pre-pandemic. Reset. Then only restart innovation initiatives that make sense for the post-pandemic organization.
Stop doing staff appraisals/performance reviews (this year only).

What will digital business do for us?

Digital will “remote-stabilize” the organization to the right mix of remote and in-person business, which is a much-needed exercise after the initial pandemic push. It will enable organizations to react to external context and enable them to move resources in response to emergent needs in society. We’re looking for new patterns of scale that can be applied now, but can also be adjusted for future disturbances.

Right now, clients are focused on “Right-Scale”, which asks things like “What do I keep? What do I change?” For example, this could mean moving aggressively to cloud providers instead of using an on-site data center. Essentially, these are projects that organizations might have been doing before, but now they’re being done with more urgency and speed.

Another part of this is “Right-Balance,” which examines the balance between traditional analog options and digital. COVID-19 drove this transformation more quickly than some organizations would have otherwise chosen.

Working toward the future, organizations should be applying digital business acceleration to these dimensions:

The “everything customer,” who requires both deep personalization and ironclad privacy
Right-scoped growth, which may involve new customer segments and the abandonment of incumbent value propositions
A composable technology foundation that balances efficiency with resilience
An adaptable workforce, equipped with the skills, processes, information and autonomy to flex in the face of disruption
Any-scale operations that can spin up and down with demand and unforeseen circumstances

Part of this conversation can begin with strategic scenario planning, which facilitates agile strategy setting. As a tool, it offers a way to frame the discussion about hypothetical future business scenarios — and helps business leaders to identify future uncertainties and develop suitable action plans to respond to change and capture opportunity. Scenarios enable you to determine suitable action plans or strategies for different possible futures, framing strategic planning conversations right down to the functional level.

Read more: What Functional Leaders Should Know About Scenario Planning

To quote every television commercial, these are unprecedented times. But we were more prepared for them than we realized. A pragmatic optimism is emerging, grounded in known practices and patterns. We need to stay open, curious and future-focused. Now is the time to renew our businesses, our work and ourselves. Don’t miss this opportunity.

The post Why Now Is the Time to Accelerate Digital appeared first on Smarter With Gartner.

Wednesday, 09. September 2020

KuppingerCole

Identity and Access Management Strategies That Grow With Your Business

For these organizations, an adaptable Active Directory (AD)-centered approach can address the areas of highest impact. By adding cloud-based access request and access certification functionality to the mix, a company can achieve a basic IGA solution for a fraction of the cost, complexity, and deployment time. This approach also provides the opportunity to expand the scope beyond AD and Azure AD by embracing many non-Windows systems (such as Unix/Linux) and SaaS applications (via SCIM connectivity). Learn how to build a strategy for a modular approach to identity that can be custom-fit to company needs, size, complexity, and budget.

This webinar will equip you to:

Provision your AD and Azure environment – and beyond
Approach baseline IAM implementation efficiently
Plan a short track for improving security and compliance
Set up your identity program for ongoing success
Be well-positioned to expand to IGA and PAM when the time is right.

In the first part, KuppingerCole Principal Analyst Martin Kuppinger will give a brief overview of identity in general as well as of IAM, IGA and PAM strategies, and will look at what every business, regardless of size and industry, needs in IAM.

He will be joined by One Identity Field Strategist Dan Conrad, who will explain how to prioritize IGA capabilities for maximum impact and show why you should opt for a modular approach with AD-optimized tools.




Identity for All – Not for the Few

While digitalization is a major challenge for all kinds of enterprises, it is particularly hard for medium sized enterprises. For many years, medium sized enterprises have struggled to deploy Identity Management as well as Identity Governance (IGA) solutions and show true business value and a return on investment. This is largely due to the complexity of implementation and the level of configuration and customization to fulfill the requirements. As a result, many companies have deployed Access Management solutions like single sign-on only to find that while great for user convenience, they bring little to the table for proper lifecycle management and governance.




Continuum Loop Inc.

VC Terms – Watch Out – Get Educated

I have been talking to numerous founders lately about the various oddities and niceties of raising funds – and the crazy things that VCs and aggressive angels do. Over the years I have advised many companies and I thought that there was a page that gave the founder-centric view of things – but I can’t find it. So… Here we go.

General Terminology – CB Insights does a good job here. The key phrases for raising are the following (go to the page for the definitions):

pre-money valuation and post-money valuation – crucial terms; simple to get, but if you haven’t heard them they sound odd (see the worked example below)
liquidation preferences – and especially any mention of a full ratchet
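
To make the first pair concrete with hypothetical numbers: if a VC invests $2 million at an $8 million pre-money valuation, the post-money valuation is $10 million ($8M + $2M), and the investor owns 20% ($2M ÷ $10M of the post-money). A 1x liquidation preference on that same $2 million means the investor takes the first $2 million of any exit before common shareholders (founders and employees) see anything.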

Key warnings:

Be really careful about liquidation preferences.
Watch out for the employee stock option game that aggressive VCs play. This seemingly simple concept creates massive dilution that isn’t in a founder’s favour.

Interesting Approach – Vesting for Founders – One interesting thought is that some of the best places have quite long vesting periods, even for founders. AngelList, for example, has a 6-year vesting period – for everyone, including Naval.

 

The post VC Terms – Watch Out – Get Educated appeared first on Continuum Loop Inc..


IDENTOS

IDENTOS access control technology is part of an innovative $5.9 million federal investment for a health access platform

At launch, the TRUSTSPHERE platform will help connect families, caregivers and clinicians to improve patient-centered care for children with Type 1 diabetes at BC Children’s Hospital, and improve the process to donate clinical data to research

Toronto – September 9th 2020 

As part of the Digital Technology Supercluster’s $153 million investment to solve some of industry and society’s biggest problems through Canadian-made technologies, the TRUSTSPHERE platform is being developed by a consortium led by Careteam Technologies that includes IDENTOS, the University of British Columbia, SecureKey, Smile CDR, and MedStack. It will first be used by BC Children’s Hospital to help children with Type 1 diabetes.

TRUSTSPHERE’s Mandate

TRUSTSPHERE’s mandate is to empower people and patients with greater control over their health journey and enable healthcare providers with improved access to health data and clinical information through the creation of a scalable and trustworthy health access platform. Through the learning and success of this project, the consortium will establish a scalable commercial service that will become a digital solution model for other patient populations and healthcare delivery organizations to adopt across Canada and globally. 

Why IDENTOS?

Globally, IDENTOS designs and develops digital identity & access technology to meet modern demands of user-centricity, respect for privacy, and connectivity of distributed systems. The access control technology leveraged in this solution is based on an open specification, the Federated Privacy Exchange (FPX), built by IDENTOS to simplify and secure the scaling of identity-based transactions and experiences with user consent and authorization. TRUSTSPHERE will use FPX to catalyze project development across hybrid networks and cloud environments while diminishing the need for repeated custom integration costs in the future. As a result, patient data is secure and connected across a spectrum of trusted parties with confidence, compliance, and agility.

The TRUSTSPHERE platform

The TRUSTSPHERE platform is the front face to a robust health ecosystem that will first be accessed and used by BC Children’s Hospital. The ecosystem integrates both public and private sector partners that enable trustworthy collaboration, access and consent driven exchange of data. 

Benefits to health ministries and healthcare organizations 

Easily, securely and remotely verify a patient’s identity with a modern Single Sign-On approach
Manage multiple existing identities to provide a seamless user experience
Conveniently identify patient cohorts and connect them with their data and researchers
Improve health outcomes
Benefit from seamless patient connectivity across care teams, service providers, networks and jurisdictions


Benefits to patients and their families:

Easy remote and repeated access to digital health services launched within the TRUSTSPHERE service
Control sharing and revocation of personal health information
Discover relevant healthcare apps to collect and easily connect their data
Virtually connect with their care providers
Consent to donate their data for research

For more information about IDENTOS: http://www.identos.com
Follow IDENTOS on Twitter @identos_inc, and on Linkedin
Media contact & inquiries: hello@identos.ca

For Supercluster-related media inquiries: Kathleen Reid, kreid@switchboardpr.com

About IDENTOS
At IDENTOS, we believe that digital trust is the linchpin to improving digital access everywhere. Globally, we design and develop digital identity & access technology to meet modern demands of user-centricity, respect for privacy and distributed system interoperability. With IDENTOS, organizations connect with far greater confidence, compliance and agility. 

Our access control software, built in Canada, underscores our passion for improving people’s lives. We believe we can do this by securing privacy in the exchange of access to data anywhere. This requires the ability to operate at system-level scale but with user-level control. We empower our clients to do both in real time.

About the Digital Technology Supercluster
The Digital Technology Supercluster solves some of industry’s and society’s biggest problems through Canadian-made technologies. We bring together private and public sector organizations of all sizes to address challenges facing Canada’s economic sectors including healthcare, natural resources, manufacturing, and transportation. Through this ‘collaborative innovation,’ the Supercluster helps to drive solutions better than any single organization could on its own.  The Digital Technology Supercluster is led by industry leaders such as D-Wave, LifeLabs, LlamaZOO, Lululemon, MDA, Microsoft, Mosaic Forest Management, Sanctuary AI, Teck Resources Limited, TELUS, Terramera, and 1Qbit. Together, we work to position Canada as a global hub for digital innovation. A full list of Members can be found here.

The post IDENTOS access control technology is part of an innovative $5.9 million federal investment for a health access platform appeared first on IDENTOS.


Ontology

Ontology Weekly Report (September 1–7)

We have had a very exciting week here at Ontology as Wing, the first credit-based DeFi project based on the Ontology blockchain, released its first whitepaper. Integrated with Ontology’s decentralized identity and credit evaluation systems, Wing fuels cross-chain interaction for various DeFi products and decentralized governance.

We are also delighted to share that our tokens ONT and ONG can now be swapped to eONT and eONG on the Ethereum blockchain, and eONT and eONG are listed on the Uniswap platform.

Meanwhile, we continued performance testing with bloXroute by providing reliable technical assistance to our partners.

Back-end

- Completed 50% of Ontology GraphQL interface development

- The Rust Wasm contract development hub released ontio-std v0.3

Product Development

ONTO

- ONTO v3.3.0 released

- Completed the connection with Tron

- Completed the connection with Polkadot

- Upgraded credential functions

- Upgraded asset score functions by including stablecoins into the calculation of asset score

- Upgraded asset score and credential tutorial

dApp

- 71 dApps now live on Ontology

- 6,065,687 dApp-related transactions since genesis block

- 14,320 dApp-related transactions in the past week

Bounty Program

- 1 new application for the technical documentation translation and 1 new application for SDK bounty

Community Growth

- We onboarded 753 new members across Ontology’s Vietnamese, Swedish, and Romanian communities.

Newly Released

- Wing, the first credit-integrated DeFi project based on the Ontology blockchain, released its first whitepaper. According to the whitepaper, Wing serves as a DeFi platform based on Ontology’s decentralized identity and credit evaluating systems. It supports cross-chain interaction between various DeFi products and decentralized governance and introduces a risk control mechanism that promotes a healthy relationship between borrowers, creditors, and guarantors, breaking the barriers with traditional finance through the incorporation of credit.

- Ontology’s digital assets, ONT and ONG, have completed mapping to the Ethereum blockchain. ONT and ONG are now listed on Uniswap to support all types of DeFi products on Ethereum. This move makes Ontology the first mainstream public blockchain to complete cross-chain communication with Ethereum. Digital assets on the Ethereum blockchain can now be swapped to the Ontology blockchain, as bi-directional cross-chain communication is enabled between Ontology and Ethereum.

- Ontology continued its performance testing with bloXroute, focusing on BDN performance during periods of slow internet. Ontology improved block propagation speed, block recovery speed, and transaction stream speed, and announced a partnership with Chainstack.

Global Events

- From September 2–4, Ontology co-organized the event ‘Cointelegraph China’s DeFi Marathon’. On September 2, the opening day of the event, professionals including DeFi project teams, high-tech individuals, senior investors, community enthusiasts, investment research organizations, and media personnel met at the Shanghai site to discuss DeFi topics and the future growth of decentralized finance. Jun LI, Founder of Ontology, addressed the audience with “Ontology: Empowering DeFi with Credit” and joined a panel discussion themed “Public Chains’ Choices amid DeFi Pressure”. He shared insights on why Ontology decided to get involved with DeFi, its technical upgrades and ongoing development, its new governance and staking economic model, and empowering DeFi with Wing, Ontology’s latest credit-based DeFi project. He added, “By empowering DeFi with credit, we are poised to build the next ‘Super-Oracle’”. In addition, during the event, Erick Pinos, Americas Ecosystem Lead, joined a discussion with the lead of Aave as well as other renowned projects.

- On September 4, Jun LI, Founder of Ontology, participated in a live AMA panel hosted by HyperPay on the topic of “Braving the DeFi-spurred Storm: Where Will Public Chains Head”. During the panel, he unveiled Ontology’s blueprint for DeFi, and briefed community users on the innovative and transformative features of Wing, the credit-based DeFi cross-chain platform.

- On September 3, Kendall MAO, Dean of Ontology Institute, discussed the dilemma posed by DeFi to public chains at a SheKnows live panel organized by 8BTCnews, along with representatives from Bytom Blockchain. Kendall mentioned, “Wing, Ontology’s latest cross-chain DeFi project, is integrated with the element of credit. We believe that a dynamic integration with the OScore system will create new paradigms for DeFi innovations, opening up new opportunities for Ontology.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 1–7) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Smarter with Gartner - IT

5 Questions to Cut Through the AI Security Hype

The hype around artificial intelligence (AI) has led to exaggerated expectations. For security leaders, the reality is that current AI technology, including machine learning (ML) techniques, can augment security capabilities. In the area of anomaly detection and security analytics, humans working with AI accomplish much more than without it. And while not risk-free, AI within security is more likely to create jobs rather than eliminate them.

However, simpler solutions can be as effective and cost less. And AI solutions for security can still be immature technologies. Given current technology restraints, AI should be an addition to existing security practices rather than a complete solution.


CISOs should ask these five questions before investing in the technology for their security programs:

1. What should CISOs and their teams know about AI?

One major challenge surrounding AI is the hype. Buzzwords like “next-generation” and “holistic approach” make big promises but most likely just mean “our latest release” and “multifunction.” Security and risk management (SRM) leaders and teams must be savvy about marketing and the myths that exist in the AI world.

Focus on the actual benefits of the technology rather than rely on vendor claims or assumptions. It is key that security teams understand the basics about AI to assess how the technology might reasonably help security strategy.

Read more: Gartner Top 9 Security and Risk Trends for 2020

2. What is AI’s impact on SRM?

The promise of AI technology is that it will process data and apply analytics much better than human teams. Improved automation and data analytics applied to security analytics and infrastructure protection offer to:

Find more attacks

Reduce false alerts

Perform faster detect-and-respond functions

The CISO should take the lead in establishing what the organization requires and how AI can assist in that. CISOs should also set reasonable expectations for what AI can realistically provide and select projects based on areas where AI can have the greatest impact.

Read more: Security Experts Must Connect Cybersecurity to Business Outcomes

3. What is the state of AI in security?

Recognize that the technology is not mature and continue to treat AI offerings as experimental, complementary controls. “AI as a feature” is applied on existing platforms across a variety of key initiatives, including:

Threat and anomaly detection
Identity analytics and fraud detection
Compliance and privacy risk management
Bot mitigation
Data discovery and categorization
Asset discovery
Policy automation
Security orchestration

4. What should CISOs ask vendors about AI security?

Although AI has a coolness factor, other existing solutions can achieve similar results. Understand the risks of a new solution and how the AI offering will outperform what the team is already using. Some questions for vendors include:

How can we view/control data used by the solution?
Does the solution send data outside of our organization (call home)?
What are the relevant security and performance metrics to measure the results from AI?
Are there peer reviews of the solution?
How much staff and time are required to maintain the solution?
How does your solution integrate into our enterprise workflow?
Does your solution integrate with third-party security solutions?

Depending on the answers, leaders may decide the costs and risks outweigh the benefits and decide to skip the extra expense.

Read more: How Security and Risk Leaders Can Prepare for Reduced Budgets

5. How does AI impact your workforce strategy?

AI might require additional roles or skill sets. Competition for these new skills is fierce, and finding “data security scientists” or “threat hunters” can be challenging. Because skills are constantly evolving, it can be more productive to focus on hiring people with trainable traits like digital dexterity, innovation and business acumen. Consider how to approach talent and skills gaps before purchase.

CISOs armed with the answers to these questions will be better prepared to decide whether and how to invest in AI.

The post 5 Questions to Cut Through the AI Security Hype appeared first on Smarter With Gartner.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Migrate Your ASP.NET Framework to ASP.NET Core with Okta

Ah, migration! Let’s say you have an ASP.NET application that has been running fine for years. You have kept up with the various .NET Framework updates and then suddenly you get told that you need to migrate to the latest and greatest, ASP.NET Core using .NET Core. .NET Core is the successor to the .NET Framework we’ve been using for years. It is open-source and supports cross-platform applications. To a veteran .NET developer it should look relatively similar to ASP.NET applications, but there are some differences. In this article, we will migrate an existing ASP.NET application that has an external auth provider - like Okta! - to ASP.NET Core.

Requirements

A computer with a .NET Framework and .NET Core-compatible operating system
A modern internet browser
An Okta Developer account
Your IDE of choice (I used Visual Studio 2019)
The .NET Core 3.1 SDK

Create your Okta application

For this application, you will use Okta for your authentication. The Okta.AspNetCore package makes implementing secure authentication in your application easier than ever. All you need to do is set up a new application on Okta’s developer website and configure the authentication in your project on startup, and Okta handles the rest.

Log in to your Okta Developer Console and click on Applications. Click on the Add Application button and select Web. On the next page, give your application a meaningful name. I named my application Conversion App. Change your URIs from localhost:8080 to localhost:3000.

Click Done at the bottom of the form and you will be redirected to the application’s settings page. Make sure to note your Client ID and Client secret as you will need these in your application.

Build an ASP.NET MVC Application to Migrate

Now it is time to create the ASP.NET application that we will eventually migrate to .NET Core. You can use an existing application, of course, but for the purpose of this tutorial we will create one from scratch. The application will consist of a home page and a dashboard page for users after they are authenticated. The dashboard will retrieve a list of users from a web service and provide a way to send emails. To authenticate users you will use Okta; Okta’s Okta.AspNet package makes using Okta for authentication as simple as a few lines of code.

Create an ASP.NET Framework MVC Application

Open Visual Studio and select New Project. Under the Web section select ASP.NET Web Application (.NET Framework) and give it a name. On the next screen select MVC and select OK. After the application is scaffolded, you can get to work.

Right-click on your project name and select Properties. Under the Web section set your Project Url to localhost:3000.

Install Okta’s ASP.NET library with the package command Install-Package Okta.AspNet -Version 1.5.0.

In your web.config file add the following entries to the <appSettings> node.

<add key="okta:OktaDomain" value="{yourOktaDomain}"/>
<add key="okta:ClientId" value="{yourClientId}"/>
<add key="okta:ClientSecret" value="{yourClientSecret}"/>
<add key="okta:RedirectUri" value="http://localhost:3000/authorization-code/callback"/>
<add key="okta:PostLogoutRedirectUri" value="Home/Index"/>

You should also add the SMTP configuration to your web.config. Add the code below and replace the asterisks with your SMTP configuration values.

<system.net>
  <mailSettings>
    <smtp from="*">
      <network enableSsl="true" host="*" port="*" userName="*" password="*"/>
    </smtp>
  </mailSettings>
</system.net>

Create your Controllers

Under your Controllers folder, add or update the following controllers: HomeController, AccountController, and DashboardController.

HomeController is the simplest. It has one action that returns a view called Index.

using System.Web.Mvc;

namespace ConversionApp.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            return View();
        }
    }
}

AccountController will contain your LogOff method that will log the user out of your website using Okta.

using System.Web;
using System.Web.Mvc;
using Microsoft.Owin.Security.Cookies;
using Okta.AspNet;

namespace ConversionApp.Controllers
{
    public class AccountController : Controller
    {
        [HttpPost]
        public ActionResult LogOff()
        {
            if (HttpContext.User.Identity.IsAuthenticated)
            {
                HttpContext.GetOwinContext().Authentication.SignOut(
                    CookieAuthenticationDefaults.AuthenticationType,
                    OktaDefaults.MvcAuthenticationType);
            }
            return RedirectToAction("Index", "Home");
        }
    }
}

Finally, DashboardController will have an Index action that returns a view, but it also has two more actions. One sends an email and the other gets a list of users from a mock web service.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using System.Net;
using System.Net.Mail;
using System.IO;
using Newtonsoft.Json;

namespace ConversionApp.Controllers
{
    [Authorize]
    public class DashboardController : Controller
    {
        // GET: Dashboard
        public ActionResult Index()
        {
            return View();
        }

        // Send a test email using the SMTP settings from web.config
        [HttpPost]
        public ActionResult SendEmail()
        {
            SmtpClient smtpClient = new SmtpClient();
            smtpClient.EnableSsl = true;

            MailMessage mail = new MailMessage();

            // Setting From and To
            mail.From = new MailAddress("nik@fishbowlllc.com", "Nickolas Fisher");
            mail.To.Add(new MailAddress("nik@fishbowlllc.com", "Nickolas Fisher"));

            mail.Subject = "Hello From the Conversion App";
            mail.Body = "this is my test email";

            smtpClient.Send(mail);

            return new JsonResult() { Data = true };
        }

        // Fetch a list of users from a mock web service
        public ActionResult GetUsers()
        {
            WebRequest request = WebRequest.Create("https://jsonplaceholder.typicode.com/users");
            request.Method = "GET";

            WebResponse response = request.GetResponse();
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                var text = reader.ReadToEnd();
                var userList = JsonConvert.DeserializeObject<List<Models.UserModel>>(text);
                return new JsonResult()
                {
                    Data = userList,
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }
        }
    }
}

A model called UserModel is added to the Models folder and contains the relevant data from the web service call to send down to your client.

namespace ConversionApp.Models
{
    public class UserModel
    {
        public int id { get; set; }
        public string name { get; set; }
        public string username { get; set; }
        public string email { get; set; }
        public string website { get; set; }
    }
}

Add your Views

There are four views you need to add or edit. The first is the Shared/_Layout.cshtml file. Most of this is the automated code Visual Studio created.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>@ViewBag.Title - Conversion - An Okta Demo</title>
    @Styles.Render("~/Content/css")
    @Scripts.Render("~/bundles/modernizr")
</head>
<body>
    <div class="navbar navbar-inverse navbar-fixed-top">
        <div class="container">
            <div class="navbar-header">
                <button type="button" class="navbar-toggle" data-toggle="collapse" data-target=".navbar-collapse">
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                </button>
                @Html.ActionLink("Conversion", "Index", "Home", new { area = "" }, new { @class = "navbar-brand" })
            </div>
            <div class="navbar-collapse collapse">
                <ul class="nav navbar-nav">
                    <li>@Html.ActionLink("Home", "Index", "Home")</li>
                </ul>
                @Html.Partial("_LoginPartial")
            </div>
        </div>
    </div>
    <div class="container body-content">
        @RenderBody()
        <hr />
        <footer>
            <p>&copy; @DateTime.Now.Year - Conversion by <a href="https://profile.fishbowlllc.com/" rel="noreferrer" target="_blank">Nik Fisher.</a></p>
        </footer>
    </div>
    @Scripts.Render("~/bundles/jquery")
    @Scripts.Render("~/bundles/bootstrap")
    @RenderSection("scripts", required: false)
</body>
</html>

Next, you can edit the Shared/_LoginPartial.cshtml file. This partial view will display a login button to an authenticated user, and a logout button if the user is logged in. You do not need the register button that Visual Studio adds in, since Okta is handling the authentication.

@if (Request.IsAuthenticated)
{
    using (Html.BeginForm("LogOff", "Account", FormMethod.Post, new { id = "logoutForm", @class = "navbar-right" }))
    {
        <ul class="nav navbar-nav navbar-right">
            <li>Hello, @User.Identity.Name!</li>
            <li><a href="javascript:document.getElementById('logoutForm').submit()">Log off</a></li>
        </ul>
    }
}
else
{
    <ul class="nav navbar-nav navbar-right">
        <li>@Html.ActionLink("Log in", "Index", "Dashboard", routeValues: null, htmlAttributes: new { id = "loginLink" })</li>
    </ul>
}

Next, you will edit the Home/Index.cshtml page. Again, Visual Studio includes some stock marketing for ASP.NET here. You can just replace it with some information about your site.

@{
    ViewBag.Title = "Home Page";
}

<div class="jumbotron">
    <h1>Conversion</h1>
    <p class="lead">A demo using .NET Framework, .NET Core, and Okta</p>
    <p><a href="https://developer.okta.com/" class="btn btn-primary btn-lg">Learn more &raquo;</a></p>
</div>
<div class="row">
    <div class="col-md-4">
        <h2>Purpose</h2>
        <p>Learn how to convert a .NET Framework MVC web application using Okta to a .NET Core application.</p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301865">Learn more &raquo;</a></p>
    </div>
    <div class="col-md-4">
        <h2>Okta</h2>
        <p>Powerful Single Sign-On provider</p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301866">Learn more &raquo;</a></p>
    </div>
    <div class="col-md-4">
        <h2>Author</h2>
        <p>Written by Nik Fisher</p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301867">Learn more &raquo;</a></p>
    </div>
</div>

Finally, add the dashboard page where most of your functionality lives. This page will populate a table asynchronously with data that is fetched from the mock API. There is also some Javascript for sending the email.

@{
    ViewBag.Title = "Index";
}

<div class="row mb-2">
    <div class="alert alert-dismissable">
        <button class="btn btn-primary" onclick="sendEmail()">Send Email</button>
    </div>
</div>
<table class="table table-striped">
    <thead>
        <tr>
            <th>Name</th>
            <th>Username</th>
            <th>Email</th>
            <th>Website</th>
        </tr>
    </thead>
    <tbody></tbody>
</table>

@section scripts {
    <script>
        $(document).ready(function () {
            $.get('@Url.Action("GetUsers", "Dashboard")', function (data) {
                $.each(data, function (key, value) {
                    var row = '<tr><td>' + value.name + '</td><td>' + value.username + '</td><td>' + value.email + '</td><td>' + value.website + '</td></tr>';
                    $('table').append(row);
                });
            });
        });

        function sendEmail() {
            $.post('@Url.Action("SendEmail", "Dashboard")', function (data) {
                console.log(data);
            })
        }
    </script>
}

Set up the Startup.cs file

To allow Okta to work with your ASP.NET application you will need to set it up in the App_Start/Startup.Auth.cs file.

using Microsoft.Owin.Security.Cookies;
using Owin;
using Okta.AspNet;
using Microsoft.Owin.Security;
using System.Configuration;

namespace ConversionApp
{
    public partial class Startup
    {
        // For more information on configuring authentication, please visit https://go.microsoft.com/fwlink/?LinkId=301864
        public void ConfigureAuth(IAppBuilder app)
        {
            app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);
            app.UseCookieAuthentication(new CookieAuthenticationOptions());
            app.UseOktaMvc(new OktaMvcOptions
            {
                OktaDomain = ConfigurationManager.AppSettings["okta:OktaDomain"],
                ClientId = ConfigurationManager.AppSettings["okta:ClientId"],
                ClientSecret = ConfigurationManager.AppSettings["okta:ClientSecret"],
                AuthorizationServerId = "default",
                RedirectUri = ConfigurationManager.AppSettings["okta:RedirectUri"],
                PostLogoutRedirectUri = ConfigurationManager.AppSettings["okta:PostLogoutRedirectUri"]
            });
        }
    }
}

Migrate your .NET Framework App to .NET Core

Your ASP.NET sample application is complete and you are ready to migrate to ASP.NET Core!

Open Visual Studio and create a new project. This time, select ASP.NET Core Web Application. Give it a name and click Create. For this, select No Authentication and Web Application (Model-View-Controller). You can also uncheck Configure for HTTPS as this won’t be necessary immediately.

You will want to set your App URL to localhost:3000 as you did with the ASP.NET application.

Install dependencies

As with any project, start by installing the dependencies. For this, you will use MailKit and Okta.AspNetCore. With the release of .NET Core, Microsoft now recommends against the System.Net.Mail classes you may be familiar with; MailKit is the preferred library for sending email according to their documentation. Okta.AspNetCore is the .NET Core version of Okta’s authentication library. There are minor changes when migrating to this package, but it is just as easy to use as the .NET Framework version.

Install-Package Okta.AspNetCore -Version 3.1.1
Install-Package MailKit -Version 2.8.0

Configure appsettings.json

Chances are you are familiar with the web.config that ASP.NET has used for years. With .NET Core, web.config is no longer used. Instead, you will use appsettings.json for any of your configuration settings. By default, Visual Studio will add both appsettings.json and appsettings.Development.json files. If you decide to add new appsettings files, such as appsettings.Production.json you can change the ASPNETCORE_ENVIRONMENT variable to Production to use that file.

For this application, you can use appsettings.json. Add the following code to it. Again, you will need to replace the SMTP values with your configuration settings.

{ "Logging": { "LogLevel": { "Default": "Information", "Microsoft": "Warning", "Microsoft.Hosting.Lifetime": "Information" } }, "AllowedHosts": "*", "Okta": { "OktaDomain": "{yourOktaDomain}", "ClientId": "{yourClientId}", "ClientSecret": "{yourClientSecret}" }, "Smtp": { "Host": "", "Port": 0, "Username": "", "Password": "" } } Startup.cs for .NET Core

Another core element of ASP.NET that is gone is the global.asax file. If you used OWIN in the past, you are familiar with the Startup.cs file. In ASP.NET Core the Startup file is the main way you set up your application. Here you will register middleware, leverage .NET Core’s built-in dependency injection, and configure your application.

Before you start that, create a new folder called Settings, and in it, add a file called SmtpSettings.cs. Add the following code to it.

namespace Okta_Conversion_Core.Settings
{
    public class SmtpSettings
    {
        public string Host { get; set; }
        public int Port { get; set; }
        public string Username { get; set; }
        public string Password { get; set; }
    }
}

Now open your Startup.cs file and add the following code to it.

using System.Collections.Generic;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Okta.AspNetCore;
using Microsoft.AspNetCore.Authentication.Cookies;

namespace Okta_Conversion_Core
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        public IConfiguration Configuration { get; }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddControllersWithViews();

            services.Configure<Settings.SmtpSettings>(Configuration.GetSection("Smtp"));

            var oktaMvcOptions = new OktaMvcOptions()
            {
                OktaDomain = Configuration.GetSection("Okta").GetValue<string>("OktaDomain"),
                ClientId = Configuration.GetSection("Okta").GetValue<string>("ClientId"),
                ClientSecret = Configuration.GetSection("Okta").GetValue<string>("ClientSecret"),
                Scope = new List<string> { "openid", "profile", "email" },
            };

            services.AddAuthentication(options =>
            {
                options.DefaultAuthenticateScheme = CookieAuthenticationDefaults.AuthenticationScheme;
                options.DefaultSignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
                options.DefaultChallengeScheme = OktaDefaults.MvcAuthenticationScheme;
            })
            .AddCookie()
            .AddOktaMvc(oktaMvcOptions);

            services.AddMvc();
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            else
            {
                app.UseExceptionHandler("/Home/Error");
            }

            app.UseStaticFiles();
            app.UseRouting();

            app.UseAuthentication();
            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllerRoute(
                    name: "default",
                    pattern: "{controller=Home}/{action=Index}/{id?}");
            });
        }
    }
}

Most of this code is boilerplate from selecting MVC as your project type during project creation. You will need to add app.UseAuthentication() to your Configure method. In ConfigureServices you add the setup logic for Okta. With these two pieces in place, you can use the Authorize attribute on your actions or controllers just as you did in ASP.NET.

You also added a line for configuring the SmtpSettings as a service in your application. This line will pull the Smtp section from your appsettings file and bind it to the selected model. Then it will make the configuration available using the IOptions Pattern in .NET Core.

Migrate controllers

You can now begin the process of migrating your controllers. First is the HomeController, which is also the simplest in your application. This controller will look exactly as it did in ASP.NET, except that the using directive points at the new Microsoft.AspNetCore.Mvc namespace.

using Microsoft.AspNetCore.Mvc;

namespace Okta_Conversion_Core.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            return View();
        }
    }
}

Next, you can implement your AccountController. In ASP.NET Core you will add a SignIn action that returns a Challenge if the user isn't authenticated. The internal logic for this is contained in the Okta ASP.NET Core library. You also have the LogOff action, which looks a little different than the classic ASP.NET version but is still simple to implement.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Authentication.Cookies;
using Okta.AspNetCore;

namespace Okta_Conversion_Core.Controllers
{
    public class AccountController : Controller
    {
        public IActionResult SignIn()
        {
            if (!HttpContext.User.Identity.IsAuthenticated)
            {
                return Challenge(OktaDefaults.MvcAuthenticationScheme);
            }

            return RedirectToAction("Index", "Home");
        }

        [HttpPost]
        [Authorize]
        public ActionResult LogOff()
        {
            return new SignOutResult(
                new[]
                {
                    OktaDefaults.MvcAuthenticationScheme,
                    CookieAuthenticationDefaults.AuthenticationScheme,
                },
                new AuthenticationProperties { RedirectUri = "/Home/" });
        }
    }
}

Finally, you will need to add your DashboardController. Since this is where the bulk of your logic lives, it's worth taking a little extra time to go over it. Create the DashboardController and add the following code to it.

using System.Collections.Generic;
using System.IO;
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Authorization;
using Microsoft.Extensions.Options;
using MailKit.Net.Smtp;
using MimeKit;
using Newtonsoft.Json;

namespace Okta_Conversion_Core.Controllers
{
    [Authorize]
    public class DashboardController : Controller
    {
        IOptions<Settings.SmtpSettings> _smtpSettings;

        public DashboardController(IOptions<Settings.SmtpSettings> smtpSettings)
        {
            _smtpSettings = smtpSettings;
        }

        // GET: Dashboard
        public ActionResult Index()
        {
            return View();
        }

        // Send email
        [HttpPost]
        public ActionResult SendEmail()
        {
            SmtpClient smtpClient = new SmtpClient();

            MimeMessage message = new MimeMessage();
            MailboxAddress to = new MailboxAddress("Nik Fisher", "nik@fishbowlllc.com");
            MailboxAddress from = new MailboxAddress("Nik Fisher", "nik@fishbowlllc.com");

            message.To.Add(to);
            message.From.Add(from);
            message.Subject = "Hello From the Conversion App";

            BodyBuilder bodyBuilder = new BodyBuilder();
            bodyBuilder.HtmlBody = "<p>this is my test email</p>";
            bodyBuilder.TextBody = "this is my test email";
            message.Body = bodyBuilder.ToMessageBody(); // attach the body built above (easy to miss)

            smtpClient.Connect(_smtpSettings.Value.Host, _smtpSettings.Value.Port, true);
            smtpClient.Authenticate(_smtpSettings.Value.Username, _smtpSettings.Value.Password);
            smtpClient.Send(message);
            smtpClient.Disconnect(true);
            smtpClient.Dispose();

            return new JsonResult(true);
        }

        public ActionResult GetUsers()
        {
            WebRequest request = WebRequest.Create("https://jsonplaceholder.typicode.com/users");
            request.Method = "GET";
            WebResponse response = request.GetResponse();

            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                var text = reader.ReadToEnd();
                var userList = JsonConvert.DeserializeObject<List<Models.UserModel>>(text);
                return new JsonResult(userList);
            }
        }
    }
}

First, the entire controller is under the [Authorize] attribute. This is the same as it was in ASP.NET.

You make use of the IOptions pattern by injecting the SmtpSettings into the controller. Now this controller will have access to the configuration you set up in your appsettings file.

The SendEmail action is considerably different. Microsoft has deprecated the System.Net.Mail classes that were used to send email in ASP.NET, and it now recommends MailKit instead. Fortunately, MailKit is simple to use: if you are familiar with System.Net.Mail, the MailKit implementation will feel very similar. One detail that is easy to miss is assigning the message body with message.Body = bodyBuilder.ToMessageBody(), as shown in the controller above.
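For comparison, the old ASP.NET action might have looked roughly like the following sketch built on the deprecated System.Net.Mail API. The host, port, and credentials are placeholders, not values from the original project.

using System.Net;
using System.Net.Mail;

// Rough equivalent of SendEmail using the deprecated System.Net.Mail API.
var message = new MailMessage("nik@fishbowlllc.com", "nik@fishbowlllc.com")
{
    Subject = "Hello From the Conversion App",
    Body = "this is my test email"
};

using (var client = new SmtpClient("smtp.example.com", 587)) // placeholder host and port
{
    client.Credentials = new NetworkCredential("username", "password"); // placeholder credentials
    client.EnableSsl = true;
    client.Send(message);
}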

The last thing to notice here is that JsonResult now has a constructor that takes a parameter containing your data. In ASP.NET you had to set the data manually after calling the constructor. The new syntax makes for cleaner code and allows you to inject the data right into the JsonResult object.
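To make the difference concrete, here is a minimal side-by-side sketch. The userList variable is assumed to hold your data, and the two halves target different frameworks, so they would not compile in the same project.

// ASP.NET (System.Web.Mvc): the data is assigned after construction.
var oldResult = new JsonResult();
oldResult.Data = userList;
oldResult.JsonRequestBehavior = JsonRequestBehavior.AllowGet;

// ASP.NET Core (Microsoft.AspNetCore.Mvc): the data goes straight into the constructor.
var newResult = new JsonResult(userList);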

Migrate Your Views to .NET Core

Finally, it’s time to migrate your views. There are only a few changes here. To start, Dashboard/Index.cshtml and Home/Index.cshtml are the same as they were in the ASP.NET project. The nice thing is Razor hasn’t changed much.

For reference see below for the Home/Index page.

@{
    ViewBag.Title = "Home Page";
}

<div class="jumbotron">
    <h1>Conversion</h1>
    <p class="lead">A demo using .NET Framework, .NET Core, and Okta</p>
    <p><a href="https://developer.okta.com/" class="btn btn-primary btn-lg">Learn more &raquo;</a></p>
</div>

<div class="row">
    <div class="col-md-4">
        <h2>Purpose</h2>
        <p>
            Learn how to convert a .NET Framework MVC web application using Okta to a .NET Core application.
        </p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301865">Learn more &raquo;</a></p>
    </div>
    <div class="col-md-4">
        <h2>Okta</h2>
        <p>Powerful Single Sign-On provider</p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301866">Learn more &raquo;</a></p>
    </div>
    <div class="col-md-4">
        <h2>Author</h2>
        <p>Written by Nik Fisher</p>
        <p><a class="btn btn-default" href="https://go.microsoft.com/fwlink/?LinkId=301867">Learn more &raquo;</a></p>
    </div>
</div>

And now you can add your code for the Dashboard/Index page.

@{
    ViewBag.Title = "Index";
}

<div class="row mb-2">
    <div class="alert alert-dismissable">
        <button class="btn btn-primary" onclick="sendEmail()">Send Email</button>
    </div>
</div>

<table class="table table-striped">
    <thead>
        <tr>
            <th>Name</th>
            <th>Username</th>
            <th>Email</th>
            <th>Website</th>
        </tr>
    </thead>
    <tbody></tbody>
</table>

@section scripts {
    <script>
        $(document).ready(function () {
            $.get('@Url.Action("GetUsers", "Dashboard")', function (data) {
                $.each(data, function (key, value) {
                    var row = '<tr><td>' + value.name + '</td><td>' + value.username + '</td><td>'
                        + value.email + '</td><td>' + value.website + '</td></tr>';
                    $('table').append(row);
                });
            });
        });

        function sendEmail() {
            $.post('@Url.Action("SendEmail", "Dashboard")', function (data) {
                console.log(data);
            });
        }
    </script>
}

Next, you can update your Shared/_Layout.cshtml file. There are a couple of small changes here, because the ASP.NET application was written against Bootstrap 3 while this application ships with Bootstrap 4.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>@ViewData["Title"] - Okta_Conversion_Core</title>
    <link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" />
    <link rel="stylesheet" href="~/css/site.css" />
</head>
<body>
    <div class="navbar navbar-inverse navbar-fixed-top">
        <div class="container">
            <div class="navbar-header">
                <button type="button" class="navbar-toggler" data-toggle="collapse" data-target=".navbar-collapse">
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                </button>
                @Html.ActionLink("Conversion", "Index", "Home", new { area = "" }, new { @class = "navbar-brand" })
            </div>
            <div class="navbar-collapse collapse">
                <ul class="nav navbar-nav">
                    <li>@Html.ActionLink("Home", "Index", "Home")</li>
                </ul>
            </div>
            @Html.Partial("_LoginPartial")
        </div>
    </div>
    <div class="container body-content">
        @RenderBody()
        <hr />
        <footer>
            <p>&copy; @DateTime.Now.Year - Conversion by <a href="https://profile.fishbowlllc.com/" rel="noreferrer" target="_blank">Nik Fisher.</a></p>
        </footer>
    </div>

    <script src="~/lib/jquery/dist/jquery.min.js"></script>
    <script src="~/lib/bootstrap/dist/js/bootstrap.bundle.min.js"></script>
    <script src="~/js/site.js" asp-append-version="true"></script>
    @RenderSection("Scripts", required: false)
</body>
</html>

Finally, you can update your Shared/_LoginPartial.cshtml with the code below. It is very similar to the ASP.NET code, but there are a couple of changes to be aware of. First, to check whether the user is authenticated, you check the Context object rather than the Request object. Also, note that you are using the new TagHelpers.AnchorTagHelper class to build the anchor link for signing in: you declare the controller and action as attributes of the anchor rather than manually typing in a URL.

@if (Context.User.Identity.IsAuthenticated)
{
    using (Html.BeginForm("LogOff", "Account", FormMethod.Post, new { id = "logoutForm", @class = "navbar-right" }))
    {
        <ul class="nav navbar-nav navbar-right">
            <li>
                Hello @User.Identity.Name!
            </li>
            <li><a href="javascript:document.getElementById('logoutForm').submit()">Log off</a></li>
        </ul>
    }
}
else
{
    <ul class="nav navbar-nav navbar-right">
        <a asp-controller="Account" asp-action="SignIn">Sign In</a>
    </ul>
}

Wrap up

At this point, it’s recommended that you test all your actions and make sure they are still working properly. Between the upgrade path provided by Microsoft and the ease of using Okta, converting ASP.NET applications to ASP.NET Core can be a painless process.

Learn more about ASP.NET Core and Okta

ASP.NET Core 3.0 MVC Secure Authentication
Build a Simple .NET Core App on Docker
Build a Basic CRUD App with ASP.NET Core 3.0 and MongoDB

Tuesday, 08. September 2020

Smarter with Gartner - IT

5 Emerging Technologies Explained by Gartner Experts

Bidirectional brain machine interface, generative artificial intelligence (AI) and DNA computing are a few examples of the technologies highlighted on the Gartner Hype Cycle for Emerging Technologies, 2020. Although each of these may sound like a plotline from the latest Hollywood blockbuster, Gartner experts expect these emerging technologies and their corresponding trends to have a transformational impact on business in the next five to 10 years.

Kasey Panetta, Gartner Senior Content Marketing Manager, interviews Gartner experts to talk through the process of forming the Emerging Technologies Hype Cycle and related technologies.

2-part interview

This interview was conducted during a two-part podcast series. Both podcast episodes are available below; the transcript that follows has been edited for clarity and length.

Episode 1 (15 mins):
Brian Burke, Research VP, on the Hype Cycle (00:50)
Yefim Natis, Distinguished VP Analyst, on composable enterprises (6:53)
Avivah Litan, Distinguished VP Analyst, on authenticated provenance (9:00)

Episode 2 (30 mins):
Nick Heudecker, VP Analyst, on DNA computing and storage (3:10)
Svetlana Sicular, VP Analyst, on generative AI (9:09)
Sylvain Fabre, Senior Director Analyst, on bidirectional brain machine interface (20:23)

The Emerging Technologies Hype Cycle Explained – Brian Burke

What is the Gartner Emerging Technologies Hype Cycle, and what makes it different from other Hype Cycles?

The Hype Cycle for Emerging Technologies is unique among Gartner Hype Cycles because we really look at all of the technologies on all of the Hype Cycles. So that's 1,700 technology profiles. And then we distill that down into a set of 30 or so technology profiles that we believe will be most impactful for organizations over the next five to 10 years.

Read more: 6 Trends on the Gartner Hype Cycle for the Digital Workplace, 2020

And how do you get from 1,700 technologies down to a list of 30?

It takes a couple of months, but we start by looking at all the technology profiles that we're creating and we create a shortlist of technologies that we believe will be the most impactful. We go from about 1,700 to about 150, and then we have a broader group of analysts who actually vote on those technology profiles. The top 30 are selected during the voting process.

We also have an algorithm that's applied to the scoring, which basically considers whether a technology is new to all Hype Cycles. If so, that technology will get a few points extra. If the technology existed on any of the previous year’s Hype Cycles, it loses some points.

This is to combat the fact that in the past we had technologies that hung around on the Hype Cycle for years and years. For example, smart dust, which was a technology that was on the Hype Cycle as a perennial favorite for six years. This approach ensures that we're having a fresher view on the Hype Cycle. This is especially important given that we have limited real estate.

What are this year’s trends?

Composite architectures
Algorithmic trust
Beyond silicon
Formative AI
Digital me


Composite architectures: Composable enterprises – Yefim Natis

What are composite architectures and why do they matter?

A composite architecture is made up of packaged business capabilities, built on a flexible data fabric. Basically this enables an enterprise to respond really rapidly to changing business needs.

The ultimate benefit of composable thinking, composable architecture, and composable enterprise technology is that the organization unifies its resources. Composable enterprises bring business expertise and technology expertise together to reengineer decision making and shift the policies and structures of their organizations from a focus on stability to a focus on agility and continuous change.

And why is this technology featured on the Hype Cycle?

Every organization today is seeking greater resilience, greater responsiveness to change, greater ability to integrate, and greater involvement of business and of IT together in making strategic, technology and business decisions. Composable enterprise promises to significantly improve each one of these capabilities of a modern enterprise. So it's no surprise that composable enterprise generates a lot of interest, a lot of hype, promise and investment from vendors and, increasingly, from users as well.

Algorithmic trust: Authenticated provenance – Avivah Litan

What is authenticated provenance?

Authenticated provenance is part of algorithmic trust. Basically, what it does is authenticate the origin of something. Algorithmic trust applies to the whole life cycle; authenticated provenance asks how you know something is real and valid when it is created. You can use many different methods to authenticate provenance.

One method is humans. You can have regulators go and look at the wheat field and say, ‘Yes, this is definitely organic wheat’, but that doesn't scale very well. The second way is to use an AI model that distinguishes organic wheat from nonorganic wheat by looking at the composition, biology, or DNA of the wheat itself.

The third way you can tell that something is authentic is through certifying at the point of origin, using some technique that's relevant for that domain. So let's take a pharmaceutical, a drug that's manufactured in a plant. As soon as it's signed off by the QA process in the factory, that data is locked in, and now you have a record of that pharmaceutical drug provenance that you can track until the time someone takes the drug.

This feels really relevant to the current state of the world. Is that why it's featured this year?

The reason this technology is featured now is because it's so needed in our digital world. You can't trust anything anymore. And I know that sounds very extreme, but it's actually true. There's so much ability to insert fakes and counterfeits into processes, whether it's manufacturing or content, that we need to be able to trust the source and trust the provenance. There's also a bigger demand from consumers to know that things are trustworthy, so the need for an authenticated provenance is stronger today than it's ever been in our history.

Beyond silicon: DNA computing and storage – Nick Heudecker

What is DNA computing, and how does it work?

DNA computing plays into the beyond silicon trend because it introduces a brand-new computing substrate instead of silicon. It uses molecules, and the reactions between those molecules, not just to store data but also to provide a new way to process it.

Storing data in DNA sounds hopelessly complex, but the technologies are well-established and understood. First, the digital content is compressed and mapped to the four nucleotides in DNA (adenine, thymine, guanine and cytosine, or “ATGC”). Because there are four nucleotides, each nucleotide can represent two digital bits. These nucleotide codes are used to create matching synthetic DNA, which is then replicated and stored in DNA strands. Those strands are then “amplified,” or copied millions of times, to make reading the data easier when material is extracted from its storage container.

When the data needs to be read, the opposite process occurs. The DNA strands are prepared and sequenced back into nucleotide codes, which are then converted back into digital content.
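To make the mapping concrete, here is a toy C# sketch of the two-bits-per-nucleotide encoding described above. The bit-to-base assignment is arbitrary and purely illustrative; real DNA storage systems use far more sophisticated codes with error correction and synthesis constraints.

using System;
using System.Text;

class DnaCodecSketch
{
    // Illustrative mapping only: 00 -> A, 01 -> T, 10 -> G, 11 -> C.
    static readonly char[] Bases = { 'A', 'T', 'G', 'C' };

    // Encode each byte as four nucleotides (two bits per base).
    static string Encode(byte[] data)
    {
        var sb = new StringBuilder(data.Length * 4);
        foreach (byte b in data)
            for (int shift = 6; shift >= 0; shift -= 2)
                sb.Append(Bases[(b >> shift) & 0b11]);
        return sb.ToString();
    }

    // Reverse the mapping: four nucleotides back into one byte.
    static byte[] Decode(string strand)
    {
        var bytes = new byte[strand.Length / 4];
        for (int i = 0; i < strand.Length; i++)
            bytes[i / 4] = (byte)((bytes[i / 4] << 2) | Array.IndexOf(Bases, strand[i]));
        return bytes;
    }

    static void Main()
    {
        string strand = Encode(Encoding.UTF8.GetBytes("Hi"));
        Console.WriteLine(strand);                                  // TAGATGGT
        Console.WriteLine(Encoding.UTF8.GetString(Decode(strand))); // Hi
    }
}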

With digital data represented as DNA, the next step is introducing a processing mechanism to create a full DNA computing environment. While DNA computing is still a highly experimental domain, enzymatic processing is gaining prominence.

Enzymatic processing uses enzymes, which are proteins that act as catalysts, to perform a logical operation on a collection of DNA. This mechanism is inspired by how DNA is replicated and error-checked in organisms. Custom-designed enzymes can take the form of “logic gates” that process data and create new DNA strands as output, which can then be read by a DNA sequencer. Recent experiments have used enzymatic processing to perform machine learning over data represented as DNA.
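The "logic gate" idea can be sketched very loosely in code. In the toy model below, a gate fires only when both of its input strands are present in the pool and then adds an output strand. This is a schematic illustration of the concept only, not a model of real enzyme chemistry.

using System;
using System.Collections.Generic;

class EnzymaticGateSketch
{
    // A "gate" fires only when both of its input strands are present in the pool,
    // producing a new output strand -- loosely analogous to an enzymatic AND gate.
    static void AndGate(HashSet<string> pool, string inputA, string inputB, string output)
    {
        if (pool.Contains(inputA) && pool.Contains(inputB))
            pool.Add(output);
    }

    static void Main()
    {
        var pool = new HashSet<string> { "ATGC", "CGTA" }; // strands present in the sample
        AndGate(pool, "ATGC", "CGTA", "TTAA");             // both inputs present -> output strand created
        Console.WriteLine(pool.Contains("TTAA"));          // True
    }
}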

From a resiliency and storage density perspective, nothing beats DNA. Properly stored, DNA can last for at least 500 years. And a gram of DNA can store over 200PB of data. Another advantage of DNA is it's never going to go out of style. We are made from it. Unlike other technologies that might be fads or become incredibly difficult to maintain, DNA is pretty straightforward. And the technologies that synthesize it and the technologies that sequence it are well-understood and falling in price every day, making it much more approachable.

How might this be used today? 

You might see DNA computing in any industry that has a massive amount of data. A good example is CERN with the Large Hadron Collider. They collect petabytes of data every year. Storing that on magnetic tape is incredibly expensive. It takes a lot of room, and they can only store it for about 10 years before they have to move it to fresh tape. Other use cases include storing national archives, scientific endeavors producing large amounts of data like astronomy, or industries like oil and gas.

But that's only half the story — you also have to be able to process that data. And this is one of the real advantages of DNA computing. You can have millions of copies of a given dataset, and you can replicate it very cheaply. Once you have that data represented millions of times, you can introduce enzymes into that pool of DNA strands, and using enzymatic reactions, it will do whatever kind of computing you might want to do. Viable DNA processing is several years away, but the possibilities are fascinating.

Where is the technology in terms of market adoption?

DNA computing is at a very early stage. We've seen some early investments from large and small technology vendors. A lot of research is happening at universities, but it is very early. I think we'll see DNA storage as a viable option within three to five years, likely in a cloud infrastructure scenario. And then DNA computing will take longer to develop. I predict that's going to happen within eight to 10 years.


Formative AI: Generative AI – Svetlana Sicular

What is generative AI?

Generative AI is not a single technology but a variety of machine learning methods that learn a representation of artifacts from data and use it to generate brand-new, completely original, realistic artifacts. Those artifacts preserve a likeness to the training data but do not repeat it. Generative AI can produce novel content such as images, video, music, speech, text, and even materials, and all of these can be produced in combination. It can improve or alter existing content, and it can create new data elements or data itself.

What are the downsides of generative AI?

Generative AI has gained a partly negative reputation because of deepfakes. If AI can generate a face, text, or video, it could be used to compromise someone for political or blackmail purposes. We've already seen the first case of a generated voice being used to embezzle money: a voice of a CEO was generated and used to request the quick transfer of a large sum of money. But we cannot ignore the pluses, such as generative technology being used to predict how some conditions, like arthritis, will develop in the next three years.

Digital me: Bidirectional brain machine interface – Sylvain Fabre

What does a bidirectional brain machine interface do?

Bidirectional brain machine interfaces can turn the human brain into an Internet of Things (IoT) device. It's an interface where you can record brain activity over time and guess or infer someone's mood or emotional state. We call it bidirectional because you can also write to the brain, just as you would write to a memory device or a computer, sending currents to the brain or removing them.

One early application is sending currents to change people's moods. In China, for example, experiments have begun that monitor whether workers are becoming angry or agitated. So it's basically reading the mental state of the individual, as well as potentially changing it.

Can you share an instance of bidirectional machine interface in practice?

In terms of applications, early examples in wellness and fitness involve monitoring recorded brain activity. Another example is professional driver safety, with the detection of microsleeps. You can monitor employees' stress and wellness. We've also seen early examples of controlling machines for medical applications, for example for people with paralysis, who could use the brain to control an exoskeleton.

There could also be some outcomes that are not positive for the individual. For example, antidepressants today are used mostly in chemical form; antidepressant waves dispensed via a bidirectional brain machine interface could be used to make people more pliable. You could have addiction issues where people get accustomed to sending pleasure-inducing pulses via their brain machine interfaces. So there are some dark aspects that need to be monitored.

We looked at the investments from venture capitalists, which gives us a sense of what has been prioritized with bidirectional brain machine interfaces. We found nothing about security or privacy, which is a bit of a concern. You have this great potential for positive use cases, together with a nonnegligible risk both for personal data and corporate information privacy and security, as well as a risk of physical harm to the users. These risks need to be addressed to protect individuals and corporations.

This technology has a very sci-fi feel to it, but just how far out is this reality?

Beyond research in the lab, there are early products that are noninvasive. We think the next step will be more invasive variants, where people might choose to do this on their own for an advantage in sports or at work or in school. That's where the “bring your own” and shadow aspects of this would be a significant concern for corporate CIOs.

Our own assumption at the moment in terms of planning is that by 2025, employees experimenting with bidirectional brain machine interfaces would cause at least one major corporate data security outage. And we think by 2030, about 5% of employees in North America will use some form of bidirectional brain machine interface.

For example, teachers, nurses, or drivers could be monitored for alertness and their ability to be positive at work, and may be required to opt in to brain wave management, for example to boost alertness or cognition. And again, some of that would be through the employee bringing their own device, and some of that would be corporate. This raises issues of consent, data privacy, and security.


The post 5 Emerging Technologies Explained by Gartner Experts appeared first on Smarter With Gartner.


Forgerock Blog

The Passwordless Enterprise Era

How ForgeRock and Secret Double Octopus Are Paving the Way for a Passwordless User Journey    

We’re living in a world where managing digital identities is becoming an increasingly complex and tedious task. Every organization must deal with multiple accounts and credentials for users, employees, and devices. Sometimes, these siloed identities can span across dozens or hundreds of locations, and number in the thousands – or even millions. All this chaos is accelerating the adoption of passwordless technologies. 

With all these moving parts, it can be extremely difficult to secure company information – and this results in a frustrating experience for both users and IT teams. With the infrastructure inside many organizations becoming increasingly fragmented across different servers, cloud services, and online platforms, the identity and access management problem becomes even more complicated. 

This is where passwordless authentication can help. The ForgeRock Identity Platform enables fully password-free user journeys out of the box, and with technology partner Secret Double Octopus, the experience can be extended to the user's workstation authentication. This frees employees and administrators from the pain of remembering and managing passwords throughout the enterprise.

The Growing Challenge of Enterprise Identity and Access Management

Companies often struggle to set up identity and access management (IAM) solutions in a secure, easy-to-use, scalable, and future-proof way. Unfortunately, many organizations end up outsourcing this task to expensive integration specialists to make complex systems work together and to maintain these integrations over time. Either way, organizations gradually get stuck with overly complicated systems that are costly, create unnecessary risk, and can’t scale with their growing needs.

The mounting challenge of identity management has spurred collaboration among different vendors to create scalable, integrated solutions that provide robust security and easily integrate with the different on-premises and cloud-based solutions that the enterprise has already invested in. These efforts have become even more important as the COVID-19 pandemic has driven many companies to adopt work-from-home models, making them even more dependent on reliable and scalable digital infrastructure.

 The addition of Secret Double Octopus’ technology to the ForgeRock Trust Network extends the reach of Intelligent Authentication to the desktop login experience and provides passwordless authentication to any application protected by the ForgeRock Identity Platform.  

As we’ve covered in a previous blog series, passwords are a weak spot that continues to give organizations IT cost overhead and security nightmares. The deployment of passwordless authentication provides increased security, lower operational costs, less downtime, and an enhanced user experience that results in improved productivity across the organization.

How Do We Do It?

With the integration of Secret Double Octopus, ForgeRock customers can improve security, create a more pleasant user experience for employees, and change the way IT departments handle user authentication.

The change starts at the workstation level – with a choice between Desktop Multi-Factor Authentication (MFA) using the ForgeRock app or a passwordless desktop experience that removes passwords altogether when logging in to Microsoft Windows, Apple Mac, or Linux workstations. With additional support for existing one-time password (OTP) tokens, offline scenarios, and FIDO2 keys, the workstation becomes the first step towards a passwordless enterprise.

The next change happens at the directory level, with a choice to use an existing Active Directory (AD) or Azure Active Directory (AAD) datastore, or to remove AD altogether and rely on the ForgeRock Directory Service as the source of user profile data for workstation authentication.

Organizations have the flexibility to adopt different scenarios based on their policies, preferences, and available technology. For instance, they can choose between the ForgeRock Authenticator, the Octopus Authenticator, or a combination of both. If the work environment does not allow mobile devices, they can use FIDO2 keys as a second factor, or they can use an offline OTP if users can’t access the internet.

Integrating Octopus Authentication with the ForgeRock Identity Platform eliminates the need to create, change, manage or remember passwords, saving many headaches and complexities for IT teams and users. This directly results in boosted uptime and productivity, as well as increased security, thanks to a universal user experience across all applications.

 Together, ForgeRock and Secret Double Octopus provide customers with a clear path to transition from costly and risky user-managed passwords toward a passwordless future. Organizations can now deploy a single authentication mechanism to serve all their needs in a frictionless, cost-efficient way through a known and trusted platform.

Want to create simple and secure access experiences that just flow? Find out more about passwordless authentication here.

 


KuppingerCole

Oct 20, 2020: KCLive Tools Choice: Privacy and Consent Management

Privacy isn’t a superficial requirement. It is a cultural change that is transforming the way individuals and enterprises treat data, and it impacts almost all industries, especially B2C. Everyone from end users to enterprises to hackers knows that identity information has value, and that if not protected it will be exploited. And yet collecting private information has become ubiquitous in online business as a way to deliver personalized services. Enterprise-grade tools to handle and safeguard end-user information existed before the wave of global privacy regulation hit, but the Privacy and Consent Management segment has experienced rapid growth and maturity since the GDPR went into effect in 2018.

Authenteq

Online Alcohol Marketplaces Use KYC and Age Verification to Keep Sales Up, While Keeping Minors Out

The magic of innovation means that once a technology, product or tool becomes mainstream, it’s difficult for people to remember a time without it. E-commerce and same-day delivery options are to be expected whether you’re buying clothing, household items, electronics, and now even groceries. While the online grocery store market has seen steady growth, COVID-19—as it did with many industries—dramatically shifted consumer behavior and, seemingly overnight, became the new norm. And yet, even for those grocery stores who were already operating online, there is one item on their shelves that presents a unique problem: alcohol. 

According to a RaboResearch report, online alcohol sales reached $2.6 billion in the US in 2019, up 22 percent. Despite this growth, the same report predicts that alcohol brands are losing billions of dollars in online sales opportunities and missing out on building digital relationships with their consumers.

The physical sale of alcohol is one area where the legislation is very clear. A legal purchasing age is set. Age verification is required. Fines are actively enforced. In Europe and the US, alcohol can be purchased in grocery stores, corner stores, specialty shops, and even in some pharmacies. These purchases require a face-to-face interaction solely because of the age verification step. But while good innovation can feel like magic, legislation often lags behind, as is the case with online alcohol sales. Add in a global pandemic, social distancing, and isolation, and suddenly what felt more like an exception becomes the rule.

Bier mich

In Germany, a system has been honed over the years that does in fact include the online purchase of alcohol. Wine or beer can be ordered and delivered to a private residence, but to receive the alcohol, a government-issued ID needs to be provided by the purchasing party and verified by the delivery person. Before COVID-19 hit, there were issues with deliveries being sent to locations like DHL boxes, which meant the identity verification step could not be completed. By and large, however, the system was working, and it was working well.

As COVID-19 spread, Germans, and citizens around the world for that matter, needed more space and less interaction. E-commerce and shipping providers began offering zero contact delivery in response. With no human exchange taking place at all anymore, what does this mean for alcohol sales? In the absence of updated nationwide regulations, the onus of how to navigate increased online alcohol sales has been left up to the individual shops. As it turns out, ensuring that alcohol is not being purchased and consumed by minors is much harder to regulate during a global pandemic.

More than money 

Underage drinking poses a number of risks. In the US, alcohol plays a role in over 180,000 injuries and 4,000 deaths annually. Selling alcohol online is more than a matter of profit or meeting economic demands; it is also a matter of corporate social responsibility. For any company that values protecting minors online and sells a product or service that could be classified within the vice industry (alcohol, tobacco, gaming and betting, adult websites), proper age and identity verification absolutely needs to be baked in.

Retailers across Europe and the US, both large multinationals and smaller brick and mortar shops, can easily integrate a KYC solution into their online systems. Rather than implementing a basic age checkbox, an automated identity and age verification system can protect your business and your customers.
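As a minimal illustration of the difference between a checkbox and a real check: once a KYC flow has extracted a verified date of birth from an ID document, the shop can compute the age itself. The helper below is a hypothetical sketch, not Authenteq's API; legal purchasing ages vary by jurisdiction and product.

using System;

static class AgeCheck
{
    // Compute age from a verified date of birth (e.g. extracted from an ID document)
    // and compare it against the applicable legal purchasing age.
    public static bool IsOfLegalAge(DateTime dateOfBirth, int legalAge, DateTime today)
    {
        int age = today.Year - dateOfBirth.Year;
        if (dateOfBirth.Date > today.AddYears(-age)) age--; // birthday not yet reached this year
        return age >= legalAge;
    }
}

// Usage: gate checkout on the verified date of birth rather than a self-declared checkbox.
// bool allowed = AgeCheck.IsOfLegalAge(verifiedDateOfBirth, 18, DateTime.UtcNow.Date);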

Authenteq allows you to verify your customers and their identity through any channel without compromising their privacy or security. The omnichannel solution is fully automated and provides verification within minutes. With desktop and mobile solutions, it can be easily integrated into the sales process without channel breakage to keep your customer retention and completion rates high. Of course, a KYC tool with manual verification of online sales is better than no KYC at all, but with the expected immediacy of online purchases, manual intervention is both costly and slow.

By implementing an identity and age verification process like Authenteq’s into your own solution, you can unlock new growth potential in a safe and secure way for all.

Looking ahead

COVID-19 has taught us many things, but perhaps one of the biggest lessons so far is how adaptable companies and consumers must be. Consumer buying behavior evolves over time, but it can also change overnight.

As e-commerce sales continue to grow across all sectors, the barriers to entry are, for now, higher for alcohol marketplaces. Those that see these barriers as a reason not to enter the space at all will face profound consequences. Participating in the online ecosystem is necessary for brands to stay connected to their customers, but brands that also consider and incorporate their own corporate social responsibility goals into that online presence are much better poised to respond to changes, and to flourish while doing so.

Are you ready to incorporate KYC into your own corporate social responsibility goals? Connect with one of our sales experts today to learn how Authenteq can help you reach them.

The post Online Alcohol Marketplaces Use KYC and Age Verification to Keep Sales Up, While Keeping Minors Out appeared first on Identity Verification & KYC | Authenteq.


KuppingerCole

SAP Cloud Identity Access Governance

by Martin Kuppinger

SAP Cloud Identity Access Governance (IAG) is the SaaS solution provided by SAP for managing access risks and SoD controls from the cloud, for both SaaS business applications and a range of on-premises services. It covers areas such as Access Analytics, Role Management, Access Requests, Access Reviews, and Privileged Access Management for these environments. SAP Cloud IAG can run independently of SAP Access Control, but also integrates neatly with that solution.


Buyer’s Compass: Access Management

by Richard Hill

Access Management capabilities are well-established in the broader scope of IAM and are continuing to gain traction due to emerging requirements for integrating business partners and customers. This KuppingerCole Buyer’s Compass will provide you with questions to ask vendors, criteria for selecting your vendor, and requirements for successful deployments. This document will help prepare your organization to conduct RFIs and RFPs for Access Management.


MyKey

MYKEY, the First Smart Wallet on Tron Starts a Public Beta, Come to Claim Airdrop

MYKEY, the First Smart Wallet on TRON Starts a Public Beta, Come to Claim Airdrop

Hi everyone,

From now on, MYKEY officially supports TRON and starts a public beta, airdropping 80,000 TRX to users. TRON is the third blockchain supported by MYKEY after EOS and Ethereum.

The TRON protocol is one of the world’s largest blockchain-based decentralized application operating system protocols, providing high-throughput, high-scalability, and high-reliability underlying public blockchain support for the decentralized applications running on it. As the world’s first multi-chain smart wallet, MYKEY will provide users with free TRON accounts and convenient wallet services now that it supports TRON, helping users make use of financial management, games, and other rich applications on TRON more conveniently.

Simple and Barrier-Free

MYKEY wraps the TRON gas billing system so that users do not need to own TRX or actively manage TRON's native energy and bandwidth.

Transfer for free

Users who enable TRON accounts can make transfers in MYKEY for free for a limited time (the free-use policy is subject to the latest official announcement).

Accounts can be recovered

Based on the KEY ID contract design, a user's TRON account in MYKEY can be recovered through an emergency contact if access to the account is lost.

The first smart wallet on TRON

MYKEY, the world’s first multi-chain smart wallet, is also the first smart wallet on TRON.

Airdrop 80,000 TRX for beta users

1. Entrance: [Home] — [Enable TRON Account] in MYKEY
2. Period: September 8, 2020, 12:00 to September 15, 2020, 12:00 (UTC+8); enable a MYKEY TRON account during the activity to get 10 TRX.
3. Number of rewards: limited to the first 8,000 users, and each user can get 1 reward. The rewards will be automatically distributed to your TRON account within 3 working days after the activity ends.
4. Add a MYKEY Assistant on WeChat: send "tron" to any MYKEY Assistant to join the MYKEY TRON Internal Test Community. If you have not added any MYKEY Assistant on WeChat, please add MYKEY №6 Assistant: mykeytothemoon6; more benefits are waiting for you in the community.

Security Audit

The contract code of the KEY ID protocol has passed the security audit and formal verification of the leading security firm SlowMist Technology. For details, please refer to GitHub.

Bounty Program

MYKEY security bounty program is ongoing for the long term. For details, please click: https://slowmist.io/en/mykey/

Some Exchange Restrictions

The TRON account in MYKEY is a contract account, and currently only some Exchanges support deposits from contract accounts (Binance and Huobi; Huobi only supports TRC20). Users can choose these Exchanges for depositing and trading (withdrawals from an Exchange to a MYKEY account are not affected).

If users inadvertently deposit to an Exchange that does not support smart contract deposits, they can submit a support ticket to the Exchange; in most cases the deposit will be processed manually. If it is not properly handled, users can contact MYKEY for further assistance.

If an Exchange already supports contract deposits but is not listed here, it can submit a PR on GitHub.

We believe smart wallets will be an important product in bringing blockchain into ordinary households. Consensus on smart contracts needs to be established by the whole industry, and MYKEY will work with partners to encourage Exchanges to support smart contract deposits.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY, the First Smart Wallet on Tron Starts a Public Beta, Come to Claim Airdrop was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 07. September 2020

KuppingerCole

Vickus Meyer: Nedbank Identity & Access Governance Fundamentals in Action




Panel Discussion - The Road to Enterprise Identity Success




Morey J. Haber: 10 Steps to Universal Privilege Management

Virtually every cybersecurity breach today involves the exploitation of privileged access. Privileges are initially exploited to infiltrate an IT environment; once compromised by threat actors, privileges are further leveraged to move laterally, access assets, install malware, and inflict damage.

In this session, learn 10 key steps to achieving Universal Privilege Management, and how it is used to secure every user, session, and asset across your IT environment. Covered topics include: 

Why relying on password management alone leaves dangerous gaps in protection
Disrupting the cyberattack chain with privileged access security controls
Essential steps to achieving rapid leaps in risk reduction
Keys to a frictionless PAM solution that is invisible to end users

We will also share how the BeyondTrust Privileged Access Management (PAM) platform enables absolute control over every privilege in your environment to drastically reduce your attack surface and windows of exposure, while boosting business productivity.




Neeme Vool: Implementing IAM in the Enterprise: 3 Takeaway Gems from Engineer

IAM implementations are not all the same, but there are certainly not as many implementations as there are situations. I have selected 3 major factors that defined our IAM project. Of course, the final result was a consequence of many more things.

And yes, we succeeded in implementing the full identity lifecycle in an enterprise whose starting point was a complex matrix of requirements: multiple legal entities, multiple contracts, different kinds of work relationships, several account directories, and manual processes.

Name any situation; we had it. We were on the edge of failure, almost ready to add yet another failed IAM project to the list. But we made it, and if I had to do it again in another place and another situation, I would take these 3 with me.

Takeaways:

-Correct data is vital for every IT project; IAM is no exception. But where exactly should you look in an IAM project? Which data issues are toxic?

-The analysis of the system drives the implementation architecture and design, so the steering must be correct.

-And finally, how do you roll it out at enterprise scale if you cannot do a big bang, but at the same time cannot leave anybody behind?




Mike Kiser: 13 Treasures in 81 Minutes: The Isabella Stewart Gardner Heist and Identity as the New Vermeer

In the early hours of March 18th, 1990, two men entered the Gardner Museum. They left 81 minutes later with 13 artworks, including two Rembrandts, a Vermeer, a Degas, and an ancient Chinese vase. The heist remains unsolved today, with no leads and no suspects — and the museum is offering a $10 million prize for the safe return of the pieces.

Given that background, you might assume that this was another session about zero trust. It’s not.

Recently, a growing emphasis on data privacy has sought to treat identities and their associated data as valuable works of art as well, worthy of protection and compensation for use. Through the lens of the Gardner Theft, we’ll evaluate the current proposals and concepts around user-owned data and explore the benefits and pitfalls of each.

We'll step through the heist, recreate those 81 minutes, and discover how identity data is the new Vermeer. Note: If we somehow crack the case together, we'll split the $10 million between us all.




Henk Marsman: Moving From the Dark Age of Legacy to the Era of Enlightenment

In this presentation, Henk will share the journey Rabobank made from its situation in 2017, with two IAM solutions, two infrastructure environments, and two teams, which merged and set out to become one, while also overcoming the legacy environments that delivered the service. The presentation focuses especially on the management of this journey and how to move from A to B to C to D to Enlightenment. And perhaps we're not even there yet.

The presentation will detail our specific journey, but general key takeaways can be identified that apply to any IAM department and service.




Joint Session: Demonstration of the Integrated Approach




Loren Russon: Good Enough is Never Enough When Protecting Your Business Resources & Customer’s Data

Technology is evolving quickly and keeping pace requires deep knowledge and experience. Enterprises are also evolving quickly and demand advanced but simple identity solutions to successfully fast track digital transformation, cloud adoption and Zero Trust initiatives. By utilizing “Best of Breed” solutions, organizations can take advantage of the key benefits that only a multi-vendor solution can offer. Join this session to learn about the core principle of best of breed solutions and hear about some examples of what organizations have done to build the right foundation for Enterprise Identity Success.




Matthias Reinwarth: The Three Fundamentals of Enterprise Identity Success - My Take




procivis

Summary: Study on the Use of Blockchain in the Cantonal Administration

By Dr. rer. publ. Rolf Rauschenbach and MSc ETH Sven Stucki

On 25 April 2018, the Government Council adopted the "Digital Administration" strategy. It sets out how the administration intends to shape digital development and seize the opportunities of digitalization. Part of the strategy is a stimulus program of digitalization projects that are tackled with priority and in close coordination. Under Goal 1, "Simplification and expansion of digital services", the stimulus program also includes the project "IP1.5: Study on the use of blockchain technology". Procivis was entrusted with preparing this study. We are publishing the summary of the study here; the link to the full study can be found at the end of this blog post.

Summary

A blockchain is a sequence of data records, called blocks, that are firmly linked to one another by cryptographic methods. Representing information as a chain of blocks yields the coinage "blockchain". Linking the blocks by means of hashes makes it possible to condense the entire history of a blockchain into its most recent block, which produces a tamper-proof audit trail.
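As a minimal sketch of this chaining, each block's hash is computed over its payload together with the hash of its predecessor, so tampering with any earlier block changes every later hash. This is illustrative C# only; real blockchains hash structured block headers rather than plain strings.

using System;
using System.Security.Cryptography;
using System.Text;

class HashChainSketch
{
    // Each block's hash covers the previous hash, so the newest hash
    // condenses the entire history of the chain.
    static string NextBlockHash(string previousHash, string payload)
    {
        using (var sha256 = SHA256.Create())
        {
            byte[] digest = sha256.ComputeHash(Encoding.UTF8.GetBytes(previousHash + payload));
            return BitConverter.ToString(digest).Replace("-", "");
        }
    }

    static void Main()
    {
        string hash = "GENESIS";
        foreach (var payload in new[] { "block 1 data", "block 2 data" })
            hash = NextBlockHash(hash, payload);
        Console.WriteLine(hash); // changes completely if any earlier payload is altered
    }
}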

Blockchains can be designed in different ways. First, they are differentiated by the degree of access rights (distinguishing read and write permissions). Second, they differ in their consensus rules, which determine who may compute and publish the next block. There is a logical dependency between access rights and consensus rules. Blockchains that grant unrestricted read and write access normally apply the proof-of-work consensus rule, in which the computing power of the individual participants is the decisive factor. The best-known examples of this kind of blockchain are Bitcoin and Ethereum. Such blockchains have proven robust in terms of security; their biggest drawback is that scaling, that is, the ability to process large volumes of transactions, is limited. They also consume large amounts of energy. At the other end of the blockchain spectrum are private blockchains, where both read and write permissions are restricted to a select group. Here, scaling is not a particular problem, and the energy consumption of such blockchains is negligible. However, due to their lack of transparency and decentralization, they do not offer the same advantages as public blockchains.

Because blockchains are dynamic in character, that is, they represent the evolution of individual data points over time, it is possible to program transactions on the blockchain that depend on conditions. This is done by means of smart contracts, which ensure that transactions are executed as soon as a given condition is fulfilled. This makes it possible to automate even complex transactions to a large extent.
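Schematically, a smart contract binds an action to a condition. The sketch below uses hypothetical names, with C# standing in for an actual contract language; it executes a transfer exactly once, as soon as its condition is observed to hold.

using System;

// Schematic sketch of the smart contract idea: a transaction bound to a condition.
class EscrowSketch
{
    private readonly Func<bool> _condition;
    private readonly Action _transfer;
    private bool _executed;

    public EscrowSketch(Func<bool> condition, Action transfer)
    {
        _condition = condition;
        _transfer = transfer;
    }

    // Called on each new block; the transfer runs exactly once,
    // as soon as the condition holds.
    public void Tick()
    {
        if (!_executed && _condition())
        {
            _transfer();
            _executed = true;
        }
    }
}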

Blockchain technology was originally developed to provide an infrastructure that does not require a central authority. This raises the question of the extent to which it makes sense for the state to use this technology, since the state, as the holder of the monopoly on physical force, is the very embodiment of a central authority. In the case of an ideal state that operates efficiently and properly, applying blockchain technology adds complication; central databases suffice. A special area of application is democratic elections and referendums, in which citizens want their voting secrecy preserved while also being certain that their political will is correctly reflected in the result. This would be possible on public blockchains, albeit at the price of making it impossible to destroy the voting records later.

States deviate from the ideal in that they exhibit structures and processes that result in a more or less efficient use of resources. Moreover, office holders and state employees may behave more or less in conformity with the law, or make mistakes. Blockchains can mitigate the resulting inefficiencies by providing a foundation for data consistency. Building on this, processes can be further digitalized and automated, and (unintentional) human error can be better controlled. Blockchains do not prevent unlawful behavior per se, but an audit trail that can be inspected at any time and cannot be falsified increases the pressure to behave correctly.

The benefits of a blockchain solution can only unfold if the corresponding framework conditions are in place. Every change process requires political will and leadership. This applies all the more to potential blockchain applications, because their implementation has consequences that go far beyond technical aspects. In addition to organizational and cultural questions, legal questions in particular must be considered: first, whether the necessary legal basis exists for a given blockchain application; second, whether it complies with data protection law; and third, whether legally compliant archiving is possible. The constitutionally guaranteed rights, namely the protection of personal data, the right to rectification and erasure, and the right to remembrance (rights that can compete with one another), must be taken into account.

In the wake of the general enthusiasm for blockchain technology in general and cryptocurrencies in particular, hundreds of concepts have been formulated worldwide and pilot projects for blockchain applications in public administrations have been carried out. However, no productive applications with significant transaction volumes are known yet. This finding also applies to Switzerland: within the scope of this study, eight blockchain projects were identified in which cantonal administrations and the Principality of Liechtenstein have gathered or are gathering initial experience in the areas of registers, electronic identities, e-voting, and payment of fees. Some of these projects have been completed, while others are still under way or in preparation.

Based on process descriptions drawn up as part of the project «Strategie Datenmanagement und Data-Governance IP3.1», four possible use cases for the Canton of Zurich were developed in the present study:

Registration of official documents on the blockchain to increase transparency
Citizen dossier with a blockchain-based logbook to strengthen legal certainty
Inventory control on the blockchain to increase transparency
Public tenders on the blockchain to increase transparency

These examples vary in their degree of maturity, as do the difficulties of any eventual implementation. What they all have in common, however, is that they exploit the immediate and tamper-proof transparency of blockchains. Central databases are not designed to offer transparency to a large circle of users in a dynamic way. Such a function can be replicated in central architectures, but blockchains solve the problem more elegantly. In addition, decentralized architectures such as those of blockchains tend to enjoy a head start in trust that centrally designed solutions cannot match.

Read the full study at:

https://www.zh.ch/de/politik-staat/kanton/kantonale-verwaltung/digitale-verwaltung/digitalisierungsprojekte.html#1605833167

The post Summary: Study on the Use of Blockchain in Cantonal Administration appeared first on Procivis.


MyKey

MYKEY Weekly Report 15 (August 31st~September 6th)

Today is Monday, September 7, 2020. The following is the 15th issue of the MYKEY Weekly Report. In last week's work (August 31st to September 6th), there are three main updates:

1. The sixteenth MYKEY Crypto Stablecoin Report was published

We release the MYKEY Crypto Stablecoin Report every week to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay updated on stablecoins. The sixteenth Crypto Stablecoin Report was published on September 3rd; click to read: https://bit.ly/3jDZHbk

2. MYKEY Lab added 1 billion KEY TOKEN to the “multiple chains exchange pool”

Due to the demand for KEY TOKEN on multiple chains, MYKEY Lab locked 1 billion KEY TOKEN on Ethereum into the exchange pool on September 1. For details, click to read: https://bit.ly/3jwvaw2

3. The Open Finance Conference concluded on Saturday

In the fourth week of the Open Finance Conference, a series of panels about Investment Funds was successfully held.

!!! If you encounter any abnormal situation while using MYKEY, do not uninstall the MYKEY app; please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to back up the 12-word recovery phrase via [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 15 (August 31st~September 6th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 06. September 2020

KuppingerCole

KuppingerCole Analyst Chat: Business Resilience Management

Warwick Ashford and Matthias Reinwarth discuss the prerequisites and challenges of making a business able to adapt quickly to risks and disruptions.  




Friday, 04. September 2020

Evernym

SSI Roundup: September 2020

Below is a copy of our August/September 2020 newsletter, The SSI Roundup. To get the best SSI headlines, events, and resources sent straight to your inbox each month, subscribe below:   The Self-Sovereign Identity Roundup: September 2020 Welcome back to another edition of The SSI Roundup. Today, we’re exploring how COVID-19 accelerated the adoption of […]

The post SSI Roundup: September 2020 appeared first on Evernym.


SELFKEY

SelfKey Progress Report for August 2020

We are proud to bring you the progress report for the month of August, a month filled with major announcements and updates.

The post SelfKey Progress Report for August 2020 appeared first on SelfKey.


KuppingerCole

Status and Advantages of Small Data Machine Learning Paradigms

by Anne Bailey

Consider the relationship between Machine Learning (ML) and data consumption – is more always better? This Leadership Brief discusses the possible benefits of Small Data for ML, some technological approaches to get there, why you should still be cautious of Small Data, and recommendations on applying this practically.


Ontology

ONT & ONG Now Listed on UniSwap to Support All Types of DeFi Products on Ethereum

Following up on Ontology’s successful completion of the development of its open-source DID smart contract on the Ethereum network, both ONT and ONG can now be swapped to eONT and eONG on the Ethereum blockchain, as cross-chain liquidities on Ethereum. eONT and eONG are also listed on the UniSwap platform, supporting all types of DeFi products in the Ethereum ecosystem.

This move makes Ontology the first mainstream public blockchain to complete cross-chain communication with Ethereum. Digital assets on the Ethereum blockchain can now be swapped to the Ontology blockchain, and bi-directional cross-chain communication is enabled between Ontology and Ethereum. Check out eONT and eONG on Uniswap now!

🔥eONT: https://uniswap.info/pair/0x34852c03e1359fc2ad24f24eb429e30907962b23

🔥eONG: https://uniswap.info/pair/0x55c7c6569f53d4a4e198602ec84c4dfc7c605058

eONT official contract address:

0x519020fa558A52df57854135345C28024A596b68

eONG contract address:

0x6A4C89Eb9a26a2Da34F13f8976dAA9fD7526F35c

As Ontology completes DID smart contract development on Ethereum, Ontology's bespoke decentralized identity solutions, defined in this instance by the addition of a new smart contract method did:etho:, can now be used not only across the expansive Ethereum network, including a range of popular DeFi applications, but also on most chains that run the Ethereum Virtual Machine (EVM). Ontology aims to make its Decentralized Identity (DeID) solutions available across any chain, as part of its ongoing effort to achieve full cross-chain functionality.

Successfully completing the cross-chain communication of digital identity and digital assets with Ethereum is another milestone for Ontology. It follows the recent launch of Ontology Mercury, a trusted, Decentralized Identifier (DID) based peer-to-peer communication framework; the official release of Ontology's new Decentralized Identity Solutions, designed to help crypto holders and traders manage and exchange digital assets in a more secure and streamlined manner; and a recent collaboration with NEAR Protocol, Elrond Network, and Waves Platform, among others, to advance the development of secure decentralized identity solutions.

Ontology’s credit and decentralized identity solutions will be applied to more blockchain ecosystems to further redefine trust.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

ONT & ONG Now Listed on UniSwap to Support All Types of DeFi Products on Ethereum was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 03. September 2020

Forgerock Blog

ForgeTalks: A Local's Tour of the ForgeRock Identity Platform

Welcome back to ForgeTalks. In last week's episode, ForgeRock's VP of Product Management, Mary Writz, took me on a tour of the main landmarks of the ForgeRock Identity Platform, including Intelligent Access and ForgeRock Go. This week, we are treated to a local's tour of the platform. We'll travel "off the beaten road" and explore some of the hidden gems that the ForgeRock Identity Platform has to offer.

We'll be exploring:

How the ForgeRock Identity Platform makes it easier to develop applications
How Macaroons fix problems around Fine-Grained Scopes and Delegation
How ForgeRock makes Identity for Things easy!

I hope you enjoyed this two-parter with Mary. Make sure you stop by next week when I meet with ForgeRock's VP of Product Marketing, Ashley Stevenson, who unravels the question: "What is Single Sign-on?" And if you want to watch any of the other episodes you can check them all out here.


Smarter with Gartner - IT

How Data and Analytics Leaders Can Master Governance

Forty-two percent of data and analytics leaders do not assess, measure or monitor their data and analytics governance, according to a recent Gartner survey. Those who said they measured their governance activity mainly focused on achieving compliance-oriented goals.

Good data and analytics governance enables faster, smarter decisions. Organizations that want to improve the quality of their data often begin with data and analytics governance projects. 

Companies start data and analytics governance initiatives to drive better information behaviors through their policies. These policies help maximize the investment that organizations have made not just in data and analytics but also in content (pictures, voice recordings, emails, etc.) coming from AI and IoT, for example. However, governance practices continue to be data-oriented rather than business-oriented. 

CDOs and data and analytics leaders must ensure that their governance initiatives have concrete, measurable metrics that link data and analytics assets and initiatives to business and stakeholder value. For example, tie customer contact data quality to the percentage of customer retention in a specific market segment or percentage of revenue achieved via ecosystem partners. 

Involve the broader organization in data governance

Data quality is not solely the job of the IT organization. Data governance work must rally stakeholders to the cause, and IT and the business must be clear on the roles they play. The business decides expectations for data quality, but the business also needs to understand that IT does not own data governance and is not responsible for data quality.

The key to resolving this challenge is for data and analytics leaders and CDOs to connect all governance activity specifically and directly with business outcomes and priorities.

The post How Data and Analytics Leaders Can Master Governance appeared first on Smarter With Gartner.


One World Identity

Accelitas: Reimagining Financial Access

Accelitas National Sales Manager Jimmy Williams joins State of Identity to discuss the expanding definition of alternative data, its applicability for both financial and non-financial use cases, and the future of establishing trust in a digital world.


KuppingerCole

Oct 27, 2020: Reduce Dependency on Active Directory With Cloud Identity

When it comes to identity management, many companies depend heavily on Microsoft Active Directory (AD). This high degree of dependency on one service can become a problem when that service faces an outage. Also, depending on a company's requirements, every service has certain limitations that sometimes need to be overcome. Many cyberattacks are perpetrated via staff endpoint devices (computers, smartphones or even printers). This necessitates a fine-meshed risk management approach with a centralized solution, which KuppingerCole calls an Identity Fabric.

Trinsic (was streetcred)

Trinsic Basics: What Are Decentralized Identifiers (DIDs)?

“Identifiers” are how you are identified as a unique person and recognized as that same unique person over time. With other people, your primary identifier is your face. With organizations, it might be a student ID, social security number, or driver’s license number. Online, it might be a username, phone number, or email address.

Using identifiers properly helps to avoid confusing two people with one another. But identifiers can also be harmful—if a hacker gets ahold of your social security number, it could mean trouble!

Most identifiers are given to us by centralized registration authorities like governments, telephone companies, and email providers. But that puts an organization in between us and our ability to access basic services, compromising privacy and putting individuals in a position of powerlessness. The answer to this problem is a W3C standard called Decentralized Identifiers (DIDs).

A new type of identifier

DIDs are “a new type of globally unique identifier”¹ that is used in self-sovereign identity (SSI). It looks like this (the W3C's canonical example): did:example:123456789abcdefghi

You’ll notice it has three parts:

Scheme: DIDs all start with did: just like how all website addresses start with http:// . This is so that computers can tell what they're looking at.
Method: After DIDs are generated, they have to be stored somewhere. They're often stored in a blockchain or a digital wallet, but they can be stored anywhere. The method, which is the example: part of the DID in the example above, tells computers where to go to find the DID. For example, issuers of verifiable credentials in the Trinsic platform get a DID that has a did:sov: method, because the DID goes on the Sovrin Network.
Identifier: This is the unique identifier of the DID.
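As a rough illustration of this three-part structure, a simplified parser might look like the C# sketch below. This covers only the basic scheme:method:identifier split, not a full implementation of the W3C DID syntax.

using System;

class DidParserDemo
{
    // Split a DID into its scheme, method, and method-specific identifier.
    static (string Scheme, string Method, string Identifier) ParseDid(string did)
    {
        string[] parts = did.Split(':', 3);
        if (parts.Length != 3 || parts[0] != "did")
            throw new FormatException("not a DID: expected did:<method>:<identifier>");
        return (parts[0], parts[1], parts[2]);
    }

    static void Main()
    {
        var (scheme, method, id) = ParseDid("did:example:123456789abcdefghi");
        Console.WriteLine($"scheme={scheme} method={method} identifier={id}");
        // A DID anchored on the Sovrin Network would use the method "sov" instead.
    }
}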

According to the W3C, a DID has the following four characteristics²:

Decentralized: There should be no central issuing agency. That means you should be able to create your own DIDs independently of anyone else.
Persistent: The identifier should be inherently persistent, not requiring the continued operation of an underlying organization. This is in contrast to your email, for example; if Gmail shuts down, you'd lose access to your Gmail messages.
Cryptographically verifiable: It should be possible to prove control of the identifier cryptographically.
Resolvable: It should be possible to discover metadata about the identifier.

How are DIDs used?

As identifiers in the Trinsic system, DIDs serve two primary functions:

Publicly identifying credential issuers
Privately identifying relationships (connections)

Credential issuers need to be publicly identifiable if they want the credentials they issue to be verifiable. We use a blockchain to post the issuers’ DID, which acts as a public key. The issuer signs the credential attributes with the corresponding private key.
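A minimal sketch of that sign-and-verify relationship is shown below in C#. It uses .NET's built-in ECDSA purely for brevity; real SSI stacks use their own signature schemes and credential formats, and the credential payload here is invented for illustration.

using System;
using System.Security.Cryptography;
using System.Text;

class IssuerSignatureDemo
{
    static void Main()
    {
        // Issuer side: the private key never leaves the issuer; the public half
        // is what gets published (e.g., anchored on a ledger) next to the issuer's DID.
        using var issuerKey = ECDsa.Create(ECCurve.NamedCurves.nistP256);
        byte[] credential = Encoding.UTF8.GetBytes("{\"name\":\"Alice\",\"degree\":\"BSc\"}");
        byte[] signature = issuerKey.SignData(credential, HashAlgorithmName.SHA256);

        // Verifier side: only the public parameters are needed to confirm the
        // attributes were signed by the issuer and have not been tampered with.
        using var verifier = ECDsa.Create(issuerKey.ExportParameters(includePrivateParameters: false));
        Console.WriteLine(verifier.VerifyData(credential, signature, HashAlgorithmName.SHA256)); // True
    }
}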

Each relationship in the Trinsic system is called a connection. When you make a connection, you create a DID just for that relationship and share it with the counterparty. When you want to send messages, you encrypt them so that only the counterparty can decrypt the message using the DID you shared with them. This creates a super secure, peer-to-peer channel to interact. The identifiers for these connections are kept privately in your (and the counterparty’s) wallet.
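The following C# sketch illustrates the underlying idea of such a pairwise channel, assuming plain Diffie-Hellman key agreement plus symmetric encryption. Real agent-to-agent messaging layers standardized envelope formats on top of this, so treat it as a conceptual sketch only.

using System;
using System.Security.Cryptography;
using System.Text;

class PairwiseChannelDemo
{
    static void Main()
    {
        // Each party holds a keypair created just for this relationship and
        // shares only the public half with the counterparty.
        using var alice = ECDiffieHellman.Create(ECCurve.NamedCurves.nistP256);
        using var bob = ECDiffieHellman.Create(ECCurve.NamedCurves.nistP256);

        // Both sides derive the same symmetric key from their own private key
        // and the counterparty's public key.
        byte[] sharedKey = alice.DeriveKeyMaterial(bob.PublicKey); // 32 bytes

        using var aes = Aes.Create();
        aes.Key = sharedKey;
        aes.GenerateIV();

        byte[] plaintext = Encoding.UTF8.GetBytes("hi Bob, it's Alice");
        byte[] ciphertext = aes.CreateEncryptor().TransformFinalBlock(plaintext, 0, plaintext.Length);

        // Bob derives the identical key and decrypts (the IV travels with the message).
        aes.Key = bob.DeriveKeyMaterial(alice.PublicKey);
        byte[] decrypted = aes.CreateDecryptor().TransformFinalBlock(ciphertext, 0, ciphertext.Length);
        Console.WriteLine(Encoding.UTF8.GetString(decrypted)); // hi Bob, it's Alice
    }
}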

If you are interested in learning more about DIDs, feel free to contact us or read the W3C’s DID specification at https://www.w3.org/TR/did-core/.

Notes

1. This phrase comes from https://www.w3.org/TR/did-core/.
2. These four characteristics and descriptions come straight from https://w3c.github.io/did-use-cases/.

The post Trinsic Basics: What Are Decentralized Identifiers (DIDs)? appeared first on Trinsic.


SELFKEY

SelfKey Weekly Newsletter – Mobile Wallet V 0.3.22 is here

SelfKey Weekly Newsletter

Date – 02nd September, 2020

In this edition, read all about the updated Mobile Wallet and more.

The post SelfKey Weekly Newsletter – Mobile Wallet V 0.3.22 is here appeared first on SelfKey.


MyKey

Crypto Stablecoin Report 16: The connection between stablecoins and real assets

Original link: https://bihu.com/article/1315944646

Original publish time: September 1, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We release the MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay updated on stablecoins. The MYKEY Crypto Stablecoin Report is published every week; we look forward to maintaining communication with the industry and exploring the development prospects of stablecoins together.

Quick Preview

Last week, the market capitalization of major stablecoins increased by $601 million to $16,556 million.
Tether additionally issued 450 million USDT on Tron, in two batches.
On August 29, China Construction Bank briefly opened its digital currency wallet business to the public. The digital currency wallet of CCB supports features such as digital RMB bank card recharge, transfer, QR code receipt and payment, NFC one-touch payment, and cancellation.
MakerDAO stated that there had been applications from companies in the fields of treasury bonds, real estate, gold and commodity indexes, and supply chain finance to use assets as collateral for DAI.
Aave applied for an Electronic Money Institution license from the UK's Financial Conduct Authority in 2018, which was approved in July this year. Aave will continue to apply for other licenses.
Tether additionally issued 10 million USDT on EOS on August 29 due to increased DeFi activity on EOS.

1. Overview of Stablecoin Data

First, let's review the changes in the basic information of the various stablecoins over the past week (August 22, 2020 ~ August 28, 2020; the same period applies below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market circulation of major stablecoins has increased by $601 million to $16,556 million.

Source: MYKEY, Coin Metrics

In the past week, Tether additionally issued 450 million USDT on Tron, in two batches. The circulation of USDC, PAX, TUSD, HUSD, DAI, and GUSD increased by 86.48 million, 17.97 million, 3.01 million, 1.24 million, 15.06 million, and 1.28 million respectively, while the circulation of BUSD decreased by 980,000.

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of main stablecoin holding addresses on Ethereum decreased by a net 101,991 (the balance of the increases and decreases listed below).

Source: MYKEY, DeBank

The numbers of holding addresses of TUSD and DAI increased by 645 and 3,860 respectively. The numbers of holding addresses of USDT, USDC, and PAX decreased by 88,075, 18,249, and 172 respectively.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week increased by an average of 4.75% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins increased by an average of 3.13%.

The Number of 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins last week decreased by an average of 6.85% from the previous week.

2. The connection between stablecoins and real assets

On August 29, China Construction Bank briefly opened its digital currency wallet business to the public. Through the 'digital currency' feature in the China Construction Bank app, users could pay with digital RMB, China's CBDC. The CCB wallet supports features such as digital RMB bank card recharge, transfers, QR code receipt and payment, NFC one-touch payment, and account cancellation. CBDC is equivalent to legal tender and cannot be refused, and Alipay and WeChat may also support digital RMB in the future; since a CBDC is itself a kind of stablecoin, it is foreseeable that it will be used by many people as a supplementary payment method. In this report, we introduce the connection between stablecoins and real assets.

Since the beginning of this year, MakerDAO has added USDC, WBTC, and other assets as collateral. Last week, MakerDAO stated that over the next one to three years, Maker's focus will be on introducing physical assets as collateral for DAI. Companies in the fields of treasury bonds, real estate, gold and commodity indexes, and supply chain finance have applied to use their assets as collateral for DAI. The companies and assets under application include:

Real estate mortgage company New Silver implements asset tokenization using the Centrifuge protocol (NS-DROP).
The trust company Paxos issued the ERC20 gold token PAX GOLD on Ethereum (PAXG).
Arca issued digital securities on Ethereum in the form of ARCoin (corresponding to 1-year, 2–10 year, and 10-year US government bonds).
Supply chain finance company Harbor uses the Centrifuge protocol to initiate asset tokenization (HTC-DROP).
WiV Technology launched a Tokenized Wine Commodity Index Fund on Ethereum (WiV).

These assets are still only under application as collateral for DAI. Even without the introduction of real assets, DAI already has 443 million in circulation, and the market capitalization of MKR has reached $676 million.

Aave is a decentralized lending protocol, managed by AaveDAO. As early as 2018, Aave applied for an Electronic Money Institution license from the UK's Financial Conduct Authority, and it was finally approved recently. This gives Aave the same standing in Europe as Coinbase and Revolut: it can issue electronic money alternatives and provide payment services. Aave users can buy stablecoins and other assets in the Aave ecosystem with legal currency. Stani Kulechov, the founder of Aave, stated that Aave entities will continue to apply for other licenses to make it easier for new users to join the Aave ecosystem.

Telos is a smart contract platform running on EOSIO, supporting the creation of decentralized applications and decentralized autonomous organizations. At the end of August, Telos announced a partnership with Katalyo to tokenize $35 million worth of Croatian real estate. Katalyo will issue two tokens for each property: one token represents ownership of the property, and the other is a stablecoin pegged to legal currency and generated by rental income.

As a global payment company, VISA is committed to providing the greatest value to individuals, businesses, and economies, and digital currency gives VISA an opportunity to continue expanding its range of services. Digital currencies backed by legal currency (stablecoins) combine the stability of currencies such as the U.S. dollar with the advantages of digital currency, making them a promising new payment method. The concept of stablecoins has gradually attracted attention beyond the financial technology field, including from financial institutions and central banks. Consumers and businesses are also using digital currencies, leading to rapid growth in their circulation. VISA cooperates with regulated and approved digital currency platforms such as Coinbase and Fold to bring blockchain services to its existing 61 million merchants. More than 25 digital currency wallets have connected to VISA, giving users an easy way to use VISA debit or prepaid cards to spend from their digital currency balances wherever VISA is accepted. VISA is also committed to cooperating with global policymakers and organizations to help everyone better understand digital currencies.

According to a tweet from Sygnum on August 27, Galaxus, the largest online retailer in Switzerland, has started accepting Sygnum Bank's stablecoin, the Digital Swiss Franc (DCHF), for e-commerce payments. DCHF was launched in March 2020 and is pegged to the Swiss franc. Sygnum claims to be the first licensed bank in Switzerland to issue a stablecoin; it holds a digital asset banking license from the Swiss regulator FINMA.

The use of stablecoins outside the cryptocurrency market is a clear trend: not only are companies in the blockchain industry actively attracting outside participants, but traditional financial institutions are joining as well. It is foreseeable that stablecoins will enter our daily lives.

Tips

To better communicate with industry insiders, we have decided to add two sections: readers' questions and guests' opinions. If you have questions about stablecoins, please contact us, and we will pick meaningful questions to answer in the next issue. We also welcome guests from the industry to share their views on stablecoins. Contact: jianghb@mykey.org.

That is all for this MYKEY Crypto Stablecoin Report; stay tuned for the follow-up reports, in which we will provide more interpretations of the development status of stablecoins and analysis of their development trends to help you stay updated.

PS: MYKEY Lab retains the final right to interpret the content of this article; please indicate the source when quoting. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

MYKEY Crypto Stablecoin Report 01: USDT continues to gain momentum as market capitalization exceeding $10 billion

MYKEY Crypto Stablecoin Report 02: USDT suspended additional issuance and the usage scenario of USDT in Tron is single

MYKEY Crypto Stablecoin Report 03: Where are the users of DAI?

Crypto Stablecoin Report 04: Tether additional issued 300 million USDT, commenting on various decentralized stablecoins

Crypto Stablecoin Report 05: DAI Maintains Steady Growth, Exploring Use of DAI by Users of Centralized Exchanges

Crypto Stablecoin Report 06: The latest 13 additional issuances of USDT all occurred on Tron, driving the increase use of Tron

Crypto Stablecoin Report 07: Security Analysis of Stablecoins

Crypto Stablecoin Report 08: Interpretation of Digital Dollar Project

Crypto Stablecoin Report 09: Analyze the lending leverage of Compound

Crypto Stablecoin Report 10: Introduce the Algorithmic Stablecoin Project Terra (Luna)

Crypto Stablecoin Report 11: The circulation of stablecoins has overall increased, Holding AMPL a month for 51 times incomes

Crypto Stablecoin Report 12: USDT is additionally issued 690 million The use of stablecoins outside the cryptocurrency market

Crypto Stablecoin Report 13: The market capitalization of stablecoins reached $14.387 billion, Stablecoin pool Reserve

Crypto Stablecoin Report 14: The increase of Ethereum Gas Fee makes the transfers of stablecoin transactions on the blockchain

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

Crypto Stablecoin Report 16: The connection between stablecoins and real assets was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

What Is Multi-factor Authentication (MFA)?


Forgerock Blog

Autonomous Identity: Your Prescription for Reducing Risk in Healthcare

Earlier this year, ForgeRock published our 2020 Consumer Identity Breach Report detailing insights and data on breaches impacting consumers in 2019 and Q1 2020. As highlighted in the report, healthcare was, once again, the most frequently targeted industry (43% of all breaches), personally identifiable information (PII) accounted for the most sought-after data type at 98%, and unauthorized access was by far the most common attack vector, responsible for 40% of breaches. 

As overburdened healthcare IT professionals work tirelessly to meet the demands of the COVID-19 pandemic, cybercriminals are using the global health crisis to take advantage of institutions by exploiting unauthorized access. 

This reality means the healthcare industry must protect against two threats at once. Neither is simple. IT professionals need to ask themselves whether they can identify high risk anomalous access. And just as importantly, explore how AI can help by automating access requests, performing certifications, and predicting what access should be provisioned to users.  

Identity Governance and Administration (IGA) solutions fall short in their ability to address the healthcare industry's risk landscape and cannot meet its demanding and ever-changing requirements. Why? Simply put, IGA solutions don't provide organization-wide visibility or identity context, especially as identities continue to multiply across applications and locations (on-premises, cloud, etc.). This leaves your risk and security teams struggling to keep up as they manually provision access privileges and rubber-stamp access requests and certifications. Additionally, the resulting operational inefficiencies can leave your teams blind as to who has access to what and, more importantly, why they have access in the first place. So, what's the cure? 

ForgeRock Autonomous Identity is an AI-driven identity analytics solution that can be layered on top of, and integrated with, your existing IGA solutions to provide real-time and continuous organization-wide user access visibility, control, and remediation. Autonomous Identity analyzes all identity data to give you a deeper understanding into the risk associated with user access across the entire organization. The solution ingests vast amounts of workforce, partner, and consumer (patients/members) identity data from existing identity management and governance solutions, identity stores, and user activity repositories to provide wider and deeper insight into the risks associated with user access.

For example, one of the largest healthcare retailers in the United States used ForgeRock Autonomous Identity to bring visibility and contextual insight to their employee records, applications, entitlements, and entitlement assignments. The result was 550,000 entitlement assignments identified for AI-driven automation and clean-up; an accomplishment that would have taken a lot of resources and months, if not years, for IT teams to do manually.

As the customer story above exemplifies, Autonomous Identity enables your risk and security teams to accomplish the seemingly impossible — reducing risk, manual processes, and costs with one solution across your disparate identity enterprise.

To learn more about ForgeRock Autonomous Identity, read Maximize the Value of Your Healthcare Identity Solutions with AI-Driven Identity Analytics or contact us today.

Wednesday, 02. September 2020

KuppingerCole

Remote Workforce: How to Protect Yourself From Emerging Threats?

The outbreak of the COVID-19 pandemic has served as a catalyst for digitization in many companies and led to an increase in remote work and adoption of bring your own device (BYOD) policies. Every device and digital service that employees use is a potential gateway into company networks and thus poses a security risk. The risks are magnified even more when privileged accounts enter the equation, because they enable access to critical data. Given the immediacy and speed with which companies had to shift from office to remote work, security concerns were often neglected, leading to an increase in successful cyberattacks.

Facing this situation of increased risks and attack vectors, companies need to have foolproof security in place to keep their data safe and secure. IT teams should start with the very basics, such as giving company staff cybersecurity awareness training and monitoring remote access. While it has become a well-known fact that Privileged Access Management (PAM) is an integral part of cybersecurity in modern companies, Secure Compliance Management (SCM) and User Behaviour Analytics (UBA) have perhaps not yet been given proper attention in the security world.




IAM Essentials: Virtual Directory Services




Dec 01, 2020: Zero Trust for the Workforce

While the concept of zero-trust networking is nearly a decade old, the last few years have seen its popularity in industry discussions grow exponentially.

SELFKEY

DeFi: Remodeling the Financial Industry

DeFi has established its undeniable presence in today’s financial ecosystem. In this article, we discuss how DeFi can remodel and innovate the financial industry for the better.

The post DeFi: Remodeling the Financial Industry appeared first on SelfKey.


PingTalk

Why DevOps Matters to Identity Teams

As organizations have accelerated their investments in digital initiatives, business and app teams have embraced the DevOps model, a process that speeds up software delivery. DevOps enables businesses to be agile enough to quickly adapt or be the first to market, which can have a tremendous impact on the bottom line.

It’s time for identity to join the DevOps party. For far too long, identity initiatives have been stretched to their limits while struggling to keep pace with application growth. DevOps is starting to gain traction in the identity space—and for good reason, since identity is a key part of application onboarding. Without collaboration between identity teams and DevOps teams, many of the intended benefits of DevOps can quickly unravel.

What Is DevOps?

The DevOps methodology for software development breaks down traditional silos by eliminating manual tasks and replacing them with automation via code so that releases and updates can be pushed out faster and more frequently. At its core, DevOps improves the efficiencies between app development and IT operations or infrastructure teams. Previously, developers would make manual requests to identity teams or IT ops and lose valuable time. In DevOps, the identity teams enable the software to be consumable via APIs or infrastructure-as-code, the preferred methods for developers.

In addition, the DevOps model has specific tools for implementation. The most common include Docker, which packages software into images or "containers," and Kubernetes, the orchestration engine that manages containers. These tools allow developers to easily spin up infrastructure and perform updates with minimal interruption.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

10x Your Development with the Azure CLI

Back in the days of DOS, software developers couldn’t count much on fancy tools. There were no graphical interfaces, and everything was purely text-based. I remember using brief as an editor for my C source files (C++ didn’t exist yet), and compiling the code from the command line with the Aztec C compiler. The most advanced concept of a non-trivial software project was based on makefiles. The idea of grabbing a mouse and moving it around the desk to operate a computer would have been hilarious (or disgusting maybe!). No clicks or taps, only keystrokes - and a lot of them.

With the appearance of the graphical OSs, and point-and-click devices, we grew bored of hitting keys; it annoys us when there is no reason to click or tap, and we are forced to go back to the keyboard. The annoyance escalates sometimes to anger if there is not a shortcut or a copy-and-paste solution and we have to write down the full command, character-by-character!

While graphical interfaces are very convenient and user-friendly, they are not ideal with repetitive sequences of operations. Modern DevOps aims to make faster infrastructure configuration and application deployment. The availability of a tooling system that allows scripting these operations on automated and streamlined workflows is paramount to achieve better performances.

Command-Line? Discover Why

With .NET Core and Azure, Microsoft is aiming to gain a bigger share of the software development industry. The ban on non-Windows operating systems that the Redmond giant enforced in the past has given way to a new era in which different mainstream operating systems and devices (like Linux, macOS, iOS, and Android) are embraced as allies instead of being fought as competitors in the software business.

In addition to the graphical-first approach that characterized .NET Framework, .NET Core and Azure offer complete command-line-oriented tooling, including a built-in Linux integration (WSL, Windows Subsystem for Linux).

In this tutorial, I am going to demonstrate how you can create an application that includes Okta security services and deploy it to the cloud, using the .NET Core and Azure command-line interface tools, respectively known as dotnet and az.

Requirements

The resources used for this post are:

A computer with a .NET Core compatible Operating System (I used Windows 10)
A modern internet browser (I used Google Chrome and Microsoft Edge)
An Azure account (Free Tier is ok)
An Okta Development account
Your favorite text editor (I used Visual Studio Code)
ASP.NET Core SDK (I am using version 3.1.401)
The Azure Command Line Interface (Azure CLI) tool

Prep an ASP.NET Core Application for the Azure CLI

In this section, you will create a new ASP.NET Core application using dotnet and add the Okta security layer with a text editor.

Create a basic ASP.NET Core MVC application using the dotnet CLI and the mvc template, and check that it works as expected.

dotnet new mvc -n okta-cli-app

Then run the application.

dotnet run

The output should look like the following:

The application is now served at the two addresses indicated in the log messages; open your browser and navigate to http://localhost:5000. You should get the following page:

Note that the browser has been redirected to the URL https://localhost:5001. This is because the standard template includes by default a redirection to the secure protocol (app.UseHttpsRedirection() is called by Configure() in Startup.cs)

You are now ready to add the authentication and authorization features: Okta, of course! Stop the running app by pressing CTRL+C in the terminal.

To make the necessary code changes, a simple text editor would be enough. In this case, I am using Visual Studio Code.

Add the Okta.AspNetCore package to the project.

dotnet add package Okta.AspNetCore

This modifies the project file okta-cli-app.csproj, adding an entry for the dependency.

<ItemGroup>
  <PackageReference Include="Okta.AspNetCore" Version="3.3.0" />
</ItemGroup>

Modify Startup.cs as follows

using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Okta.AspNetCore;
using System.Collections.Generic;

namespace okta_cli_app
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        public IConfiguration Configuration { get; }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddAuthentication(options =>
            {
                options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
                options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
            })
            .AddCookie()
            .AddOktaMvc(new OktaMvcOptions
            {
                // Replace these values with your Okta configuration
                OktaDomain = Configuration.GetValue<string>("Okta:OktaDomain"),
                ClientId = Configuration.GetValue<string>("Okta:ClientId"),
                ClientSecret = Configuration.GetValue<string>("Okta:ClientSecret"),
                Scope = new List<string> { "openid", "profile", "email" },
            });

            services.AddControllersWithViews();
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            else
            {
                app.UseExceptionHandler("/Home/Error");
                app.UseHsts();
            }

            app.UseHttpsRedirection();
            app.UseStaticFiles();

            app.UseRouting();

            app.UseAuthentication();
            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllerRoute(
                    name: "default",
                    pattern: "{controller=Home}/{action=Index}/{id?}");
            });
        }
    }
}

Now the application is fully equipped with the Okta middleware to manage Authentication and Authorization. However, there isn’t any part of the application using it, yet.

Whenever you put an Authorize attribute declaration on a controller or action, the ASP.NET Core runtime verifies that a user is logged in and has been granted the necessary authority. If not, you are redirected to the Okta login page.

In this case, we are going to implement an explicit Sign In/Sign Out feature, with two links in the top bar of the front-end User Interface.

Add AccountController.cs to the project, under the Controllers folder.

using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Mvc;
using Okta.AspNetCore;

namespace okta_cli_app.Controllers
{
    public class AccountController : Controller
    {
        public IActionResult SignIn()
        {
            if (!HttpContext.User.Identity.IsAuthenticated)
            {
                return Challenge(OktaDefaults.MvcAuthenticationScheme);
            }

            return RedirectToAction("Index", "Home");
        }

        [HttpPost]
        public IActionResult SignOut()
        {
            return new SignOutResult(
                new[]
                {
                    OktaDefaults.MvcAuthenticationScheme,
                    CookieAuthenticationDefaults.AuthenticationScheme,
                },
                new AuthenticationProperties { RedirectUri = "/Home/" });
        }
    }
}

As you can see, the new controller provides the two actions, one for Signing In and one for Signing Out.

Finally, add two links to the top bar of the UI, with the following Razor code in _Layout.cshtml.

<div class="navbar-collapse collapse d-sm-inline-flex flex-sm-row-reverse">
    @if (User.Identity.IsAuthenticated)
    {
        <ul class="nav navbar-nav navbar-right">
            <li><p class="navbar-text">Hello, @User.Identity.Name</p></li>
            <li><a class="nav-link" asp-controller="Home" asp-action="Profile" id="profile-button">Profile</a></li>
            <li>
                <form class="form-inline" asp-controller="Account" asp-action="SignOut" method="post">
                    <button type="submit" class="nav-link btn btn-link text-dark" id="logout-button">Sign Out</button>
                </form>
            </li>
        </ul>
    }
    else
    {
        <ul class="nav navbar-nav navbar-right">
            <li><a asp-controller="Account" asp-action="SignIn" id="login-button">Sign In</a></li>
        </ul>
    }
    <ul class="navbar-nav flex-grow-1">

Set Up Your Okta Application for Azure

The application is now ready to provide its services, enriched with a state-of-the-art security framework, in conjunction with a first-class cloud-based authentication/authorization provider like Okta. You have probably realized that something is missing: I have done nothing (yet) to inform the provider that my brand new application, okta-cli-app, is going to ask it to manage authentication and authorization matters on its behalf. To fix this, I need to set up a proper configuration in Okta and bind my ASP.NET Core project to it.

If you haven't got an Okta developer account yet, please sign up and grab one; it's quick and free. You will need one to perform the next step.

Login to your Okta developer account and create a new application.

This is how it looks after I have filled all the necessary fields:

Note that the TCP port 5001 must be the same used by the application. You can see it in the messages displayed in the terminal when you start the application with dotnet run.

At the bottom of the page, you see two fields containing cryptic strings, named Client ID and Client secret.

A third fundamental value you need to complete the configuration process is the Okta domain URL, which has been created for you when you signed up for your new account. You can find it on the top-right of the dashboard page.

Insert these three values in the appsettings.json file in the ASP.NET Core project:

{ "Logging": { "LogLevel": { "Default": "Information", "Microsoft": "Warning", "Microsoft.Hosting.Lifetime": "Information" } }, "AllowedHosts": "*", "Okta": { "OktaDomain": "https://dev-509249.okta.com", "ClientId": "0oaq5vf8fzPfAljzy4x6", "ClientSecret": "HoGyxgDTrMfNqDp4RO234CEQxDAwuXnZm2FycON-" } }

Go back to the terminal and start it again.

dotnet run

Point your browser to http://localhost:5000 once again and you should see the same page as before but with a Sign In link on the left of the top bar.

You can now click on Sign In and get redirected to the Okta login page. Once logged in, you’ll see the top bar changing.

Even though this post is about using command-line tooling rather than IDE plugins or the portal user experience, I used the web portal when creating the Okta application. It's worth mentioning here that Okta services are offered through the portal, a Web API interface, and an SDK (Okta.Sdk), and that the new Okta CLI tool has just been released! This allows DevOps engineers to fully automate Okta account and application setting management and integrate Okta configurations into CI/CD pipelines.

The Azure CLI Cloud Deployment Process

Microsoft Azure allows us to set up and configure everything right from the Azure Portal. While this is very convenient when learning, it's not ideal when the same sequences of operations need to be repeated. Modern DevOps aims for faster infrastructure configuration and application deployment, and a tooling system that lets you script these operations in automated, streamlined workflows is necessary to achieve better performance. I am now going to demonstrate how you can deploy the application we just created to Azure, using the Azure Command Line Interface (Azure CLI, a.k.a. az).

az login

This command opens a browser page where you are requested to login to your Microsoft account. This means, of course, that to proceed with the following steps you need to have an Azure account and a valid (even free-tier) subscription. Once logged in (you only need to do this once), you can begin creating resources.

Create a Resource Group:

az group create -n okta-cli-rg -l westus

(The -l flag picks the Azure region; az group create requires a location.)

Create an Application Service Plan:

az appservice plan create -n okta-cli-plan --resource-group okta-cli-rg --sku FREE

Create the Application Service Instance:

az webapp create -g okta-cli-rg -p okta-cli-plan -n okta-cli-app

Your application is now deployed to the cloud and available to the world at the address https://okta-cli-app.azurewebsites.net/. But before trying it, don't forget that you need to align the URLs in your Okta application with the new address. On the Okta application page, simply replace the old http://localhost:5000 with the new https://okta-cli-app.azurewebsites.net/ in all the fields that contain it (there should be three).

You can now open the browser and surf to https://okta-cli-app.azurewebsites.net/. Or, if you prefer, automate this step as well with another console command.

az webapp browse -n okta-cli-app

That’s all there is to it!

Recap

In this article, I explored the basics of two command-line interface utilities (the ASP.NET Core CLI and the Azure CLI) and showed how to use them to create, build, and deploy a web application to the cloud without using an IDE or a web portal.

What you learned:

Create an ASP.NET Core web application with dotnet new
Add the Okta ASP.NET Core package to the application with dotnet add
Implement a basic Sign In/Sign Out workflow using Okta
Build and run the application with dotnet run
Log in the local Azure CLI to your Azure account with az login
Create an Azure resource group with az group create
Create an Azure application service plan with az appservice plan create
Create an Azure application service instance with az webapp create
Launch the browser and point it to an Azure application using az webapp browse

Learn More About Okta, .NET, and Azure

If you are interested in learning more about security and the world of Azure and .NET, check out these other blog posts!

Secure Your ASP.NET Core App with OAuth 2.0
Build Single Sign-on for Your ASP.NET MVC App
Policy-Based Authorization in ASP.NET Core
Store ASP.NET Secrets Securely with Azure KeyVault
User Authorization in ASP.NET Core with Okta
Baking in Security with .NET CLI Templates

Make sure to follow us on Twitter and subscribe to our YouTube Channel so that you never miss any awesome content!

Tuesday, 01. September 2020

KuppingerCole

Designing and Establishing a Mature PAM Ecosystem for Reducing Risk in Your Organisation

What makes a PAM strategy different from enterprise password management or Identity Access Management? What are the first actions you should take to protect your privileged accounts in the shortest amount of time? And, how has the definition of “PAM Basics” changed as the industry and cyber risks have evolved? This webinar is a must for teams launching PAM initiatives to ensure they start on the right foot. As you progress on your PAM journey, there’s always something new to learn. If you’ve already begun your PAM rollout, this event is a great chance to confirm you’re setting the appropriate milestones and see how others demonstrate success.  




Global ID

The GiD Report #125 —Everyone v. Apple, Discord’s Groups vision

The GiD Report #125 — Everyone v. Apple, Discord’s Groups vision

If you haven’t had a chance to check it out ye, check out our first employee profile interview featuring Erik! This was a fun one:

Meet the Team — Erik Westra, head of GlobaliD Labs

What we have for you this week:

- Quick updates on the war against Apple's App Store
- Facebook's multi-pronged fight with Apple
- Discord's long term vision (it sounds a little like Groups)
- The shifting challenges of platform moderation
- Symbolic: ExxonMobil booted from Dow
- Stuff happens

1. We've been covering the battle between Tim Sweeney's Epic Games and Apple's App Store, so here are a couple of quick updates for you.

First, the initial judge ruling, via The Information:

A federal judge denied a request by Epic Games for a temporary restraining order that would have forced Apple to allow Fortnite, a popular Epic videogame, back into the App Store. But that same judge granted another request by Epic to stop Apple from kicking it out of a developer program.

….

In a court filing, Judge Yvonne Gonzalez Rogers said that Epic had not shown that it was at risk of irreparable harm from Apple, which removed Fortnite from the App Store last week after Epic began violating its policies. Epic modified Fortnite to bypass an Apple payment mechanism, which in turn enabled Epic to avoid paying Apple the standard 30% commission it collects on in-app transactions.
Photo: Mike Deerkoski
But the judge ruled that Epic was more persuasive about the harm that could have resulted from another matter — a threat by Apple to deny Epic access to developer tools for its iOS and Mac operating systems. Epic depends on those tools for the development of Unreal Engine, an Epic product used by many outside companies to build games.
The judge said that Apple had "chosen to act severely" and that denying Epic access to the developer tools could have hurt games made by independent companies. The judge ordered Epic to file a motion for a longer-term injunction by Sept. 4, and a hearing on the matter is scheduled for Sept. 28.

Apple has also made peace with WordPress creator Automattic:

Apple says it has come to an agreement with Automattic, which operates the WordPress content management system application, after Automattic CEO Matt Mullenweg announced on Friday that Apple had blocked updates for the iOS version of WordPress until he agreed to add in-app purchases.

These are just the first couple of skirmishes, but the war is far from over.

In any case, Ben Thompson thinks Apple’s App Store policy is bad for innovation, bad for customers, and ultimately bad for Apple.

Related:

- From Kingmaker to 'Autocrat': How Game Makers Went to War With Apple
- Fortnite is splitting into two different games because of Epic and Apple's fight
- 'Fortnite' Maker's Apple Fight Leaves Some Developers Wary
- Ben Thompson — Rethinking the App Store
- Linus weighs in on Apple vs Epic Games

2. In part because Facebook has joined the chat. That's because Apple's policies are a threat to Mark Zuckerberg's long term vision of making Facebook more of a super app in the vein of WeChat.

He made that pretty clear in a company wide town hall last week, reported by Buzzfeed:

Facebook CEO Mark Zuckerberg took a swing at Apple on Thursday, calling the iPhone maker’s app store monopolistic and harmful to customers during a companywide meeting.
“[Apple has] this unique stranglehold as a gatekeeper on what gets on phones,” Zuckerberg said to more than 50,000 employees via webcast. He added that the Cupertino, California–based company’s app store “blocks innovation, blocks competition” and “allows Apple to charge monopoly rents.”

Mark has a point and it’s one reason why China has superapps while the rest of us don’t. Apple has set it up so that the iOS itself is the superapp — further entrenched by its deep integration of features such as payments and now identity.

It’s not Facebook’s only tiff w/Apple, BTW:

A new fight between Facebook and Apple over the mechanics of ad tech is surfacing an industry divide over user privacy and spotlighting longstanding dilemmas about the tracking and use of personal information online, Axios’ Kyle Daly reports.
Why it matters: Privacy advocates have been sounding alarms for years about tech firms’ expansive, sometimes inescapable data harvesting without making much headway in the U.S. But the game could change if major industry players start taking opposite sides.
What’s happening: Facebook warned advertisers Wednesday that a coming change to Apple’s iOS could devastate revenue for ads that sends users straight to the App Store to install an app — an approach that’s used widely by developers including mobile game makers.

Like I said, a lot is at stake and both sides are in it for the long haul.

Related:

- Briefing: Facebook Says Apple Changes Could Kill Part of its Ads Business
- Facebook warns advertisers on Apple privacy changes
- Mark Zuckerberg Said Apple Has A "Stranglehold" On Your iPhone

Because the battle between Facebook and Apple is just starting to heat up.

Axios:

Facebook and Apple are fighting an increasingly high stakes battle over user privacy and access to the iOS App Store, deepening a rift between two of the most powerful companies in Silicon Valley, Axios’ Scott Rosenberg and I report.
Why it matters: By trading accusations, Facebook and Apple could just be handing more ammo to critics and regulators — but at the same time, conflict between these giants could be read as a sign of competitive life and a rebuttal to antitrust charges.

A sign of the times:

Our thought bubble: Historically, the big tech companies have maintained power by dominating a key market while competing with one another at the edges. The Apple-Facebook fight shows that they’re now willing to take swings at each other’s core businesses.
3. The thing is, Apple’s approach also bucks the macro trend. More apps want to do more.

Consider Discord, via Axios:

Discord began in 2015 as a way for gamers to talk to one another before, during, and after play. Now, the chat company is pursuing a far broader vision: to be the Slack for your non-work life.
Why it matters: In the age of COVID-19, more than ever before, people need the online equivalent of social spaces like bars, restaurants and stages.
How it works:
Discord allows people to create their own online community space, to set and enforce rules, and to decide whether to remain invite-only or open it to the public. Users can share messages in various channels, chat privately and have group discussions. More recently, the company has added group video chat.
Discord calls each community's space a "server," but it's not a server in the sense of a separate computer controlled by the user. Users can run servers without needing system-administrator knowhow.
This arrangement has pros and cons. It means Discord controls the data and is responsible for complying with law enforcement. But it also means the service can enforce its own code of conduct, handling trust, safety and security.

One way to think about it: all of these products are really just different combinations of a few core building blocks: identity, messaging, and payments (and groups!).

- LINE Launches Digital Asset Wallet and Blockchain Development Platform — CoinDesk
- Apple confirms acquisition of VR startup Spaces

4. The thing about bringing people together on a platform (through identity, messaging, and payments) is that humans have a tendency to do really amazing things but also some really nasty things. And one of the big takeaways from this latest tech cycle is how we optimize the former and mitigate the latter.

A recent battleground has been Section 230 of the Communications Decency Act, which limits platform liability from user behavior.

But the days of that laissez-faire approach could be coming to an end.

AT&T is firing the latest salvo in that movement:

AT&T will tell the Federal Communications Commission that the agency should craft rules to shrink tech’s longstanding legal shield, Axios’ Ashley Gold reports.
Why it matters: AT&T is a telecom giant and, since buying Time Warner, a major force in media and entertainment — both industries that have butted heads with Silicon Valley. The company is now launching this fresh attack as tech is under attack in Washington.
Driving the news: On Monday, AT&T is previewing comments it will make to the FCC Wednesday on reinterpreting Section 230 of the Communications Decency Act, which shields tech companies from liability over content their users post.
The company will ask the FCC to modify Section 230 so that tech platforms don’t get such broad immunity.
It will also argue that Facebook, Amazon, Apple, Microsoft and Google should be forced to be more transparent about the decisions they “make on a daily basis,” like how they rank search results and feature news stories.

Plus the latest from FB:

After initially taking no action on militia pages organizing an armed counter-protest in Kenosha, Wisconsin, Facebook said Wednesday that the pages violate a just-enacted policy that imposes stricter limits on QAnon, militia and other extremist groups.
Why it matters: Facebook’s handling of the issue raises fresh questions about its ability and willingness to enforce policies in time to prevent violence rather than after the fact.

Relevant:

- How the Battle for Thailand Is Being Fought on Twitter — CoinDesk
- Youtube on platform moderation — Responsible policy enforcement during Covid-19
- Via /f — Post No Evil | Radiolab | WNYC Studios
- Via /greg — WSJ News Exclusive | Gun Sellers Use New Tactic to Deal on Facebook Marketplace

5. This one's more symbolic than anything. The Dow Jones Industrial Average just booted ExxonMobil, its oldest member, in favor of Salesforce.

Felix Salmon:

The Dow Jones Industrial Average announced a major shakeup on Monday after the market closed — it booted Pfizer, Raytheon Technologies and ExxonMobil, the oldest member of the index, having joined in 1928. (GE, the last original member of the Dow, was removed two years ago.)
Between the lines: Despite recently becoming a net oil exporter, U.S. oil companies’ stocks have fared poorly and indexes excluding energy and oil names have tended to outperform those that do include them.
Apple overtook Exxon in 2012, and today has a market cap over $2 trillion while Exxon’s market cap has sunk to $175 billion.

Related:

- Fed lays out historic shift to inflation strategy
- Powell set to deliver 'profoundly consequential' speech, changing how the Fed views inflation
- Canadian Software Startup Puts 40% of Cash Reserves Into Bitcoin — CoinDesk
- Via /pstav — The Case for $500K Bitcoin

6. Stuff happens

- Via /pete — Why You Should Stop Sending Texts From Your Android Messages App
- Chainalysis Report Shows Healthy Crypto Usage in Venezuela — CoinDesk
- Interoperability Series: Sovrin Stewards Achieve Breakthrough in Wallet Portability — Sovrin
- Via /vs — Abbott's Fast, $5, 15-Minute, Easy-to-Use COVID-19 Antigen Test Receives FDA Emergency Use Authorization; Mobile App Displays Test Results to Help Our Return to Daily Life; Ramping Production to 50 Million Tests a Month
- Pandemic drives surge in Open Banking-based payment initiation
- SoftBank Eyes Joining Deal for TikTok
- Huawei Builds Blockchain Platform to Help Beijing Government Manage People's Data — CoinDesk
- Via /pstav — Netflix's $10 Million Deposit
- Via /laura — Particl.io • Privacy-focused Decentralized Applications

The GiD Report #125 — Everyone v. Apple, Discord's Groups vision was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Oct 14, 2020: Policy-Based Access Control – Consistent Across the Enterprise

The evolution of cybersecurity protection demands a more nuanced response to providing access to a company's sensitive resources. Policy-based access control (PBAC) combines identity attributes and context variables to enable sophisticated granting of access to corporate systems and protected resources based on centrally managed policies that ensure consistent access control decisions across the enterprise. Advances in both business requirements and technology (such as the growing use of micro-services) require a better way to control access: one that is consistent across all silos, dynamic enough to react to changes in risk, and that provides better control for application business owners.

Trinsic (was streetcred)

Simplifying SSI-Based Solutions to Focus on Adoption


When the outbreak of COVID-19 started, Michael Corning, the Cryptocosm Architect at Secours.io, wanted to build a self-sovereign identity (SSI) based app that would help reopen his local economy in Sisters, Oregon without reopening the pandemic. As we all know, how easy it is to adopt a new technology or software plays a key role in its ultimate success.  

 

The first version of his app proved to be too complicated and idealistic for businesses and organizations to adopt. He quickly reevaluated his strategy and built a more simplified app, called Safe in Sisters, which was built with adoption in mind. The Q&A below is our interview with him to learn more about the changes he made to create an SSI-based app that was “adoptable”. 

What is the Safe in Sisters project? What is its purpose?

After the COVID-19 pandemic hit the state of Oregon and we shuttered shops and public places, here in my little piece of heaven—the city of Sisters—I went to some of my friends at Economic Development for Central Oregon and told them how I wanted to reopen the Sisters economy without reopening the pandemic. They both loved the idea. Two of my physicians heard me explain how I was planning to accomplish this mission (I wanted to know how medical providers would react). One doctor was enthusiastic, and the other had an arguable HIPAA-allergic reaction.  

 

Armed with this mix of reactions, I set out to build a set of tools that would provide users with local contact tracing, a COVID symptoms scoring sheet (based on data published by Oregon Health Authority), and verifiable presentations. The verifiable presentations would do something as simple as check a zip code (Sisters had zero COVID cases at the time, but counties nearby glowed purple with the afflicted) or as complicated as checking for a positive COVID test result and two subsequent negative results within a two-week window. 

 

Early on, I realized that writing the code was the easy part. Socializing the protocol and getting people to understand, value, and use the technology would be the hard part. 

 

Turns out, I was correct. 

 

Working with my Enduring Net COVID comrades across the pond, we took the prototypes to a large care center in Manchester, England. We showed them all the fancy stuff, and their first response was, “We have six entrances to our facility and handle over 600 people per day. We can’t spend time exchanging QR codes with visitors.” 

 

I learned two valuable lessons (actually, I knew the lessons but had never seen their consequences in action quite so starkly):  

My passion for complex solutions usually does not meet with equal user enthusiasm.  

The second lesson was best taught by Professor Edwin Jaynes in his Probability Theory: The Logic of Science. He notes, "The simplest solutions aren't best because they're simple. They're best because simple solutions are more likely to succeed." 

 

So, I set aside the techno bling, and I started over. I was bound and determined to provide the world with a solution simple enough to succeed.  

What makes this project unique from some of the other SSI & verifiable credentials projects that are designed to combat the threat of COVID-19?

The first difference is that we focus on local ground conditions.  People in Sisters don’t care about COVID prevalence in Manchester. They care about the prevalence at the Sisters High School or the Five Pine Lodge. They don’t care about the prevalence at Heathland Care Center in Manchester, but they care about The Lodge in Sisters care center across the street from the Fika Coffeehouse.  

 

A second difference follows from the first: Local contact tracing is just the first line of defense. If a COVID carrier gets into a room and exposes other healthy people, at least those now at risk know about their invisible condition in time to do something to stop further spread. Once a room has hosted a carrier, the room needs to bring out the big guns. The room and its visitors need other tools to assess the risk of exposure. This is where symptom tracking and verifiable credentials come into play.  

 

The third difference is that we focus on measuring risk.  This may turn out to be the most important difference. We understand that a measurement is a set of observations that reduce uncertainty. We don’t guarantee safety; we don’t reduce risk by 100%. But we do use all available data as a set of observations that reduce the uncertainty that plagues us about the virus. Our tools provide users with consistent metrics they can use to make informed decisions about who is safe enough to interact with. No more controversy. No more excuses. No more virus. 

 

Our job as technologists is not to show how cool our technology is. It’s to save lives. It’s to stop the virus in its tracks. My job is to make Sisters the COVID Hotel California—the virus can check-in, but it will never check-out.

Why have you focused on a local contact tracing solution? What is the difference between contact tracing and local contact tracing?

All politics are local. All battles with the virus are local. Most of those affected by the virus are local. Local contact tracing is far simpler and more reliable than conventional contact tracing. We can control the virus in our town without help from the government or from Apple and Google. We don’t need GPS. We don’t need Bluetooth. But we do need common sense, privacy, and secure messaging. 

 

Since local contact tracing is local and not centralized at scale, and since the code and infrastructure is neither complex nor expensive, we expect rapid replication across the planet, not of the virus, but of the tools to kill it. 

Can you go into detail of how the Safe in Sisters app works?

Since local contact tracing is a free service, I moved development to my nonprofit counterpart of Secours.io, the Soteria Institute. The Soteria Local Contact Tracing tool is a Progressive Web App (built with VueJS on the front end and socket.io on the backend). Once you’re on https://soterialct.z22.web.core.windows.net/, you can add the app to your phone’s home screen and it looks and acts like a native mobile app.  

 

The app has two views: Room and Visitor. 

 

People responsible for a public space manage a Room. Visitors choose that room and check-in when they enter and check-out when they leave. Rooms and Visitors store each of these decisions (to enter a Room or to be allowed to enter a Room) in local storage (IndexedDB) on their phone. Visitors send check-in socket messages to the server, and the server forwards those messages to the Room.  

 

If a person sends a COVID exposure alert to all the Rooms stored on their phone, the Rooms can pick up the message from the server, list all the Visitors who occupied the room at the same time in the past two weeks, and can send alerts to the server that forwards them on to each of the other Visitors. 
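To make that lookup concrete, here is a minimal Kotlin sketch of the overlap logic (illustrative only; the actual app is a JavaScript PWA using socket.io, and the names below are hypothetical):

import java.time.Duration
import java.time.Instant

// One check-in/check-out record stored by a Room
data class Visit(val visitor: String, val checkIn: Instant, val checkOut: Instant)

// Visitors whose stay overlapped one of the carrier's visits in the past two weeks
fun visitorsToAlert(log: List<Visit>, carrier: String, now: Instant): Set<String> {
    val cutoff = now.minus(Duration.ofDays(14))
    val carrierVisits = log.filter { it.visitor == carrier && it.checkOut.isAfter(cutoff) }
    return log
        .filter { it.visitor != carrier }
        .filter { v -> carrierVisits.any { c -> v.checkIn < c.checkOut && c.checkIn < v.checkOut } }
        .map { it.visitor }
        .toSet()
}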

 

In the simplest scenario, Rooms don’t have to do anything (the alert protocol can be fully automated), and all a Visitor has to do is click a button on the app to log Rooms and another button if they go into quarantine. No QR codes. No room staff overhead. No cost of service. 

 

If that’s still too complicated for people, I’m going into another line of work. 

 

Below is a short demo video that shows how the process works: 

How has this project evolved since you first started? What are the obstacles you have run into? 

To summarize the evolution of my work: 

- The Secours app started as a full-blown application based on Trinsic.
- I separated out the simplest capabilities of that original design to focus on local contact tracing using Aries messaging.
- I stopped using Aries messaging and refactored LCT with Socket.io (this is the base Service Level Agreement, SLA-0), and I assigned that development to Soteria.
- I refactored the Trinsic capabilities into two additional Secours SLAs: SLA-1 adds personal credentialing and additional risk management tools; SLA-2 adds COVID test credentials and rich Verifiable Presentations.
- I plan to augment Socket.io with more powerful assets like Redis, so I can develop artificial intelligence assets that will transform my Soteria/Secours-based digital twins into intelligent and helpful digital partners.

How has Trinsic helped you along in the process?

Trinsic was mission-critical for three reasons. 

 

First, making something simple that’s complicated is difficult. Trinsic has met that challenge. Back in the Spring, I was able to get up and running in a few minutes. Trinsic assets are generally well-documented now, and development tools are even simpler than a few months ago. This meant I was able to build my verifiable credential exchange (VCX) prototype in record time. And while I was disappointed that the world wasn’t quite ready for all these new capabilities, it was a forcing function that changed my marketing strategy from push (see how cool this tech is, don’t you want some?) to pull (see how easy it is to do something you couldn’t do before?). 

 

And because I built so much in so little time, I was able to set these powerful Trinsic solutions aside and spend one more month getting a simple and stable web socket messaging platform ready for our first offering: local contact tracing. Having all the rest of the heavy iron waiting in the back room means I can respond immediately to early adopters who quickly need more power than messaging provides. At each stage of the VCX adoption process, we stay as simple as possible and no simpler. We give our users all the power they need now and no more (until they need it). 

 

Second, local contact tracing goes global only if thousands of people like me can get their hands on technology that is simple and easy to implement and extend. Trinsic was built for the lowest barrier to entry. All our Trinsic API code is in a public GitHub repo, and we will provide later local contact tracing adopters with training material that will ensure many other communities around the world can stay safe from the virus. 

 

Finally, I realized something else during this early development phase: As cool as connectionless credentials are, they are not the most important part of the Trinsic platform. Credentials keep your data safe. Verifiable presentations make your data valuable. Trinsic’s implementation of verification policies took all the mystery out of this critical step in VCX; so, again, our users can tell us what they need for verification, and we can respond on the spot. 

 

Next time you ask me this question, I may have a third reason to love Trinsic: I suspect verification policies will enable me to implement decentralized semantics with relative ease. 

What stage of development are you in? How big do you see your solution getting, and how do you see it scaling?

I am nearly ready to take the tool back to the UK and see if it is simple enough to be effective. If the app passes acceptance testing, I will recommend that a British colleague fork the GitHub repos (client and server), set up the necessary server assets (static Web App, Ubuntu VM), and take it from there. 

 

At the same time, I will make it available to my friends and family here in Sisters. And if that works, I will recruit others in Oregon to replicate our success in Sisters. 

 

Once my users build muscle memory on Service Level 0, I will continue field testing at SLA-1 and SLA-2. Someday, soon—I hope—I will have the resources to focus on SLA-3: a personal artificial intelligence. 

 

If I do my job and our local approach scales effortlessly, then I expect there will be 7 million localities each serving 1,000 users. 

Anything else you would like to add?

To my fellow Sisters residents: For our strategy to work, people must care about each other just enough to push two buttons on their phone, maybe three.  

 

For a few people in other cities and towns: You can do your part, too. Replicate Safe in Sisters where you live and work. Stop the virus in its tracks. 

 

Together, we can relegate COVID-19 to a display at the Natural History Museum. 

 

And stay safe out there… 

 

(end of interview) 

 

We, at Trinsic, are currently offering our platform free-of-charge to those working on SSI projects related to COVID-19. If you are interested in learning more about this offer, contact us or get started with a Trinsic Studio account for free today. 

The post Simplifying SSI-Based Solutions to Focus on Adoption appeared first on Trinsic.


IDnow

IDnow acquires Wirecard Communication Services


Munich / Leipzig – September 1st, 2020, IDnow, a leading provider of Identity Verification-as-a-Service solutions and Wirecard Communication Services GmbH announce the signing of an agreement for the acquisition of Wirecard Communication Services by IDnow. IDnow will retain the Leipzig location preserving the majority of the 150 employees.

With Wirecard Communication Services, IDnow will focus on providing identity services and continue to build additional capacity for the strong growth of IDnow identification processes. In light of organic growth in digital services as well as the added needs related to restrictions imposed by the COVID-19 pandemic, IDnow is experiencing a sharp increase in demand for its digital verification procedures. The acquisition is intended to enhance the service quality of IDnow products and thus further increase the responsiveness for customers and cut waiting times.

Wirecard Communication Services GmbH was established on April 29, 2003, and is part of the Wirecard Group. In a structured investor process, Wirecard Communication Services GmbH was offered for sale by way of an asset deal. IDnow and Wirecard Communication Services have already been working together successfully for more than five years.

At a staff meeting on Monday afternoon, the insolvency administrator, attorney Dr. Nils Freudenberg from Tiefenbacher Insolvency Administration, announced the takeover to the staff of Wirecard Communications Services. The first step will be to provide the employees with appropriate qualifications and to adapt the technical infrastructure. Following this realignment, the Leipzig location will be expanded in a targeted way and integrated into the IDnow processes in order to enable even faster implementation of projects as well as further technical development of the Ident services.

“With the integration of Wirecard Communication Services into the IDnow Group, we are seizing the opportunity to further improve our range and service quality for our customers. We have worked in close collaboration with this division of the company for several years and greatly appreciate the qualifications and experience of its employees. We are on a strong growth path and will maintain and further develop the Leipzig location,” says Andreas Bodczek, CEO of IDnow.

“We are delighted to have found in IDnow a buyer who appreciates and knows our company and has such a friendly culture. Personally, I am very pleased to stay on board and work in the team at IDnow,” says Amra Blume, Managing Director Wirecard Communications Services.

 

Press contact:

Christina Schwinning

press@idnow.io


Ontology

Ontology Monthly Report — August 2020

Ontology Monthly Report — August 2020

August has been yet another great month for Ontology. We are proud to announce the great strides we have made in cross-chain interaction through the launch of Poly Network, as well as in decentralized identity solutions with our partnership with NEAR Protocol, and in the field of DeFi dApps with the launch of the new Renaissance 2.0 dApp incentive plan. This month, we also launched our new website in 7 languages, bringing the Ontology message to new audiences around the world.

You can find more detailed updates below.

中文

繁體中文

한국어

日本語

русский

Tiếng Việt

Tagalog

বাংলা

slovenský

සිංහල

हिंदी

Español

MainNet Optimization

- Ontology v2.1.0 launched on MainNet

- Ontology GraphQL interface development is 20% completed

- Layer 2 v0.3 launched

ONTO

- ONTO v3.1.0 and v3.2.0 launched

- The credentials feature has been upgraded, and new credential templates and support for Plaid financial data verification have been added

- The ETH asset management experience has been improved, including the ability to set fast, normal, and slow gas settings for ETH transactions

- ONTO Financial Services issued its third set, this time of 1,000 ETH, available to ONTO users; it sold out within two hours.

- Daily active users of DeFi dApps through ONTO surged to a record high as nearly 1,000 users participated in the joint campaign by ONTO and MakerDAO

dApp

- 70 dApps launched in total on MainNet

- 6,051,367 dApp transactions completed in total on MainNet

Community Growth

- We onboarded 656 new members across Ontology’s global communities, witnessing pronounced growth in our Persian, Bengali and Tagalog communities.

Bounty Program

- We are seeking SDK developers from our community

- We are collecting suggestions for new bounties.

- 513 applications, 5 new additions to existing bounties

- 38 tasks, 50 teams in total: 31 teams have finished their tasks and have received rewards in return, while 19 teams are still working on tasks

Latest Release

- Ontology successfully completed the development of the next stage of its open source DID smart contract on the Ethereum network. The new smart contract method did:etho: can also be used across popular DeFi applications and across most chains that run the Ethereum Virtual Machine (EVM).

- Ontology, Neo, and Switcheo announced the joint launch of Poly Network, a heterogeneous interoperability protocol alliance. Poly Network will permit cross-chain interoperability, greatly increasing transparency and accessibility.

- Ontology’s website is now available in 7 languages: English, Chinese, Japanese, Korean, Russian, Spanish, and German.

- Ontology teamed up with NEAR Protocol to accelerate the development of decentralized digital identity solutions. Ontology has provided technical support for NEAR’s DID from a regulatory perspective to enable smart contract implementation and W3C registration.

- Ontology continued performance testing with a focus on improving CPU utilization of relay nodes at a very high TPS.

- Ontology upgraded its decentralized credit-score solution, OScore, which generates a credit score based on a user’s held assets to allow cross-wallet management of all the user’s digital assets.

- Ontology released the “Renaissance 2.0” DeFi dApp incentive plan with a three-month trial period, during which developers may get up to double the transaction fees returned.

- Ontology partnered with Waves to build a cross-chain communication infrastructure for DeFi. Empowered by Gravity, Waves' cross-chain oracle network, Ontology and Waves have joined forces to offer inter-chain DeFi solutions and dApps to build the next generation of reliable Web 3.0 applications.

Events

- Jun LI, Founder of Ontology, was invited to the public chain session of CoinW’s 3rd Anniversary Cloud Online Party on 10 August. He exchanged inspiring insights with other leading figures of public chains on topics ranging from public chains and DeFi to blockchain mass applications.

- On 11 August, on a panel themed “New Opportunities for Blockchain Developers” at the POWER 2020 Technology and Application Summit organized by Mars Blockchain, Jun LI said, “Blockchain is here to build links and connections, not to overturn the internet. Among these blockchain-enabled connections, the focus lies in the connection of assets, identity, and data.”

- Andy JI, Co-founder of Ontology, contributed to the approval of plenary decisions and two P3200-series standards (P3207 & P3210) in the IEEE BDL Standards Committee Plenary. During the plenary, Andy offered Ontology's insights and expertise in digital identity on multiple standard proposals.

- On August 5, Andy JI, Co-founder of Ontology, was invited to a panel discussion themed “How will public chains cope with new challenges in the DeFi ecosystem?” at the 2020 FINWISE Online Summit, a constructive discussion participated in by representatives from leading public chains and multiple industry experts.

- John Izaguirre, Ontology’s European Ecosystem Lead, was invited to a panel at Indonesia Blockchain Week 2020 (IBW2020) to exchange insights on DeFi projects. John said, “The DeFi projects available in the market have yet to demonstrate full reliability, and barriers remain between DeFi projects and traditional financial projects. Ontology’s DID solutions are specifically tailored to tackle these problems as users’ assets and their credit levels can be integrated into the OScore system, a new capability that will greatly benefit the industry as a whole.”

- On 13 August, Kendall MAO, Dean of the Ontology Research Institute, shared his thoughts on Ontology’s new staking model and distributed data infrastructure in an interview with WALI Finance. He emphasized that Ontology will continue to prioritize self-sovereign identity and data.

Recruitments

- Solution Architect

- Business Sales

- Global Development Manager

- Global Marketing Manager

- Social Media Associate

New Team Members

- 1 Management Trainee

- 1 Marketing Intern

Contact Us

- Contact us: contact@ont.io

- Apply now: careers@ont.io

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Monthly Report — August 2020 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyKey

Announcement: MYKEY Lab adds 1 billion KEY TOKEN to the “multiple chains exchange pool”

Due to the demand for KEY TOKEN on multiple chains, MYKEY Lab locked 1 billion KEY TOKEN on Ethereum into the exchange pool on September 1: 0xc4947bf8c74033c7079f6780460e72e82a8df33c

Exchange pool change record: Announcement: MYKEY Lab adds 1 billion KEY TOKEN to the "multiple chains exchange pool"

Otaka - Secure, scalable, and highly available authentication and user management for any app.

See you at Disclosure 2020!


I’m thrilled to share the virtual stage at Disclosure tomorrow! Disclosure is a security conference that’s shaping up to have a super amazing schedule! The speaker lineup is fantastic—with talks ranging from cyber warfare to disinformation to social engineering (and much more!).

My talk is called “How to Think About OAuth Security”. I’ll be focusing on what makes OAuth secure and some holes in implementations that have previously left some gaps for attackers. Here’s a taste of the kinds of things I’ll be talking about tomorrow!

Revealing mobile app API keys

It’s a well-understood part of modern app development that trying to hide API keys in an app is a futile effort. But just a few years ago, Twitter made headlines when their mobile app’s API keys were leaked! We’ll talk about what happened and how OAuth evolved to address the need for mobile apps to securely log users in.

Stop relying on passwords

There are much better solutions than passwords on the modern internet. How has OAuth been adapting to the changes? Spoiler alert: OAuth never encouraged apps to handle passwords in the first place, so we’ll take a look at how OAuth is changing to address this.

What’s coming in OAuth 2.1?

You may have heard some rumblings about OAuth 2.1, an update to the spec that I and a few others are currently working on at the OAuth Working Group in the IETF. We’ll look at the significant changes coming in that version, highlight what isn’t changing, and talk about why now is a good time to publish an update to the spec!

Hope to see you there! It's not too late to register! I'll be answering Q&A, and I'm always happy to chat about OAuth security on Twitter; you can find me at @aaronpk.

Monday, 31. August 2020

KuppingerCole

10 Use Cases for Universal Privilege Management


Even before COVID-19 entered our lexicon, privileged access management (PAM) was widely recognized as a foundational cybersecurity technology. In recent years, almost every cyberattack has involved compromised or misused privileges/privileged credentials. Most malware needs privileges to execute and install payload. Once a threat actor has infiltrated an IT network, privileges are typically needed to access resources or compromise additional identities. With privileged credentials and access obtained, a threat actor or piece of malware essentially becomes a malicious “insider”. Outside of PAM, there are few defenses against a rogue insider.

In 2020, the large-scale shift to working from home (WFH) imposed by the coronavirus has increased the urgency for maturation of PAM security capabilities. The subsequent increase in BYOD and shadow IT, compounded by significantly more reliance on insecure WiFi networks and remote access pathways, has exacerbated many of the thorniest privileged access challenges. It's also a proven recipe for breaches.

In the remainder of this blog, I will cover the 10 phases, or use cases, of PAM that comprise a complete, holistic privileged access management program. At BeyondTrust, we refer to this as the Universal Privilege Management model. This approach entails securing every privileged user (human or machine), session, and asset across your IT environment—leaving no privilege undiscovered, unmanaged, or unaudited. The 10 phases can be implemented via the three solutions that comprise a complete PAM platform—privileged password management, endpoint privilege management, and secure remote access.

While organizations most frequently begin with securing privileged credentials (privileged password management), you can start anywhere, so long as your PAM platform is flexible. With each PAM layer implemented, your organization eliminates and mitigates additional privileged attack vectors, while realizing new security and operational synergies.

1. Secure & Audit Privileged Account Credentials

Gaining control and accountability over privileged accounts—both human and machine—is often the first step organizations take on their PAM journey. Privileged password management solutions can automate the discovery, onboarding, management, and monitoring of the ever-expanding types of human and machine privileged accounts/credential types (privileged user passwords, application passwords, DevOps secrets, SSH keys, certificates, etc.), and bring those accounts/credentials under management within a centralized password safe. This is an important step for preventing or mitigating password re-use attacks and other backdoors (orphaned accounts, etc.) into the IT environment.

2. Enforce Least Privilege on Desktops (Windows and MacOS)

Enforcing least privilege on desktop devices is one of the most powerful ways to reduce endpoint security risk across the enterprise. Endpoint privilege management solutions can remove local administrative rights and default every user as a standard user. Rather than being enabled, persistent, and always-on, the privileges are only elevated on an as-needed basis and only for the targeted application or process. Limiting both the amount and duration of access condenses both the attack surface and threat window for malicious applications and activity that can abuse privileges.

3. Apply Least Privilege Across Your Server Environment (Windows, Unix, Linux)

IT admins often require elevated rights to perform their jobs. Unfortunately, in the wrong hands, high levels of privilege can be abused to inflict considerable damage to an IT environment and exfiltrate data. While sudo can help organizations “get by” in simple environments, it’s not an enterprise-class tool. Sudo suffers from significant security and administration drawbacks. Enterprise-class PAM solutions can enable organizations to efficiently and effectively delegate server privileges without disclosing the passwords for root, local, Active Directory domain, or bridged administrative accounts. Least-privileged, just-in-time access should always be enforced, and every privileged session should be closely audited and monitored.
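For illustration, sudo delegation is typically expressed as per-command rules in /etc/sudoers (the user, group, and command names below are hypothetical); keeping files like this consistent, least-privileged, and auditable across hundreds of servers is exactly where sudo stops scaling:

# Allow alice to restart one service as root, and nothing else
alice      ALL=(root) /usr/bin/systemctl restart nginx
# Allow the dbadmins group to run database backups as the postgres user
%dbadmins  ALL=(postgres) NOPASSWD: /usr/bin/pg_dump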

4. Implement Application Reputation

PAM solutions should be able to enforce a number of application reputation strategies as part of endpoint privilege management. Some of these include:

- Application control capabilities, including allow listing, block listing, and reputation-based listing to restrict applications to only those approved to execute, with the correct privileges, within the appropriate context
- Applying real-time risk intelligence to inform privilege delegation and elevation decisions
- Command filtering (on Unix and Linux systems) and PowerShell script management (on Windows systems)
- Trusted application protection to add context to the IT process tree to prevent fileless attacks and attacks leveraging trusted applications to perform malicious activities

 

5. Control Remote Access

Rarely does a cyber attacker operate directly on a resource (such as a stolen laptop). Most attacks start externally via a remote access connection. Typically, these threats initially compromise a remote vendor or employee, then piggyback into an organization's network. VPNs and other widely used remote access tools lack connection isolation, granular privilege and access controls, and application-based audit capabilities. With the recent, large-scale shift to remote work, tools like VPNs and RDP are being stretched way beyond their legitimate use cases, contributing to a surge in targeted attacks and breaches. PAM platforms with privileged remote access capabilities can enforce least-privilege access and session auditing for remote access sessions, for both vendors and employees, better than traditional VPN solutions alone.

6. Extend PAM Best Practices to Network Devices and IoT/IIoT

Some non-traditional endpoints and edge devices, like IoT, have minimal computing power, which means they may not be candidates for traditional endpoint security tools, like AV. Additionally, IoT and network devices may have embedded or easy-to-guess credentials, among other design flaws. That’s why it’s critical to extend credential management, least privilege, and other PAM controls to these devices and keep them properly segmented across your environment.

7. Extend PAM Best Practices to the Cloud and Virtualized Environments

In addition to suffering from many of the same privileged access weaknesses as on-premise environments, the cloud presents unique use cases, such as hypervisors, cloud management consoles, and APIs. In the cloud, ephemeral privileged accounts and credentials are rapidly instantiated and disposed of when new cloud and virtual instances are spun up and, just as easily, spun down. When managing any privileged account, discovery is the critical first step to gaining control over these assets and the many planes of privileges across cloud environments. Once cloud and virtualized instances and their assets are found, they must be managed to limit exposure, and all session access should be monitored and audited. Simply put, PAM has a substantive role to play when it comes to both cloud security and API security, regardless of the access and implementation of privileged accounts.

8. Extend PAM to DevOps and DevSecOps

DevOps seems to magnify many of the worst PAM challenges due to the heavy emphasis on automation and speed. Common DevOps risks include:

- Insecure code and hardcoded passwords
- Scripts or vulnerabilities in Continuous Integration/Continuous Deployment (CI/CD) tools that could deploy malware or sabotage code
- Over-provisioning of privileges
- Sharing of DevOps secrets

While DevOps presents some special use cases, PAM's role in DevOps security is comparable to any other environment—managing privileged accounts/credentials (including for CI/CD tools, service accounts, etc.), enforcing least privilege, and so on. It is also essential that the PAM solution does not disrupt or delay workflows, but rather enables peak DevOps agility. Securing the accounts, keys, and certificates required for automation is a fundamental part of extending PAM into your development and automation practices.
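To make the first risk above (hardcoded passwords) concrete, here is a minimal, illustrative Kotlin sketch; the variable name is hypothetical, and in practice a secrets-management or PAM tool would inject and rotate the value:

// Bad: a credential baked into source code and version control
// const val DB_PASSWORD = "s3cr3t!"

// Better: resolve the secret at runtime from the environment,
// where a secrets-management or PAM tool can supply and rotate it
fun dbPassword(): String =
    System.getenv("DB_PASSWORD")
        ?: error("DB_PASSWORD not set; fetch it from your secrets manager")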

9. Integrate PAM and Identity Access Management

Identity and access management (IAM) solutions help IT teams answer, “Who has access to what?” PAM solutions answer the questions of “Is that access appropriate?” and “Is that access being used appropriately?” Complete visibility and accountability over identities requires bi-directional integration of privilege management and IAM solutions. Some PAM solutions also include AD Bridging capabilities, which help further centralize identity management and authentication by providing single sign on across Windows, Unix, Linux, and macOS environments using the same account for simplified access, monitoring, and reporting.

10. Integrate PAM with Other IT Tools

PAM + IAM integration is imperative, but your privileged access security workflows and data should also integrate with the rest of your IT and security ecosystem. Gaps in this ecosystem translate into security vulnerabilities and lost productivity. The better your PAM platform integrates (such as with SIEM, ITSM, etc.), the more effective your ability to orchestrate pinpoint responses to problems—or opportunities.  As a rule of thumb, any security technology that solves a problem, but that does not integrate into the rest of your ecosystem, is a point solution with a finite lifespan. It could be argued that siloed solutions are also a waste of money and resources. Therefore, make sure your PAM investment works with, and integrates with, your overall IT and security ecosystem to best serve your environment.

Visit BeyondTrust for more information about securing your universe of privileges.

If you are interested in learning more about 10 steps to universal privilege management, tune in to my keynote at this week’s KCLive Event.


Authenteq

The Most Secure ID verification and KYC platform to this date: August Product Update


Last month at Authenteq: besides striving to be trustworthy and secure, we also strive to be fully automated. As automation is key to success, last month's product update focused on automated flagging: instead of your compliance team having to flag suspicious users, our technology now does that for you. What is more, you can take a glimpse at the verification data with our dashboard quick-view and not waste your precious time. 

This month's product update is one of the biggest product releases we have ever had! Here's what's new:

Let your customers pick up where they left off

At Authenteq we know that the conversion rate is the key metric that you use to calculate your business success, and that even small friction can have a high impact on your user satisfaction. To make sure that your customers are always pleased with the onboarding process, we've added a feature that allows your customer to exit the flow at any point, or move the app to the background, without cancelling the onboarding flow. This will allow your users to pick up where they left off and will provide for a truly frictionless mobile onboarding experience.

Enhanced security – always 

Our customer dashboard has never been as secure. Now you can facilitate data collection in three different ways and never have to touch the images of the identification documents again. Here are the ways you can facilitate the data collection: mobile SDK, mobile end-point, or a webhook. Read more about it on our API documentation.

What is more, our team added small but no less important two-factor authentication (2FA) for additional security to the customer dashboard. You can enable it by going to the settings section of your company dashboard. Interested in learning more about how to protect your data and increase your business security? Check out our latest blog on anti-spoofing where we share useful tips.

Features that will help you onboard customers with international documents

As part of our continued efforts to provide document support worldwide, we've introduced some API changes that will benefit your user onboarding process. We know that personal identification document formats differ across the globe; for example, users with US or Australian driver's licenses, or passports issued by certain Asian countries, might have difficulty verifying their identity.

The main issue was that these documents do not distinguish between the surname and given names fields. In order to solve this issue, Authenteq will now pass you the entire field as SurnameandGivenNames. By introducing this change, we empower you to deliver a truly secure, frictionless experience to your users. 
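As a purely illustrative sketch (the payload shape and other field names are hypothetical; only SurnameandGivenNames comes from this update), a verification result for such a document would carry the combined name field rather than separate surname and given-names fields:

{
  "documentType": "DRIVERS_LICENSE",
  "SurnameandGivenNames": "TAYLOR JORDAN ALEX"
}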

Want to unleash the power of frictionless verification? Fetch your API keys by starting a 30-day trial with us, or have a chat with one of our identity verification and KYC experts.

The post The Most Secure ID verification and KYC platform to this date: August Product Update appeared first on Identity Verification & KYC | Authenteq.


Nyheder fra WAYF

WAYF launches MDQ interface for metadata


An identity federation's metadata is the authoritative catalogue of the content services and authentication systems that participate in the federation and thereby exchange digital identities with one another. Traditionally, a federation's metadata is distributed in one large XML file to all participating systems, which in this way gain knowledge of and trust in each other and the ability to communicate together. Metadata is maintained by the federation's central authority, the "federation operator", which signs the metadata digitally.

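With an MDQ (Metadata Query) interface, a participating system can instead fetch the metadata for a single entity on demand. As a sketch of a standard MDQ lookup (the base URL below is a hypothetical placeholder, not WAYF's actual endpoint), the requester URL-encodes the entityID and issues an HTTP GET:

curl -H "Accept: application/samlmetadata+xml" \
  "https://mdq.example.org/entities/https%3A%2F%2Fsp.example.org%2Fshibboleth"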

KuppingerCole

IdentityIQ – SailPoint


by Graham Williamson

IdentityIQ continues to provide organizations with a comprehensive solution to their identity management requirements. With the recent advances in predictive identity management, the tool reduces manual intervention and improves accuracy of user entitlements. The latest release adds cloud access management capability allowing entitlements in DevOps environments to be monitored and managed.


Otaka - Secure, scalable, and highly available authentication and user management for any app.

Deploy a Secure Spring Boot App to Heroku


Developers have cool ideas for pet projects all the time. I often have quite a clear picture of what I want to build and am ready to spend next weekend making the Next Big Thing. The weekend comes finally, and instead of building it, I find myself doing the same repetitive things - deployment, user sign in, registration, deployment, etc. Starting a new project with user registration and a login form is fun! - said no one, ever.

The first hundred user sign-up forms I made were fun, kind of. As I move forward, I see more and more boilerplate, mostly copypasta between projects with some minor tweaks.

Luckily, we are living in the era of PaaS, IaaS, and open source. Most of the typical components are already built for us, and we simply need to connect them in order to focus on unique features and delivery for the end-user.

While there are many options available to bootstrap a new project, sometimes it can be challenging to mix and match the right components.

Prerequisites

- Java 11
- A free Heroku account and the Heroku CLI
- 15 minutes of your time

Table of Contents

- Build a Secure Spring Boot Application
- Scaffold a New Spring Boot Project
- Touch up Gradle Dependencies
- Prepare a Git Repository for Heroku
- Say Hello World with Spring Boot and Kotlin
- Use Kotlin's Statically-typed HTML Builder
- Deploy Spring Boot to Heroku
- Protect Your Spring Boot Application
- Enable the Okta Spring Boot Starter
- Add the Okta Add-on to Your Heroku Application
- Provide Environment Variables for Okta Spring Boot
- Configure IntelliJ IDEA to Run Your Spring Boot App
- Configure Spring Security
- Redirect Users to the Okta Login Page
- Handle an Authenticated User
- (Bonus) Enable Self-Registration
- Learn More about Spring Boot and Heroku

Build a Secure Spring Boot Application

In this tutorial, you’re going to build a single page (no JavaScript, I promise!), secure web application, and deploy it in Heroku’s cloud. It could be a good foundation for the next project or just something on the side. Most importantly, it doesn’t cost a penny.

Your website will invite a user to log in and then will present them with some meaningful information, for example, give The Answer.

Tools you’ll be using:

Spring Boot - an agile, time-tested, all-in-one suite for web and REST API development, with countless integrations and much more.
Spring Security - a Swiss Army knife for all kinds of security setups, providing great flexibility and control over the whole application. You’ll be using its OAuth 2.0 module.
Kotlin - the fastest-growing statically typed language, gaining adoption across many kinds of applications.
kotlinx.html - an HTML-like DSL (domain-specific language) that helps developers build type-safe applications.
Okta - an easy-to-use authentication and authorization service provider; you’ll offload user management to this service.
Heroku - a PaaS provider that makes the deployment process as smooth as possible.

Since you won’t handle authentication yourself or store any personal data, staying compliant with GDPR, CCPA, and other government regulations becomes significantly easier, as Okta already takes care of that for you.

Spring Boot has first-class support for Kotlin, smoothing over potential challenges in corner cases. Okta provides a very handy autoconfiguration, okta-spring-boot-starter, similar to spring-boot-starter-web, which automagically sets up most of the components for you. With Heroku, you’ll be able to deploy the whole application with a simple git push.

Scaffold a New Spring Boot Project

You can create a project skeleton with a standard directory layout and basic dependencies configured. However, configuring Spring manually can be convoluted and leave you with cryptic errors just because a dependency was or wasn’t included. Fortunately, Spring Initializr can help with that. It’s a neat online tool that sets up all of the project’s dependencies automatically. I prepared a magic link which preselects the project’s components.

Should you prefer to select the dependencies yourself, choose options as displayed in the screenshot below:

The Spring Initializr website will generate a zip archive that you’ll need to download, decompress, and import in your favorite IDE.

Touch up Gradle Dependencies

Although most of the required dependencies are there, you’ll need to add the kotlinx.html library and the jcenter repository to build.gradle.kts. Also, please temporarily exclude the Okta Spring Boot starter to see the application working:

...
repositories {
    mavenCentral()
    // add this repository for kotlinx.html library
    jcenter()
}
dependencies {
    implementation("org.springframework.boot:spring-boot-starter-web")
    implementation("com.fasterxml.jackson.module:jackson-module-kotlin")
    // temporarily exclude the Okta Spring Boot starter
    // implementation("com.okta.spring:okta-spring-boot-starter:1.4.0")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    // add kotlin html dsl
    implementation("org.jetbrains.kotlinx:kotlinx-html-jvm:0.7.1-1.4.0-rc")
    testImplementation("org.springframework.boot:spring-boot-starter-test") {
        exclude(group = "org.junit.vintage", module = "junit-vintage-engine")
    }
}

Re-import your Gradle configuration and run the project from your IDE, or from the command line using ./gradlew bootRun.

The application should start successfully, but a 404 page will be returned when accessing http://localhost:8080. This is expected behaviour, since you haven’t defined a handler for the root endpoint yet.

Prepare a Git Repository for Heroku

To enable Heroku deployments you need to create a new Git repository and commit the application skeleton generated by Spring Initializr. Open your favorite terminal, navigate to your project’s folder, then run the following commands:

git init && git checkout -b main
git add -A && git commit -a -m "Initial commit"

These commands will initialise a new Git repository with a default main branch and create the first commit.

Say Hello World with Spring Boot and Kotlin

One of the best things about the controlled magic of Spring Boot is that it makes complex things plain, and helps to write very concise and easy to read code, especially when teamed up with Kotlin.

Create a HelloController.kt class that returns “Hello, World”.

package com.okta.springboot.demo

import org.springframework.http.MediaType
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class HelloController {
    @GetMapping("/", produces = [MediaType.TEXT_HTML_VALUE])
    fun theAnswer(): String = "<h1>Hello, World</h1>"
}

A lot of things are happening under the hood. This controller has a @RestController annotation because instead of returning a view name, you want to return a response body. In this example, it will be rendered as an HTML string.

Using the @GetMapping annotation, you declare a GET endpoint bound to / and indicate that the return MIME type is text/html. Finally, the HTML string <h1>Hello, World</h1> is returned.

Re-start the application and open your browser to http://localhost:8080. You should see “Hello, World”.

Use Kotlin’s Statically-typed HTML Builder

Kotlin provides great syntactic sugar for creating statically-typed DSLs. HTML builder is one of the many practical applications and is implemented as kotlinx.html library. It allows you to create web pages with plain Kotlin code which resembles HTML.

Your web application contains only one page. The corresponding HTML-generating render function indexPage() has all the HTML boilerplate including basic HTML document layout, Bootstrap CSS inclusion, and styles. The indexPage() function produces a horizontally and vertically centered ‘hero’ block with content.

Start building your HTML page by adding the following code in a new views.kt file alongside the controller you just added:

import kotlinx.html.*
import kotlinx.html.stream.createHTML

fun indexPage() = createHTML().html {
    head {
        link("https://stackpath.bootstrapcdn.com/bootstrap/4.5.1/css/bootstrap.min.css", rel = "stylesheet")
    }
    body("h-100") {
        div("container lead text-center") {
            div("h-100 align-items-center") {
                style = "display: grid"
                div("jumbotron jumbotron-fluid") {
                    // meaningful content goes here
                    guestView()
                }
            }
        }
    }
}

private fun FlowContent.guestView() {
    h1 { +"Hello, Guest" }
}

The function producing meaningful content is FlowContent.guestView(). Note that it is an extension function because HTML DSL components are available only within FlowContent objects. You can read more about creating a DSL in Kotlin.

Update your controller’s theAnswer() method to call indexPage():

import indexPage
...

@RestController
class HelloController {
    @GetMapping("/", produces = [MediaType.TEXT_HTML_VALUE])
    fun theAnswer(): String = indexPage() // call your render function
}

After you restart your app and refresh your browser, you should see the output from your template:

Deploy Spring Boot to Heroku

I know that feeling: you can’t wait to put ‘The Thing’ out there on the Internet and share it with your friends. Now that you’ve built a welcome page, it’s a good time to start deploying.

Please ensure that you have Heroku CLI installed.

Log in to Heroku. You can skip this step if it was done previously:

heroku login

Then, create a new app with heroku apps:create:

heroku apps:create

You should see output like the following:

Creating ⬢ okta-springboot-heroku-demo... done
https://okta-springboot-heroku-demo.herokuapp.com/ | https://git.heroku.com/okta-springboot-heroku-demo.git

You’ll find that Heroku automatically configures a Git remote called heroku. Running git push against it will trigger the build and deploy process automatically. It’s as simple as that!

At the moment, the default JVM Heroku uses is 1.8. You’ll need to create a system.properties file in your application root to provide the desired version:

java.runtime.version=11

Then commit your changes:

git add . && git commit -m "Added system.properties with target jvm version"

Then, deploy to Heroku:

git push --set-upstream heroku main

Heroku will build from source once your code is pushed.

Enumerating objects: 4, done.
Counting objects: 100% (4/4), done.
Delta compression using up to 4 threads
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 319 bytes | 319.00 KiB/s, done.
Total 3 (delta 1), reused 0 (delta 0)
remote: Compressing source files... done.
remote: Building source:
remote:
remote: -----> Gradle app detected
remote: -----> Spring Boot detected
remote: -----> Installing JDK 11... done
remote: -----> Building Gradle app...
remote: -----> executing ./gradlew build -x test
remote: To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/6.4.1/userguide/gradle_daemon.html.
remote: Daemon will be stopped at the end of the build stopping after processing
remote: > Task :compileKotlin
remote: > Task :compileJava NO-SOURCE
remote: > Task :processResources
remote: > Task :classes
remote: > Task :bootJar
remote: > Task :inspectClassesForKotlinIC
remote: > Task :jar SKIPPED
remote: > Task :assemble
remote: > Task :check
remote: > Task :build
remote:
remote: BUILD SUCCESSFUL in 37s
remote: 4 actionable tasks: 4 executed
remote: -----> Discovering process types
remote: Procfile declares types -> (none)
remote: Default types for buildpack -> web
remote:
remote: -----> Compressing...
remote: Done: 77.8M
remote: -----> Launching...
remote: Released v4
remote: https://okta-springboot-heroku-demo.herokuapp.com/ deployed to Heroku
remote:
remote: Verifying deploy... done.
To https://git.heroku.com/okta-springboot-heroku-demo.git
   5780603..a1a1ae0  main -> main
Branch 'main' set up to track remote branch 'main' from 'heroku'.

Once your application is deployed, it can be easily accessed by running heroku open. This command opens a new web browser window and navigates to its URL.

Every time you want to deploy your web application, simply push your source code by running git push heroku main.

Protect Your Spring Boot Application

Many services have a “user’s area” - a part of the website or content visible only to members. In this application, the content of the index page depends on the user’s login state. Logged-in users can see The Answer, while guests are invited to sign in. This user registration and login bit might sound trivial, but in fact it raises a number of serious questions that are not easy to answer:

Where do I store users’ personal data such as name, email, etc.? In another table, another database, another type of database, or a microservice?
Do I encrypt data, and if so, what algorithm should I use? Where do I keep encryption keys? Ask yourself if you understand cryptography well enough to make the right decision.
How do I hash passwords? Do I need salt and pepper to cook it right? (A small sketch after this list shows what’s involved.)
What if I want to add more authentication providers, for instance, social networks? Shall I spend time writing abstractions I might never use?
How would I design access management and access token revocation? Doesn’t it sound like a very generic thing which must have been implemented by somebody already?
Do I have a good understanding of how to keep users’ PII (Personally Identifiable Information) in compliance with GDPR/CCPA/DPA/other regulations?

Those are just a few questions off the top of my head. I’m certain you’ve got a cool bar story about authentication to tell.

It’s easy to build authentication and authorisation but it’s hard to do it right.
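
To give a flavor of just the password-hashing question above, here is a minimal sketch of salted hashing with PBKDF2 from the JDK. The iteration count and key length are illustrative assumptions, not recommendations:

import java.security.SecureRandom
import javax.crypto.SecretKeyFactory
import javax.crypto.spec.PBEKeySpec

// Sketch of salted password hashing only; parameters are illustrative.
fun hashPassword(password: CharArray): Pair<ByteArray, ByteArray> {
    val salt = ByteArray(16).also { SecureRandom().nextBytes(it) } // random per-user salt
    val spec = PBEKeySpec(password, salt, 310_000, 256)            // iterations, key bits
    val hash = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
        .generateSecret(spec).encoded
    return salt to hash                                            // store both, never the password
}

And this covers hashing alone; secure storage, constant-time comparison, parameter upgrades, and breach response are all still on you, which is exactly why offloading makes sense.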

You’ll be using Okta, a software-as-a-service identity access provider which has excellent integration with Spring Boot and Heroku. Combining Okta with Spring Security makes the sign-in/sign-up process as easy as it could be. Yes, you’ll have user sign-up right out of the box. (See the bonus section.)

Enable the Okta Spring Boot Starter

Update your build.gradle.kts to include okta-spring-boot-starter artifact and don’t forget to re-import the Gradle model in your IDE.

dependencies {
    ...
    implementation("com.okta.spring:okta-spring-boot-starter:1.4.0")
    ...
}

Add the Okta Add-on to Your Heroku Application

Okta provides an official Okta Heroku Add-on which expedites the development process. You’ll create an Okta account linked to your Heroku app, and automatically configure Okta right from the command line.

Run heroku addons:create okta to begin. You should see output like the following:

➜ heroku addons:create okta
Creating Okta on ⬢ okta-springboot-heroku-demo... free
Provisioning Okta Org
okta-opaque-28728 is being created in the background. The app will restart when complete...
Use heroku addons:info okta-opaque-28728 to check creation progress.
Use heroku addons:docs okta to view documentation.

This add-on creates a user and a configured Okta application for you. The configuration settings are exposed to your service via environment variables. You can look up these settings with heroku config.

This command will return all the environment variables for your app on Heroku.

=== okta-springboot-heroku-demo Config Vars
OKTA_ADMIN_EMAIL: dcc8cf18-198b-4395-81b7-f96a6adc9463@heroku.okta.com
OKTA_ADMIN_PASSWORD: A$8278eb57-6af7-4ef4-2713-b0d6ba3ff36d
OKTA_CLIENT_ORGURL: https://dev-995757.okta.com
OKTA_CLIENT_TOKEN: <okta api token>
OKTA_OAUTH2_CLIENT_ID_SPA: 0aap37r2m33P2eEPbbx6
OKTA_OAUTH2_CLIENT_ID_WEB: 0aap529m06mVIIidR4x6
OKTA_OAUTH2_CLIENT_SECRET_WEB: <okta oauth2 client secret>
OKTA_OAUTH2_ISSUER: https://dev-995757.okta.com/oauth2/default

If you see empty output, wait a minute or two while the setup process completes.

Note that OKTA_ADMIN_EMAIL and OKTA_ADMIN_PASSWORD are actual credentials you can use to log in to your application.

Provide Environment Variables for Okta Spring Boot

For OpenID Connect (OIDC) authentication and OAuth 2.0 authorization, only three variables are important to you: OKTA_OAUTH2_ISSUER, OKTA_OAUTH2_CLIENT_ID_WEB, and OKTA_OAUTH2_CLIENT_SECRET_WEB.

You’ll need to update the application.properties file to provide them to Okta’s Spring Boot starter:

okta.oauth2.issuer=${OKTA_OAUTH2_ISSUER}
okta.oauth2.clientId=${OKTA_OAUTH2_CLIENT_ID_WEB}
okta.oauth2.clientSecret=${OKTA_OAUTH2_CLIENT_SECRET_WEB}

Although the idea of hardcoding secrets into the properties file might look very tempting, especially for a pet project, you should never do that! It’s good hygiene practice to never store secrets in your source control.

Configure IntelliJ IDEA to Run Your Spring Boot App

You probably want to play with the application locally as well, but at the moment it expects environment variables to be set. IntelliJ IDEA allows providing a custom configuration.

In the Run Actions (Ctrl-Shift-A) dialogue, search for Edit Configurations, or use your mouse to edit the current run configuration in the dropdown.

Provide the OKTA_OAUTH2_ISSUER, OKTA_OAUTH2_CLIENT_ID_WEB, OKTA_OAUTH2_CLIENT_SECRET_WEB keys and values from the heroku config output:

Configure Spring Security

One last step to make the application secure is to configure Spring Security. Create a WebSecurityConfig class in the same package as your other classes and fill it with the code below.

package com.okta.springboot.demo

import org.springframework.context.annotation.Configuration
import org.springframework.security.config.annotation.web.builders.HttpSecurity
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter

@Configuration
@EnableWebSecurity
class WebSecurityConfig : WebSecurityConfigurerAdapter() {
    override fun configure(http: HttpSecurity): Unit = with(http) {
        authorizeRequests().run {
            antMatchers("/").permitAll()
            anyRequest().authenticated()
        }
        oauth2Login().run {
            defaultSuccessUrl("/")
        }
        logout().run {
            logoutSuccessUrl("/")
        }
    }
}

This configuration permits anyone to access / while requiring authentication for any other request, so logged-in users and guests can access the same URL. After a successful login or logout, users are redirected to the index page /.

Redirect Users to the Okta Login Page

Spring Security automatically registers an endpoint for Okta at /oauth2/authorization/okta. This allows you to redirect the user to the right location to start the OIDC authentication process.

To enable the login flow for the user, add a link to your guestView() in views.kt:

private fun FlowContent.guestView() {
    h1 { +"Hello, Guest" }
    a(href = "/oauth2/authorization/okta") { +"Login to get The Answer" }
}

Handle an Authenticated User

Upon successful login, you can extract information provided by the authentication service. Spring Boot can inject it straight into your controller’s handler:

import org.springframework.security.core.annotation.AuthenticationPrincipal
import org.springframework.security.oauth2.core.oidc.user.OidcUser
...

@RestController
class HelloController {
    @GetMapping("/", produces = [MediaType.TEXT_HTML_VALUE])
    fun theAnswer(@AuthenticationPrincipal user: OidcUser?): String = indexPage(user)
}

OidcUser contains a variety of fields you might find useful, among them name, email, and claims.
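
For instance, here is a small illustrative helper (not part of the tutorial’s code) that reads a few of those standard OIDC fields; the accessor names come from Spring Security’s claim accessors:

// Illustrative only: summarizes a few standard OIDC claims of a logged-in user.
fun describe(user: OidcUser): String =
    "${user.fullName} <${user.email}> " +          // "name" and "email" claims
    "(subject=${user.subject}, claims=${user.claims.keys})"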

When the OidcUser? argument is null, the user is not authenticated, which effectively makes them a guest.

Update the indexPage() method in views.kt to match the code below. Now when a user is logged in, they’ll see a warm welcome message:

import kotlinx.html.*
import kotlinx.html.stream.createHTML
import org.springframework.security.oauth2.core.oidc.user.OidcUser

fun indexPage(user: OidcUser?) = createHTML().html {
    head {
        link("https://stackpath.bootstrapcdn.com/bootstrap/4.5.1/css/bootstrap.min.css", rel = "stylesheet")
    }
    body("h-100") {
        div("container lead text-center") {
            div("h-100 align-items-center") {
                style = "display: grid"
                div("jumbotron jumbotron-fluid") {
                    // choose the right view depending on login status
                    if (user == null) guestView() else enlightenedUserView(user)
                }
            }
        }
    }
}

private fun FlowContent.enlightenedUserView(user: OidcUser) {
    h1 { +"Hello, ${user.fullName}" }
    p {
        +"The answer you were looking for is"
        h1("badge-dark display-1") { +"42" }
    }
    hr { }
    p("text-right text-muted small") {
        +"Your email ${user.email} is ${"not".takeIf { !user.emailVerified }.orEmpty()} verified"
    }
}

private fun FlowContent.guestView() {
    h1 { +"Hello, Guest" }
    a(href = "/oauth2/authorization/okta") { +"Login to get The Answer" }
}

Run your application via IntelliJ and you should be able to complete the sign-in process:

(Bonus) Enable Self-Registration

Okta can also take care of the registration process, which can be easily enabled in the settings. Head on over to your Heroku dashboard and choose your project. Find the Installed add-ons section and click on okta to open your Okta dashboard.

Navigate to Users > Registration and click Enable Registration. A registration configuration form allows some level of flexibility. For example, you can require new users to provide their first and last name.

After you save the configuration, open a new incognito browser window and try to log in. This time, the Okta login form will have a link for user registration:

You can push all your changes to Heroku after committing them.

git add .
git commit -m "Add Okta for Auth"
git push heroku main

Learn More about Spring Boot and Heroku

In this tutorial, you learnt how to quickly bootstrap secured web applications for your ‘pet project’ ideas using Spring Boot with Spring Security, Kotlin, and Okta. You deployed a web application to the Heroku cloud using Heroku CLI and Git command-line tools. It’s always a good idea to use existing frameworks and tools instead of focusing on repetitive tasks such as deployment, authentication, and authorization.

The source code for this tutorial and the examples in it are available on GitHub in the oktadeveloper/okta-spring-boot-heroku-example repository.

If you liked this post, you might like these others too:

What the Heck is OAuth?
Guide to OAuth 2.0 with Spring Security
OpenID Connect Logout Options with Spring Boot
Angular + Docker with a Big Hug from Spring Boot
Hashing and salting and why it’s crucial

If you have any questions about this post, please add a comment below. For more awesome content, follow @oktadev on Twitter, like us on Facebook, or subscribe to our YouTube channel.


MyKey

MYKEY Weekly Report 14 (August 24th~August 30th)

Today is Monday, August 31, 2020. The following is the 14th issue of MYKEY Weekly Report. In the work of last week (August 24th to August 30th), there are mainly 5 updates:

1. MYKEY has officially established a strategic partnership with HBTC

This cooperation will give full play to the DeFi-friendly features of MYKEY and expand more DeFi usage scenarios for HBTC. We will jointly promote the development of the blockchain industry and expand the growth opportunities of the cryptocurrency market. Click to read: https://bit.ly/2QvDGyS

2. The fifteenth MYKEY Crypto Stablecoin Report was published

We release MYKEY Crypto Stablecoin Report every week to share our interpretation of the development status of stablecoins and analysis of their development trends to help the participants in the crypto market stay updated on the development status of stablecoin. The fifteenth Crypto Stablecoin Report was published on August 27th, click to read: https://bit.ly/2FTKUdY

3. MYKEY upgraded Ethereum Network Fee discount and released a new coupon system

MYKEY released a new Network Fee discount system, upgrading the original monthly free Ethereum transfers to a coupon scheme that is more flexible and personalized than the previous program. You can experience it by upgrading to the latest version 2.8.0. For the detailed rules, click to read: https://bit.ly/3aXMk2K

4. Announcement: KEY ID Ethereum logic contract module upgrade

The KEY ID Ethereum contract has passed the security audit of trailofbits. Based on the audit, the logic contract module has been updated and upgraded, including ERC1271/NFT/multi-signature proposal support and optimization. For details, click to read: https://bit.ly/34E5dqm

5. Open Finance Conference is ongoing

In the third week of the Open Finance Conference, the wonderful panels about DeFi and DEX were successfully held one after another.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall MYKEY APP, please contact MYKEY Assistant: @mykeytothemoon in Telegram.

!!! Remember to back up the 12-word recovery phrase from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY and keep it safe, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory



Ontology

Ontology Integrates With Plaid, Bringing Open Banking Style Benefits To ONTO Users

The move will offer ONTO wallet users a better picture of their financial health

Thanks to Ontology’s new integration with Plaid, a data network powering the fintech tools that millions of people rely on to improve their financial lives, ONTO wallet users will now be able to enjoy open banking-style benefits through increased access to a vast amount of consolidated financial data from multiple institutions and categorized transaction data.

ONTO users will also be able to receive detailed information on their personal data which can be leveraged to improve their ONT score, a credit score accumulated by completing identity, financial, and social authentication.

Commenting, Andy Ji, Co-founder of Ontology, said, “Plaid’s status as a leading fintech company and their vision to democratize financial services through technology, is closely aligned with what we are committed to achieving at Ontology. The integration of Plaid’s API will allow us to provide detailed transaction history and data to ONTO users, who will have the opportunity to gain a complete understanding of their financial picture. This marks another important step in fulfilling our commitment of allowing consumers to take back control of their data. We look forward to what this integration will bring to the Ontology community, as we continue to enrich our ecosystem and service-offering.”

Over 80% of the world’s largest fintechs are powered by Plaid, including Acorns, Betterment, Coinbase, and Venmo. Across the US, Canada, and Europe, Plaid connects over 11,000 financial institutions. To date, 1 in 4 US bank account holders has used Plaid.

With a suite of decentralized identity and data sharing protocols to enhance speed, security, and trust, Ontology’s features include ONT ID, a mobile digital ID application and DID used throughout the ecosystem, and DDXF, a decentralized data exchange and collaboration framework.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord


Sunday, 30. August 2020

KuppingerCole

KuppingerCole Analyst Chat: Privacy and Consent Management

Anne Bailey and Matthias Reinwarth discuss the findings of the recently published Leadership Compass on Privacy and Consent Management.

Friday, 28. August 2020

KuppingerCole

SecZetta Third-Party Identity Risk Solution

by Paul Fisher

Managing third party identities has become an important issue for companies when they open their infrastructures to non-employees from partners and vendors. The influx of third-party identities puts an extra strain on existing IAM tools but management solutions that provide added functionality for non-employee identities are now available. SecZetta has streamlined existing solutions to create its new Third Party Identity Risk Solution. 


CyberArk

by Christopher Schuetze

Privileged Access Management (PAM) has evolved into a set of technologies that address some of today's most critical areas of cybersecurity in the context of digital transformation and industrial change. The CyberArk Core Privileged Access Security solution is designed to secure and protect privileged accounts and credentials within cloud and hybrid infrastructures. It is part of a suite of solutions and technologies from one of the leading providers of PAM.


Onfido Tech

Taming The Beast — Refactoring to empower teams

Taming the Beast — Refactoring to Empower Teams

One day I stumbled onto a ticket from the website team that was open for several days. The ticket was for a simple change, to enable file compression.

The comments went back and forth. First we would try something, then we would ask someone else to test. Then they would come back and say it didn’t work. This repeated 3 or 4 times until everyone gave up.

I was confused, the request seemed easy to do. I decided to investigate, and that’s when I found The Beast.

Unraveling The Beast

To understand The Beast, you need to understand what would happen at the time when you opened onfido.com on your phone.

A request would first go to Kubernetes, then an nginx server, then a Cloudfront distribution, and finally to an S3 bucket!

Here was the beast

Each step on this chain had a different owner, was in a different project, and had a different release process. Each team had to verify that changes to the website didn’t break their main service. It took time to make changes.

Not only that, it was also slow. US users had to wait twice as long as European users to see the page.

How did it get to this

To understand how we got to this beast, we need to go back to when it was just a tiny puppy.

Many years ago, the website was done in Rails, and its code was part of our platform. And all of this was behind an Nginx instance.

Life was simple for the young puppy

The developers managed the website, the Rails app handled routing, the Nginx proxy handled requests and caching, and the user saw a website.

Eventually one Rails app became many Rails apps. Nginx started serving the static assets from a folder. The puppy’s fur grew and nobody was there to trim it.

A couple more years and the website became separate, so the marketing team could own it. But because all the paths were intrinsically linked, it continued to be served by Nginx.

A bit more time and Kubernetes was added to the mix, and all of this was moved inside it.

The puppy’s fur grew and grew, and nobody gave it a trim. The puppy grew to become The Beast.

Taming The Beast

So how do you go towards taming The Beast?

We had to understand how that devilish Nginx configuration came to be. We had to understand what we had to keep, what we could rearrange and what we could throw away. We wanted to understand what the user workflows were.

So we talked with teams to see what they knew about this beast and evaluated the access logs for the website. With that precious information in mind, we were able to propose a new architecture that removed the website out of the way of other teams.

In this new architecture, we start with a CloudFront distribution. This distribution has one single goal: to decide what goes to the website and what goes to Kubernetes.

Proposed taming of The Beast

With this new architecture designed and the approval of all the teams, we prepared our new test environment. We made it match the behaviour of the real website as much as possible.

With a few days of trials, we got confidence that we could proceed. We did a dark launch of production only in our office network. Teams did their final end-to-end tests. Minor hiccups were fixed and we got ready for the big switch.

Launch Day

It was early morning, and the release was scheduled to happen in just a few hours. There was some tension; this could have a large impact, both on our image and on our clients.

All internal stakeholders were gathered, rollout procedures were ready and rollback procedures were prepared in case of a disaster. Eyes all on the various graphs and monitors. The pull request had one single change, onfido.com stopped pointing to the Load Balancer and would instead point to CloudFront.

The commit was merged, the CI ran and the changes were applied.

We waited.

No alarms were sounded. All services were normal. The website looked exactly the same. Nobody had noticed the change.

Exactly as intended.

Results

So, visually it looked the same. What changed?

First, we could finally fix that ticket from the start. And not having to go all the way to Europe on every request took average page loads in the United States from 5.59 seconds down to 2.01 seconds.

Page load times from the weeks before and after the switch. Whatever the person’s internet connection, load times fell by over a second worldwide.

But even if all of these metrics continued the same, the changes would have been worth it. They enabled our website team to act independently.

With the smaller and self-contained code base, the team was able to own it. They went from opening tickets to change something to opening pull requests themselves. The devops team has less load and the website team can change things fearlessly.

Key Takeaways

When faced with a large scary beast, the first instinct is to run! But that beast will not go away! It will terrify everyone who passes by and any change will take an eternity to do.

The proper action is to be fearless and tame the beast! When it is no more than a harmless puppy, teams will be more confident to make changes and will act quicker.

It’s a win-win for everyone, and all it needs is that initial push for change!



KuppingerCole

Micro Focus Identity Governance

by Graham Williamson

Identity Governance is a mature Identity Governance and Administration (IGA) solution that is undergoing extensive development to meet the identity services requirements of the modern enterprise. As business systems become increasingly demanding of identity management environments, Micro Focus continues to extend its Identity Governance product, improving the user interface, automating provisioning and governance processes, removing dependence on manual intervention and facilitating regulatory compliance. Micro Focus Identity Governance should be considered by organizations seeking to manage the collection and use of identity information within their corporate systems.


Forgerock Blog

ForgeTalks: Your Guide to the ForgeRock Identity Platform

Welcome back to another episode of ForgeTalks. The ForgeRock Identity Platform is a workhorse - covering every identity possible and offering a comprehensive set of capabilities. There are few people more knowledgeable about its depth and breadth than Mary Writz, VP of Product Management. In today's episode, Mary compares a tour of our platform to a traveler visiting Paris for the first time. For newcomers, the Eiffel Tower and the Louvre can't be missed. When it comes to our platform, her 'must visit' hot spots include Intelligent Access and ForgeRock Go. In our chat, she shares insider tips on how her favorite features solve some of the most common and complex identity issues companies face. 

We'll be answering key questions like:

How do I design the perfect access journey?  What is the best way to help people recover lost passwords?  Can an identity platform offer DevOps deployment nirvana?  

Make sure you check out next week's episode, where Mary takes us on a "locals tour" of the ForgeRock platform - revealing some of the lesser-known but equally powerful features. And if you want to check out any of the previous episodes of ForgeTalks you can view everything here.

Thursday, 27. August 2020

KuppingerCole

Privileged Access Management

by Paul Fisher

Privileged Access Management (PAM) is one of the most important areas of risk management and security in any organization. Privileged accounts have traditionally been given to administrators to access critical data and applications. But, changing business practices, agile software development and digital transformation has meant that users of privileged accounts have become more numerous and widespread. To reduce the risk of privileged accounts being hijacked or fraudulently used, and to uphold stringent regulatory compliance within an organization, a strong PAM solution is essential.


SELFKEY

SelfKey Partners with Polkadot

We’re happy to announce that SelfKey will be partnering with Polkadot.

The post SelfKey Partners with Polkadot appeared first on SelfKey.


KuppingerCole

ManageEngine Log360

by Alexei Balaganski

Log360 from ManageEngine is a tightly integrated suite of log management and network security analytics tools. Complementing SIEM capabilities with EDR, DLP, and even SOAR functionality, it offers a convenient and affordable one-stop solution for security analytics and threat remediation across on-prem and cloud.


Radware Kubernetes WAF

by Richard Hill

Containerized microservices are gaining momentum in IT organizations today, requiring tools such as Kubernetes for automating the orchestration and management of those containers. The Radware Kubernetes WAF meets the unique requirements of the Kubernetes environment to protect its containerized applications and data.


MyKey

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

Original link: https://bihu.com/article/1229222499

Original publish time: August 25, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We released MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends to help the participants in the crypto market stay updated on the development status of stablecoin. The MYKEY Crypto Stablecoin Report will be published every week, looking forward to maintaining communication with the industry and exploring the development prospects of stablecoin together.

Quick Preview

Last week, the market capitalization of major stablecoins increased by $759 million to $15.961 billion.
Tether migrated 1 billion USDT from Tron to Ethereum, additionally issued 120 million USDT on Ethereum, and additionally issued 400 million USDT on Tron in two batches.
On the evening of August 23, USDT transfers within 24 hours accounted for 10.73% of the gas fees on Ethereum.
The deposits in Curve include 333 million USDC, 307 million USDT, 236 million TUSD, 70.7 million DAI, 23.39 million sUSD, 290,000 BUSD, and 220,000 PAX (including cToken and yToken).
The loan balance of DAI in Compound is 217 million, accounting for 49.2% of the DAI supply.
USDT has a higher volume/liquidity ratio in Uniswap.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins in the past week(August 15, 2020 ~ August 21, 2020, same below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market circulation of major stablecoins has increased by $759 million to $15.961 billion.

Source: MYKEY, Coin Metrics

In the past week, Tether migrated 1 billion USDT from Tron to Ethereum, additionally issued 120 million USDT on Ethereum, and additionally issued 400 million USDT on Tron in two batches. The circulation of USDC, PAX, BUSD, TUSD, DAI, and GUSD increased by 134 million, 2.59 million, 4.34 million, 79.99 million, 19.87 million, and 140,000 respectively, while the circulation of HUSD decreased by 1.28 million.

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of main stablecoin holding addresses on Ethereum increased by a net total of 90,657.

Source: MYKEY, DeBank

The number of holding addresses of USDT, TUSD, and DAI increased by 90,845, 504, and 1,695. The number of holding addresses of USDC and PAX decreased by 2,375 and 12.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week increased by an average of 1.64% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins decreased by an average of 4.15%.

The Number of 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

Due to the large amount of transfers on the blockchain caused by the USDT migration, the daily volume of transactions of major stablecoins last week increased by an average of 19.67% from the previous week.

2. On-chain usage of stablecoins

Stablecoins have become a very important piece of infrastructure, almost essential for public blockchains that are ready to develop DeFi. The public blockchains with the most complete ecosystems (Ethereum, EOS, and TRON) have not only introduced USDT issued by Tether but also have their own on-chain collateralized stablecoins: DAI, EOSDT, and USDJ respectively. For Ethereum, stablecoins are an important asset, providing a foundation for the ecosystem on Ethereum. But at the same time, stablecoins also bring some bad effects, such as on-chain congestion and skyrocketing fees. In this report, we will introduce the on-chain usage of stablecoins on Ethereum.

To maintain the security of the network and reward miners for packaging transactions, each transaction on Ethereum must pay a gas fee, and gas fees are the most direct reflection of how heavily each contract on the blockchain is used. According to Etherscan data on August 23, the contract consuming the most gas is Uniswap V2: Router2, the router on Uniswap; the transaction account ‘0x7a250d5630b4cf539739df2c5dacb4c659f2488d’ that we often see on Uniswap corresponds to this contract. In Uniswap, any two ERC20 tokens can be exchanged, which is the work of the router. Many other transactions are direct swaps that do not require routing, so it is safe to say that Uniswap is the Dapp that costs the most gas on Ethereum.

Source: etherscan.io

The second-largest consumer of gas fees is USDT. In the past 24 hours, USDT transfers accounted for 10.73% of the gas fees on Ethereum, a total of 780.69 ETH, or about $305,800. The top 50 also includes USDC and DAI, which account for 0.47% and 0.22% of the gas fees on Ethereum. The transfers of USDT, USDC, and DAI alone accounted for 11.42% of the gas fees on Ethereum.

Some contracts that are highly related to stablecoins also cost a lot of gas, such as Curve.fi: y Deposit and Maker: Proxy Registry. Uniswap, 1inch, Balancer, and other decentralized trading platforms also make heavy use of stablecoins. It is safe to say that stablecoin-related transactions account for a relatively high proportion of gas fees on Ethereum.

The total lock-up volume is an important indicator when measuring the scale of DeFi projects. A large part of the lock-up volume in Aave, Curve, and Compound comes from stablecoins.

Aave is an open-source decentralized lending protocol. Although many kinds of assets can be deposited and withdrawn in Aave, users mainly deposit and lend stablecoins. Currently, deposits in Aave include 267 million USDT, 250 million USDC, 240 million TUSD, and 39.53 million DAI. USDT, USDC, and TUSD have more deposits, but the capital utilization rate of DAI is higher.

Source: apps.aave.com

Curve is a protocol based on automated market makers and designed specifically for swaps between stablecoins. Curve also integrates Compound’s cToken and yearn’s yToken, so Compound and yearn depositors can not only earn the lending interest of their respective platforms but also earn transaction fees in Curve. As of August 23, a total of $1.139 billion of assets has been deposited in Curve, including 333 million USDC, 307 million USDT, 236 million TUSD, 70.7 million DAI, 23.39 million sUSD, 290,000 BUSD, and 220,000 PAX. There are also $70.19 million of renBTC and $68.98 million of WBTC.

Source: curve.fi

As of August 24, the total deposits in Compound were $1.746 billion, still the most of any DeFi project. Deposits in Compound include $1.024 billion DAI, $201 million USDC, and $25.55 million USDT. Compound introduces the concept of a collateral factor: the fraction of a deposit’s value that can be borrowed against it. The collateral factor of USDT is 0; since Compound does not fully support USDT, USDT deposits are smaller. DAI fares better in Compound: the loan balance of DAI is 217 million, accounting for 49.2% of the DAI supply.

Source: compound.finance
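
As a rough illustration of how a collateral factor limits borrowing (a hypothetical sketch with made-up factors, not Compound’s actual parameters or contract code):

// Hypothetical sketch: each deposit contributes value * collateralFactor
// to the borrow limit; a factor of 0 (as for USDT above) adds nothing.
fun borrowLimitUsd(deposits: List<Pair<Double, Double>>): Double =
    deposits.sumOf { (valueUsd, factor) -> valueUsd * factor }

fun main() {
    // (deposited value in USD, collateral factor) - factors are illustrative
    val deposits = listOf(10_000.0 to 0.75, 5_000.0 to 0.0)
    println(borrowLimitUsd(deposits)) // 7500.0: only the first deposit counts
}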

Among the top 6 tokens with the best liquidity in Uniswap, USDC, USDT, DAI, and sUSD are all common stablecoins. Among them, USDC has the best liquidity, but USDT has a higher volume/liquidity ratio in Uniswap, so depositing USDT earns higher transaction fees. For example, in the exchange pairs with ETH, the liquidity of USDC-ETH, USDT-ETH, and DAI-ETH is $22.36 million, $16.01 million, and $13.12 million. However, due to the higher volume of transactions in USDT-ETH, their fees in the past 24 hours were $27,850, $38,648, and $25,227 respectively.

Source: uniswap.info
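
Those fee figures follow from Uniswap V2’s flat 0.3% swap fee, which is paid pro rata to liquidity providers: a pool’s daily fee income is roughly daily volume × 0.003, so a pool with less liquidity but more volume pays more per dollar deposited. A minimal sketch of that arithmetic:

// Uniswap V2 charges a flat 0.3% fee per swap, shared pro rata by LPs.
fun dailyPoolFeesUsd(dailyVolumeUsd: Double): Double = dailyVolumeUsd * 0.003

fun main() {
    // Implied daily volume behind the $38,648 USDT-ETH fee figure above:
    println(38_648.0 / 0.003)                      // ~12.9 million USD of volume
    // An LP supplying 1% of the pool's liquidity earns 1% of the fees:
    println(dailyPoolFeesUsd(12_900_000.0) * 0.01) // ~387 USD per day
}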

3. Questions of Readers

1. Why is USDT not included in the cryptocurrency whitelist announced by NYDFS?

Answer: USDT, USDC, and DAI are not on the list of cryptocurrency whitelists announced by NYDFS, but PAX, BUSD, and GUSD are on the list. I think this is because of compliance issues. GUSD and PAX passed the approval of NYDFS in the first batch, and it is not surprising that they can pass the whitelist of NYDFS this time. BUSD is a stablecoin launched by Binance and Paxos, and it has also been approved by NYDFS.

USDC was developed by Centre Consortium, a company jointly invested and operated by Coinbase and Circle. It complies with the supervision of the ‘MTM’ and can be used as a currency transfer agency in the United States, but it has not obtained a clear regulatory license from NYDFS.

Although USDT is the largest in scale, it has the worst compliance and does not even issue monthly audit reports like other stablecoins. Last year, the New York Attorney General also accused Bitfinex of illegally using $850 million of Tether’s funds to cover losses. Therefore, USDT is the least compliant of the commonly used off-chain collateralized stablecoins.

On the other hand, DAI depends entirely on on-chain collateral generation without the participation of centralized institutions; it is censorship-resistant by design, which creates uncertainty for regulators.

Tips

To better communicate with industry insiders, we decided to add two sections for questions of readers and opinions of guests. If readers have questions about stablecoins, please contact us. We will pick meaningful questions to answer in the next issue. At the same time, welcome guests from the industry to share your views on stablecoins. Contact information: jianghb@mykey.org.

This is what we’re sharing in this MYKEY Crypto Stablecoin Report, welcome to stay tuned for follow-up crypto stablecoin reports. We will provide more interpretations of the development status of stablecoins and analysis of their development trends to help you stay updated on the development status of stablecoin in the follow-up report.

PS: MYKEY Lab has the final right to interpret the content of the article, please indicate the source for the quotation. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

MYKEY Crypto Stablecoin Report 01: USDT continues to gain momentum as market capitalization exceeding $10 billion

MYKEY Crypto Stablecoin Report 02: USDT suspended additional issuance and the usage scenario of USDT in Tron is single

MYKEY Crypto Stablecoin Report 03: Where are the users of DAI?

Crypto Stablecoin Report 04: Tether additional issued 300 million USDT, commenting on various decentralized stablecoins

Crypto Stablecoin Report 05: DAI Maintains Steady Growth, Exploring Use of DAI by Users of Centralized Exchanges

Crypto Stablecoin Report 06: The latest 13 additional issuances of USDT all occurred on Tron, driving the increase use of Tron

Crypto Stablecoin Report 07: Security Analysis of Stablecoins

Crypto Stablecoin Report 08: Interpretation of Digital Dollar Project

Crypto Stablecoin Report 09: Analyze the lending leverage of Compound

Crypto Stablecoin Report 10: Introduce the Algorithmic Stablecoin Project Terra (Luna)

Crypto Stablecoin Report 11: The circulation of stablecoins has overall increased, Holding AMPL a month for 51 times incomes

Crypto Stablecoin Report 12: USDT is additionally issued 690 million The use of stablecoins outside the cryptocurrency market

Crypto Stablecoin Report 13: The market capitalization of stablecoins reached $14.387 billion, Stablecoin pool Reserve

Crypto Stablecoin Report 14: The increase of Ethereum Gas Fee makes the transfers of stablecoin transactions on the blockchain

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory



PingTalk

4 Ways to Make More Intelligent Workforce Authentication Decisions

Earlier this year, many enterprises saw their once-bustling offices go quiet as company-wide work from home policies went into effect around the world. The shift to remote work has magnified the cracks in the traditional network approach to enterprise security. Once able to make access decisions based on whether employees are in the office or not, organizations can no longer rely on this binary process of workforce authentication decision making when all employees are working off premises.


Smarter with Gartner - IT

Top Actions From Gartner Hype Cycle for Cloud Security, 2020

Cloud computing has proven battle-ready. During COVID-19, cloud demonstrated it can support unplanned and unexpected needs. Organizations may no longer question its utility, but security remains a commonly cited reason for avoiding it.

In reality, the public cloud can be made secure enough for most uses.

“Even for the most reluctant organizations, there are now techniques such as confidential computing that can address lingering concerns,” says Steve Riley, Senior Director Analyst, Gartner. “You can stop worrying about whether you can trust your cloud provider.”

Learn more: About the Gartner Hype Cycle Methodology

Confidential computing — one of 33 technologies on the Gartner Hype Cycle for Cloud Security, 2020 — is a security mechanism that protects code and data from the host system. By making critical information invisible to third parties, including the host, it potentially removes the remaining barrier to cloud adoption for highly regulated businesses in the financial services, insurance and healthcare sectors.

For example, a retailer and a bank could cross-check customer transaction data for potential fraud without giving the other party access to the original data.

While confidential computing is highly useful in theory, it isn’t plug-and-play. Gartner anticipates a five- to 10-year wait before it is in regular use.

Here are three technologies from the Gartner Hype Cycle for Cloud Security, 2020, to action right now.

Secure access service edge 

Secure access service edge (SASE), pronounced “sassy,” supports secure branch office and remote worker access. SASE’s cloud-delivered set of services, including zero trust network access and software-defined WAN, is driving rapid adoption.

Gartner predicts that by 2024, at least 40% of enterprises will have explicit strategies to adopt SASE, up from less than 1% at the end of 2018.

COVID-19 has highlighted the need for business continuity plans that include flexible, anywhere, anytime, secure remote access at scale, even from untrusted devices. SASE enables security teams to deliver secure networking and security services in a consistent way, to support digital business transformation and workforce mobility.

SASE is in the early stages of market development but is being actively marketed by the vendor community, with more than a dozen SASE announcements over the past 12 months.

User advice: “Watch out for slideware, especially from incumbent vendors that are ill-prepared for cloud-based delivery as a service model,” says Riley. “This is a case in which software architecture and implementation matters. True SASE services are cloud-native.”

Read more: Gartner Top 9 Security and Risk Trends for 2020

Cloud security posture management 

It is becoming increasingly complex and time-consuming to answer the critical question “are my public cloud applications and services configured securely?” Even simple misconfiguration issues represent significant risk, as evidenced by several public data disclosures last year.

For enterprises that have a multicloud strategy, cloud security posture management (CSPM) assures business and security leaders that their services are implemented in a secure and compliant way across multiple cloud infrastructure as a service (IaaS) providers.

User advice: “First, investigate your cloud provider’s own risk posture assessment capabilities to see if they will satisfy the requirement, even if they fall short of commercial offerings,” Riley says. “Also check if any products you already have include CSPM capabilities.”
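
For a concrete sense of the kind of question CSPM tooling answers, here is a minimal, hypothetical rule check over resource configurations. The resource fields and rules are invented for illustration; commercial CSPM products run thousands of such checks across providers:

```python
# Minimal sketch of a CSPM-style misconfiguration scan.
# Resource fields and rules are hypothetical, for illustration only.

resources = [
    {"id": "storage-1", "public_access": True,  "encrypted": False},
    {"id": "storage-2", "public_access": False, "encrypted": True},
]

rules = [
    ("public access disabled", lambda r: not r["public_access"]),
    ("encryption at rest enabled", lambda r: r["encrypted"]),
]

# Evaluate every rule against every resource and report violations.
for resource in resources:
    for name, check in rules:
        if not check(resource):
            print(f"FINDING: {resource['id']} fails rule '{name}'")
```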

Cloud access security brokers  

Unlike traditional security products, cloud access security brokers (CASBs) are designed to protect data that’s stored in someone else’s systems. They enable organizations to achieve consistent security policies and governance across many cloud services and demonstrate that cloud use is well-governed.

We recommend seeking one-year contract terms over lengthier ones

The pace of Gartner client inquiry indicates that CASBs are a popular choice for cloud-using organizations. Although Gartner’s latest spending forecast shows slowing growth for all security markets, CASBs’ expected growth remains higher than any other information security market at 33% in 2020. This high-benefit technology has entered the mainstream and the number of vendors has stabilized.

User advice: “Differentiation among vendors is becoming difficult, and several have branched beyond SaaS governance and protection to include other features such as CSPM and user and entity behavior analysis (UEBA). Given continued feature expansion and relative ease of switching, we recommend seeking one-year contract terms over lengthier ones,” says Riley.

Explore now: 5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020

The post Top Actions From Gartner Hype Cycle for Cloud Security, 2020 appeared first on Smarter With Gartner.

Wednesday, 26. August 2020

Global ID

Meet the Team—Erik Westra, head of GlobaliD Labs

Erik Westra, Technical Architect

We have a fun one for you—the inaugural interview of our new Meet the Team series of employee profiles.

Meet the Team is an opportunity for our users and partners to get to know us a little better.

As a remote company with employees around the world—from the U.S. to Slovenia to Spain to New Zealand—it’s also an opportunity for our team to get to know each other a little better (especially with current travel restrictions).

Today, we’re introducing Erik Westra, technical architect at GlobaliD, head of GlobaliD Labs, and avid cyclist.

Not only is Erik working on a ton of super cool projects here at GlobaliD, he also has a unique perspective given his decades-long relationship with our co-founder and CEO Greg Kidd, which, as you’d expect, involves plenty of crazy adventures.

During our chat, Erik covered a range of topics:

- How GlobaliD Labs came to be
- Some of the cool projects Labs is working on right now
- How Erik met Greg back in the ’80s (spoiler alert: payments data, an insane bike ride, and hypothermia)
- Their first company together (which went public on the NASDAQ and had over 6000 employees)
- How a young Jack Dorsey hacked into their systems
- The time Erik saved both their lives (and the essence of their relationship)
- Standing up for their values and being sued for $13 trillion
- Coming full circle and the birth of GlobaliD (and Groups!)

1. Who are you and what do you do at GlobaliD?

My name’s Erik! I’m head of GlobaliD Labs, which is responsible for the experimental aspects of what we’re trying to build. My job involves designing systems, writing proof of concepts, and managing the Labs team. We’re always trying to push the envelope.

GlobaliD Labs exists because the current way we build our system has to be relatively structured to ensure quality code and security — everything needs to work and run smoothly. Like other startups, we employ agile frameworks for development, but we need to balance that with long term planning.

That has huge benefits for the quality of our core product, but sometimes this structured approach can crowd out more experimental ideas.

I like to call myself the unofficial champion of GlobaliD Groups. It’s a feature that’s been central to our long term vision from day one, but for one reason or another, we weren’t able to get it into a production environment. Something else always required higher priority when it came to the core business.

“I like to call myself the unofficial champion of GlobaliD Groups.”

Even though Groups had a long term upside for us — especially when it came to our vision and our mission — we never had time to develop it.

That’s when we decided to spin it off and create GlobaliD Labs, which helps bring balance between our short term development needs with our long term aspirations. Now, we have a separate team dedicated to Groups that’s also exploring other cool ideas.

2. What kind of cool ideas?

We believe that Groups should be able to issue credentials to people. You might get a license to ride a particular mountain bike trail — one of Greg’s favorite things, being a mountain biker. You might get a ticket to a Zoom concert. Or you might get a credential for receiving a vaccine.

It’s an open platform so people can come up with their own ideas on how to use it. You could even set up a group and run your business through it — connecting directly with employees and customers, matching up seekers and providers of goods and services, managing bookings, sending and receiving payments, and of course marketing and promotions — all done through the GlobaliD platform.

“It’s an open platform so people can come up with their own ideas on how to use it. You could even set up a group and run your business through it — all done through the GlobaliD platform.”
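
Conceptually, a group-issued credential is just a claim signed by the group. The sketch below is not GlobaliD's actual format or API; it is a minimal stand-in using a keyed hash (a production system would use public-key signatures so anyone can verify without holding the group's secret):

```python
import hashlib
import hmac
import json

def issue_credential(group_key: bytes, subject: str, claim: str) -> dict:
    """Toy group-issued credential: a claim plus a keyed-hash signature."""
    payload = {"subject": subject, "claim": claim}
    body = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(group_key, body, hashlib.sha256).hexdigest()
    return payload

def verify_credential(group_key: bytes, cred: dict) -> bool:
    """Recompute the signature over the claim and compare in constant time."""
    body = json.dumps({"subject": cred["subject"], "claim": cred["claim"]},
                      sort_keys=True).encode()
    expected = hmac.new(group_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["signature"])

trail_group_key = b"hypothetical-group-secret"
cred = issue_credential(trail_group_key, "rider-42", "licensed: ridge trail")
print(verify_credential(trail_group_key, cred))  # True
```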

Another project we’ve been investing time into is email — what I call the email anonymizer, an idea I find quite cool.

One of the problems today is that whenever you sign up for anything online, you need to provide your email. Do this enough times and your inbox is now full of spam. You’ve essentially lost control of your email address. With the email anonymizer, you can set up a one-time address. That way, you never reveal your private email. You stay in control of your email.

Let’s say you want to sign up for Amazon and they ask for your email. Now, you can create an anonymous address just for Amazon. If you choose to receive those messages, emails sent by Amazon will go to your private address. If you decide that you don’t want to deal with Amazon anymore, you can simply disable the address. Privacy is a fundamental tenet of our company, and this is another way of keeping your identity private.

It’s little things like this that I find quite cool. If both sides are using the anonymizer, you can actually have both parties communicate back and forth without either party knowing the other’s real address.
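
Mechanically, the idea resembles a per-sender alias table. The sketch below is not GlobaliD's implementation, just a minimal Python model (all addresses are invented) of how one-time aliases could map to a private inbox and be switched off:

```python
import secrets

class AliasDirectory:
    """Toy model of an email anonymizer: one disposable alias per sender."""

    def __init__(self, private_address: str):
        self.private_address = private_address
        self.aliases = {}  # alias -> enabled flag

    def create_alias(self, label: str) -> str:
        alias = f"{label}-{secrets.token_hex(4)}@anon.example"
        self.aliases[alias] = True
        return alias

    def disable(self, alias: str) -> None:
        self.aliases[alias] = False

    def route(self, alias: str):
        """Forwarding target for incoming mail, or None if disabled."""
        return self.private_address if self.aliases.get(alias) else None

inbox = AliasDirectory("me@private.example")
shop = inbox.create_alias("amazon")  # hand this out, not your real address
print(inbox.route(shop))             # me@private.example
inbox.disable(shop)                  # done with the sender? switch it off
print(inbox.route(shop))             # None: mail is dropped
```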

That’s just one of the things we’re working on right now. At Labs, we go in a lot of odd directions; there’s a lot of exploration. For every idea that does work out, there will be plenty that fall flat. Maybe they don’t work technically or there’s no business need for it. We’ll learn from those lessons. We’re the mad scientists at GlobaliD.

“At Labs, we go in a lot of odd directions; there’s a lot of exploration. For every idea that does work out, there will be plenty that fall flat. Maybe they don’t work technically or there’s no business need for it. We’ll learn from those lessons. We’re the mad scientists at GlobaliD.”
3. How did you end up in this role?

I met Greg [GlobaliD co-founder and CEO] around 1980 when I was still a young man — a very, very long time ago. I was studying computer science and psychology at university in New Zealand. I was really into human-computer interaction.

Greg came to New Zealand to restructure the banking industry. I was taking a part-time course in the evenings at the polytechnic at the time and word came around that people were looking for students to help with data analysis as a part-time job. I put my hand up.

It was a company called Booz Allen & Hamilton — where Greg was working at the time. They were management consultants working with Data Bank of New Zealand, a centralized check processing company, to help make their operations more efficient. My job was to analyze data using Excel macros, and the results would be displayed in various charts. Excel charts didn’t look professional enough for the presentations they wanted, so we had to draw them by hand in MacDraw.

One day, I was at my desk, busily drawing one of these charts when Greg walks by and notices my bike pump sticking out of my bag.

“Oh, you ride a bike?” he asked. It was more of a statement, really. “We’ll go for a ride this weekend. I’ll pick you up.”

“Okay, cool!”

Or so I thought. I figured it would probably be a three or four hour ride. After all, all we had were our road bikes — real, old-school technology. No problem.

But then Greg got this idea. “Oh, look. There’s this route — it goes over this hill and goes all the way out to the coast. This looks pretty cool. We’ll go that way.”

And so off we go.

That hill ended up being a rather large mountain — mostly rock with severe slopes. You can imagine us going up and down, up and down on these narrow-tired road bikes with our big clunky shoes with cleats on the bottom. I suffered something like six flats going over that mountain range.

Oh yeah, it was mid winter.

By the time we got over to the other side of the range and reached the coastline, it was dusk, and we still had another 70 kilometers to ride back to the car — the only silver lining being that this stretch was on the road again.

But it was dark, and I was getting really cold. At one point, I became hypothermic and I had to lie in a ditch by the side of the road while Greg rode back himself to get the car.

In all, it was a 13-hour bike ride. It was absolutely insane.

That’s how we met each other for the first time.

We’ve been doing crazy things together ever since.

“At one point, I became hypothermic and I had to lie in a ditch by the side of the road while Greg rode back himself to get the car. In all, it was a 13-hour bike ride. It was absolutely insane. That’s how we met each other for the first time. We’ve been doing crazy things together ever since.”
4. On the beginning of a lifelong partnership

Around 1986, Greg called me. “Hey, can you come meet me? I want you to see something.”

I met him at this really old, dingy building in downtown Wellington, New Zealand. People are running in and out with their bicycles. What are these guys doing, I’m wondering to myself.

Apparently, it was an urgent messenger delivery company. The guys running around with bikes were delivering parcels.

“Come look at this,” Greg said. We went up the stairs. “Open this door — that’s where we’re going. But be careful. If the wind blows, we’re in real trouble.”

We sneak in and shut the door behind us. On two wooden supports is this door covered with hundreds of bits of paper.

That’s how they kept track of all the parcels.

Their service could guarantee delivery within 15 minutes from the time of your call — super efficient. For a while, we just stood there and watched it work.

Then Greg looked over at me. “Can you build a computer program that does that?”

That was our first business together. It took us about eight years to build it. But it was very successful. Eventually, it made it to the U.S., and we actually wound up being listed on the NASDAQ. We had about 6,000 employees at one stage. The company was called DMS corporation — Dispatch Management Services.

“That was our first business together. It took us about eight years to build it. But it was very successful. Eventually, it made it to the U.S., and we actually wound up being listed on the NASDAQ. We had about 6,000 employees at one stage. The company was called DMS corporation — Dispatch Management Services.”

We were really lucky. It was our first try at building something together and it succeeded.

5. On Jack Dorsey hacking into their system

We ultimately left DMS and explored various other ventures together. I want to tell you about one because it’s kind of interesting.

We were building a system for last-mile ecommerce delivery. You could order online, and someone would pick it up from the depot and bring it to you within an hour. We needed an admin interface that had to be access-controlled. “Let’s do something quick! Just hardwire it into the code,” Greg said — which was typical for him at the time. Everything was about speed and minimum viable products. So that’s what we did.

Of course, someone broke in and sent Greg a message: “Your security sucks.”

That someone was Jack Dorsey.

“Of course, someone broke [into our system] and sent Greg a message: ‘Your security sucks.’ That someone was Jack Dorsey.”

Greg decided to hire him and we worked together for a while before he went off and founded Twitter.

It was my fault that Jack broke in. In my defense, it’s what Greg told me to do. We’ve taken our security a bit more seriously since then :)

6. I certainly hope so! What’s it like working with Greg?

That’s been the essence of our relationship.

Greg lived in New Zealand for about six years — if I remember correctly.

“We should go for a hike in the mountains,” he said one day.

So off we went.

Along the way, we reached the Broken Axe Pinnacles. The name speaks for itself — it’s pretty insane, this narrow little ledge with maybe a thousand-foot drop on either side plus strong winds. We were creeping along this sliver of a ledge when suddenly it started sloping downhill, steeper and steeper.

“Come on, we gotta get down there!” Greg said.

“Hang on a minute, Greg. Let’s just wait and check.”

We stopped for a moment to check the compass and the map. Sure enough, that path dropped off into a cliff. Had we kept going, we wouldn’t have been able to get back up. We could have died up there.

“We stopped for a moment to check the compass and the map. Sure enough, that path dropped off into a cliff. Had we kept going, we wouldn’t have been able to get back up. We could have died up there.”

That’s one way to view our working relationship — we’ve been working as a team for many years. There’s always a lot of back and forth. Greg would come up with all these mad ideas, and I’d figure out how to make them work.

It’s been like that ever since we met.

7. Public information should be free

We’ve had a number of crazy adventures together. Another one of the companies we worked on, 3Taps, was sued by Craigslist for the nice sum of $13 trillion — an insane, arbitrary number.

[3Taps was an exchange company dedicated to keeping public facts publicly accessible — which included Craigslist listings.]

Greg was steadfast — he believed that public information should be free. If the information is freely available, it should be available to everybody. You can’t make it available to some people and not to others.

“Greg was steadfast — he believed that public information should be free. If the information is freely available, it should be available to everybody. You can’t make it available to some people and not to others.”

[Greg was an executive producer of and contributor to the Aaron Swartz documentary — The Internet’s Own Boy: The Story of Aaron Swartz.]

He fought the case, which went on for several years. It got pretty ugly and I wondered if we’d lose everything. Eventually, they reached a settlement, which Greg saw as a victory: he agreed to pay out $1 million, but to the charity of his choice, the Electronic Frontier Foundation.

After that saga, we were in a little bit of a lull.

Then out of the blue, Greg invited me to join him in Ljubljana, Slovenia. You can guess where this is going.

8. “The early days of GlobaliD”

This was the early days of GlobaliD — the company had only been up and running for a couple of months.

We were there for two weeks, discussing the possibilities of a self-sovereign identity platform complemented with messaging and a wallet.

We came up with this notion of Groups. Having your digital identity is great. Here’s my phone, my wallet — my identity. I’m an individual. I can do what I need to do. But it’s when you have collections of people working together — there’s an emergent quality that’s greater than the sum of our parts.

“We came up with this notion of Groups. Having your digital identity is great. Here’s my phone, my wallet — my identity. I’m an individual. I can do what I need to do. But it’s when you have collections of people working together — there’s an emergent quality that’s greater than the sum of our parts.”

That’s where Groups comes into play. It’s this notion that you can do a lot more with people together than with people individually.

I wrote a bunch of white papers outlining how Groups should work, but it took another three years to get the foundation in place. This is where GlobaliD Labs came in.

I’m going to paraphrase Greg here, but once you put together the three different pieces, identity, messaging, and money — and now with Groups — magic happens.

Most people in our company are relatively young. Greg and I — we’re not spring chickens anymore. We’ve worked on a number of projects over the years. With GlobaliD, we just thought, “Let’s do this.” It’s our last big chance to do something amazing.

This isn’t a job. This is a mission. We are absolutely one thousand percent dedicated to making this succeed.

You might also like:

- GlobaliD App — Introducing Groups, Vouches, and multi-person chat
- GlobaliD messaging is end-to-end encrypted by default
- GlobaliD App — Introducing the Wallet
- GlobaliD App: Introducing SEPA and crypto transfers to your Wallet

Join a growing trusted community and experience how digital identity works for you.

Meet the Team—Erik Westra, head of GlobaliD Labs was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyKey

Announcement: KEY ID Ethereum Logic Contract Module Upgrade

Reason for this upgrade

The KEY ID Ethereum contract has passed a security audit by Trail of Bits (https://www.trailofbits.com/).
Based on the audit findings, the logic contract module has been updated and upgraded, adding ERC1271, NFT, and multi-signature proposal support along with optimizations.

Public disclosure of the contract code

https://github.com/mykeylab/keyid-eth-contracts

Public disclosure of the third-party audit report for this contract

The Trail of Bits audit report is being finalized and will be published once the final version is complete.

AccountLogic original contract address

0x6A3f8fE26c22f6Ec7938ee046a69293F6C692B6F

AccountLogic new contract address

0x205dc661ee6946319ebb0698a017bcc20549910f

DualsigsLogic original contract address

0xB9D2FcBF411DdB9CdFF0A705abD401217221012A

DualsigsLogic new contract address

0x142914e134348e51c5f402baed81810a1f829e7b

DappLogic original contract address

0x847f5AbbA6A36c727eCfF76784eE3648BA868808

DappLogic new contract address

0xF9bb55b6a14ACd32066182f0F5f0296073F5D054

ProposalLogic new contract address

0xdc4A5151C0F29f6deFA09b383d04B95d587fA275

CommonStaticLogic new contract address

0x910119BEE96C7A03Dd2597D4596e88bDf3aff682

Time of upgrade trigger

2020–08–26 24:00 (UTC+8)

Time-Lock effective time

2020–08–26 24:00 (UTC+8)

If you have any questions about this upgrade announcement, you can contact us at service@mykey.org or leave a message on GitHub. Thank you.
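
Readers who want to check the upgrade independently can confirm that contract bytecode is deployed at the new logic addresses. A minimal sketch using web3.py (assuming the v6-style API) follows; the RPC endpoint is a placeholder to replace with your own provider, and a non-empty result only shows that code exists, not that it matches the audited source published on GitHub:

```python
from web3 import Web3  # assumes web3.py v6-style API

# Placeholder endpoint; substitute your own Ethereum RPC provider.
w3 = Web3(Web3.HTTPProvider("https://your-rpc-endpoint.example"))

new_logic_contracts = {
    "AccountLogic":      "0x205dc661ee6946319ebb0698a017bcc20549910f",
    "DualsigsLogic":     "0x142914e134348e51c5f402baed81810a1f829e7b",
    "DappLogic":         "0xF9bb55b6a14ACd32066182f0F5f0296073F5D054",
    "ProposalLogic":     "0xdc4A5151C0F29f6deFA09b383d04B95d587fA275",
    "CommonStaticLogic": "0x910119BEE96C7A03Dd2597D4596e88bDf3aff682",
}

# get_code returns empty bytes for addresses with no deployed contract.
for name, address in new_logic_contracts.items():
    code = w3.eth.get_code(Web3.to_checksum_address(address))
    print(f"{name}: {'code deployed' if len(code) > 0 else 'no code found'}")
```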

About Us

KEY GROUP :https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

Announcement: KEY ID Ethereum Logic Contract Module Upgrade was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology Weekly Report (August 19–25)

This week we celebrated the successful completion of the next-stage development of our open source DID smart contract (did:etho:) on the Ethereum network. Furthermore, we joined forces with Neo and Switcheo to announce the launch of Poly Network, a heterogeneous protocol that will enable cross-chain interoperability while greatly increasing transparency and accessibility.

Back-end

- Completed 20% of Ontology GraphQL interface development

- Layer 2 v0.3 released

Product Development

ONTO v3.2.0

- ONTO organized a special initiative in August geared towards encouraging users to collect NFT medals. Users who tried the latest credential templates function are eligible to collect the limited NFT “Explorer” and “Supporter” medals.

ONTO v3.3.0

- Completed 50% of development

dApp

- 70 dApps live on Ontology

- 6,051,367 dApp-related transactions since genesis block

- 4,425 dApp-related transactions in the past week

Bounty Program

- Currently recruiting SDK community developers

- 1 new application for SDK bounty

Community Growth

- We onboarded 79 new members across Ontology’s Filipino, Dutch, and German communities.

Newly Released

- Ontology successfully completed the development of the next stage of its open-source DID smart contract on the Ethereum network. The new smart contract method did:etho:, brought by Ontology’s tech and product teams, can be used across the Ethereum network and within a range of popular DeFi applications, as well as across most chains that run the EVM (see the DID syntax sketch after this list).

- Ontology, Neo, and Switcheo jointly announced the launch of Poly Network, a heterogeneous interoperability protocol alliance. Poly Network will permit cross-chain interoperability, greatly increasing transparency and accessibility. Enterprises leveraging different blockchain infrastructure can connect to Poly Network, and collaborate and interact with each other through an open, transparent admission mechanism.
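
For context, W3C Decentralized Identifiers follow the three-part syntax did:<method>:<method-specific-id>, so a did:etho: identifier carries its method-specific value after the second colon. Below is a tiny, purely illustrative parser; the example identifier is invented, not an official did:etho document:

```python
def parse_did(did: str):
    """Split a DID into (method, method-specific id) per the
    did:<method>:<id> syntax. Raises ValueError on malformed input."""
    parts = did.split(":", 2)
    if len(parts) != 3 or parts[0] != "did" or not parts[1] or not parts[2]:
        raise ValueError(f"not a valid DID: {did!r}")
    return parts[1], parts[2]

# Invented example identifier for illustration only.
method, ident = parse_did("did:etho:00000000000000000000000000000000000000aa")
print(method)  # etho
print(ident)   # 00000000000000000000000000000000000000aa
```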

Global Events

- John Izaguirre, Ontology’s Europe Ecosystem Lead, was invited to a panel at Indonesia Blockchain Week 2020 (IBW2020) to exchange insights on DeFi projects with other project founders and executives of Tron, Binance and Aave. John said, “The DeFi projects available in the market have yet to demonstrate full reliability, and barriers remain between DeFi projects and traditional financial projects. Ontology’s DID solutions are specifically tailored to tackle these problems as users’ assets and their credit levels can be integrated into the OScore system, a new capability that will greatly benefit the industry as a whole.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (August 19–25) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 25. August 2020

KuppingerCole

The 3 Steps to Secure IAM Modernization

When organizations modernize their Identity and Access Management (IAM), they have three fundamental requirements: an understanding of current capabilities, a migration strategy to transform the IAM infrastructure and finally, the staff with the expertise to execute the plan. The challenges on the way from legacy IAM to a modern IAM infrastructure are manifold and should be considered beforehand.




Forgerock Blog

Fireside Chat With Former Australian Prime Minister Malcolm Turnbull 

The Role of Cyber Security & Digital Identity in the Modern Economy 

While the connection between cyber security and Identity and Access Management continues to strengthen across Australia, it has become a keen area of focus for one of the country’s most influential leaders. I recently sat down with Malcolm Turnbull, the 29th Prime Minister of Australia, and ForgeRock Managing Director for Australia & New Zealand, James Ross, to discuss the opportunities and trends that are driving innovation and investment in the region.  

Solving complex security challenges

Speaking to a virtual audience of ForgeRock customers and identity professionals from around the globe, Mr. Turnbull shared his views about the risks associated with handling cyber threats reactively. Globally, organizations are investing billions of dollars to reactively combat threats that can be delivered, in some cases, by a single skilled individual with an internet connection. The threat increases, of course, with the investments that can be made by cybercriminals and foreign governments. The result could be widespread disruption to our society.

The rationale behind Mr. Turnbull’s 2016 Australian National Cyber Security Strategy - which established the country’s first formalized approach to cyber security and saw an investment of A$230 million across 33 initiatives and the establishment of the national Cyber Security Centre - was to recognize the seriousness of cyber threats, estimated to be costing the Australian economy A$7 billion per year, and to offer a strategy for getting on the front foot in combating them.

ForgeRock’s approach complements this thinking. So much of our work with customers and partners is to raise awareness that usernames and passwords are not sufficient. By bringing in behavior-based authentication and biometrics, organizations can quickly eliminate a major point of exploitation by hackers while simultaneously opening up opportunities to improve the customer and employee experience by providing users with a passwordless authentication option that is even more secure. Digital identity is now a critical part of overall cyber security planning and execution.

“Identity is trust and trust is identity"

People want to feel safe and secure when they engage with brands and services online. They expect their bank, healthcare provider, or favorite online retailer to keep their personal data protected from harm. No wonder, then, that the tension so many organizations are grappling with is how to make consumer experiences easy while putting the right security controls in place to assure people that their data is being well looked after. The ability to log in using a single credential and get access to multiple services (through single sign-on, or SSO) eliminates the burden of remembering multiple user profiles and passwords.

What is clear is that people are happy to hand over sensitive information to trusted brands. In turn, that personal information is used to authenticate individuals and provide access to services. As Mr. Turnbull highlighted, “Identity is trust and trust is identity.” No wonder then that, alongside Australia’s National Cyber Security Strategy, Mr. Turnbull was behind the establishment of the country’s Digital Transformation Agency (DTA), which explored, among other things, identity and access management. Out of the DTA has come Australia’s MyGov and MyHealthRecord systems that make government digital for citizens - both of which are built on the premise that authenticated identity is the cornerstone to streamlined access to services.

Powering new opportunities

Of course, organizations that use identity to create secure, streamlined digital experiences will find themselves ahead of competitors. Personalizing services for customers or citizens makes life easier, allows people to self-manage their accounts and services, and has a positive impact on an organization’s bottom line.

In Australia, this opportunity is being driven by ambitious government initiatives. Following the UK’s Open Banking program, Australia launched the Consumer Data Right (CDR) regulation, which will enable consumers to more easily transfer their personal information to competing companies. Aside from empowering people to own and use their personal data to comparison shop, the CDR will also enable vendors and third parties to securely access the personal information stored by banks, powering further innovation and customisation of services. Identity management solutions are a critical component in the success of the legislated CDR roll out.

The role of identity management is being further thrust into the spotlight by the COVID-19 lockdown, which has sparked a widespread uptick in demand for digital access. In Australia, national retailer Woolworths reported a 320 percent increase in app use, while ANZ Bank saw a five-fold increase in the use of its digital channels. These are just a couple of relevant examples.

As Mr. Turnbull observed, digital is becoming the battleground for brands and organizations, and for the public sector looking to provide a better, more secure experience to its citizens. As he stressed during our discussion, “If we crack the digital identity nut, a lot of the issues we are facing will be mitigated.”

You can watch the replay of our virtual fireside chat with Mr. Turnbull here.


Global ID

The GiD Report#124 — Fortnite’s philosophical crusade for a more open, interoperable future

The GiD Report#124 — Fortnite’s philosophical crusade for a more open, interoperable future

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

What we have for you this week:

- Epic and all the other huge companies fighting Apple (now a $2 trillion company)
- This isn’t Epic’s first rodeo — like the time they took on Valve (and everyone won — including Valve)
- Epic’s founder Tim Sweeney has been criticizing tech monopolies for years. Why he believes the future will be less centralized/closed and more distributed/open/interoperable.
- Tim Sweeney’s also a fan of blockchain.
- Greg on Oculus: “The Oculus move (under the direction of Facebook) towards a single, unified, and somewhat portable identity, bespeaks the future to come.”
- This week on TikTok/WeChat (spoiler: this is what Facebook has been lobbying for)
- A new era of the Federal Reserve
- Race for central bank digital currencies heats up
- Stuff happens

1. What’s really remarkable about the moment we’re in is just how suddenly the tides can shift — economically, culturally, and technologically. It speaks to where we are in the cycle. I’m, of course, talking about Apple: the backlash against the App Store, which began as a trickle, now feels like a tsunami.

Axios:

A steady drip of criticism over Apple’s App Store policies has become a torrent, as even other tech giants feel emboldened to pile on — but Apple’s path to satisfying its critics is uncertain.
Why it matters: Apple’s policies aren’t that different from those governing other digital marketplaces, but its size and inflexibility could fuel regulatory action from antitrust authorities in the U.S. and beyond.
The big picture: The criticisms of the App Store are many and varied, but most revolve around the 30% cut that Apple takes on sales of apps and other digital goods, with very limited workarounds.
Driving the news: Microsoft Sunday became the latest company to take aim at Apple, warning in a court filing that a move to restrict Epic Games’ access to developer tools could impact lots of games from other developers that rely on Epic’s Unreal game engine.

Hey, a new email app from the creators of Basecamp, should get some credit for sparking the latest wave of protest.

The thing is, we’ve seen that story before — some small fry developer complaining about big ‘ol Apple and their tyrannical policies. But Apple always seemed untouchable. They would ignore the criticism and continue on their merry way, never conceding any ground.

Moreover, Apple had some fair arguments in their defense. After all, in building the App Store from scratch, Apple created a new paradigm, unleashing immense value to the ecosystem. It made sense that Apple took a cut — they deserved credit for their work. It also made sense that they’d moderate the App Store — in itself a way to create more value in a safe, curated environment.

And as it created value for the ecosystem, it also created tremendous value for Apple, which just hit a market cap of $2 trillion:

Apple made Wall Street history when its 2020 stock surge pushed its market value over $2 trillion, a first for a U.S. company, Bloomberg reports.

“While it took Apple 38 years to reach its first $1 trillion in value, the next trillion only took two years.”

Apple shares have more than doubled since March.

Crazy stat: Apple, Microsoft, Amazon, Facebook and Google’s parent company, Alphabet, account for 23% of the S&P 500’s entire value.

Saudi Aramco in December became the world’s first company worth $2 trillion, but now trails Apple at $1.8 trillion.

As Apple has grown, so has the App Store and the nature of our relationship with our apps. Apps do a lot more now. Our expectations for what we want them to do have also grown. You could argue that Apple’s model for how it manages and charges for the App Store hasn’t evolved with the times.

Instead, most of their decisions seem to leverage their position, expanding power over the ecosystem across more touchpoints, demanding control of essential features and services such as identity and payments.

So it was only a matter of time before companies got tired of paying these exorbitant commissions to the most valuable company on the planet.

Because small fries, these are not:

Epic itself, locked in a battle over the 30% commission, which wants a court to restore Fortnite to the App Store and stop Apple from cutting off its developer access. (Apple defended itself Friday, likening Epic to a shoplifter and insisting Epic’s issues were a crisis of its own making.)
Automattic, blocked from updating its WordPress app for not letting users buy .com domains through Apple’s in-app purchase system, despite not letting users buy such domains directly in the WordPress app at all. (Apple clarified its position and apologized for any confusion over the weekend.)
Plus, a group of news publishers asking for better App Store terms; Facebook, which suggests Apple’s terms will hurt small businesses; and Airbnb, which reportedly talked with House antitrust enforcers after Apple asked for a commission on Airbnb’s recently launched online experiences.

Spotify, for instance, says Apple gives itself a competitive edge by keeping all Apple Music subscription revenue while eating into that generated by other streaming services. EU antitrust regulators are now probing the App Store, in part due to Spotify’s complaints.
Match Group says Apple puts paid services at a disadvantage against ad-funded rivals that don’t have to give Apple a cut of their revenue.

For what it’s worth, here’s Apple’s take on the situation, via Matt Stoller:

During the Congressional hearings over the market power of large technology firms, Apple CEO Tim Cook insisted that Apple had little market power over mobile apps, because developers and consumers could always switch over to different types of phones or platforms on which to create software. Epic’s attempt to restructure terms with Apple is a great test case for Cook’s argument. One would expect, based on Cook’s views, that developers of popular apps have leverage against Apple, if Apple had little market power over app stores. Certainly, Epic’s Fortnite is popular, a massive multi-billion dollar game, as close to a must-have app as possible.

The problem, of course, is that I don’t want to give up my iPhone, and Apple knows this. Moreover, Google’s Play Store has more or less copied Apple’s model regarding commission and payment options.

So it’s a sticky situation for developers and users alike. They don’t really have much in the way of options. (Epic has a parallel suit with Google):

Yes, but: Apple’s rules are slightly stricter, but not all that different from those on other digital marketplaces, including those run by Amazon, Google and Microsoft.

The primary difference between iOS and Android, however, is that on Android, Epic is still able to provide its app directly to users, bypassing the Play Store.

Still, this will likely be one for the regulators:

The bottom line: Expect these arguments to come from more companies in more settings, with courts and regulators likely having to decide if there’s an antitrust problem at play.

Related:

- Apple reaches $2 trillion market value as tech fortunes soar
- Apple’s China Loopholes Are Starting to Close
- Big Tech’s Domination of Business Reaches New Heights
- Most Americans think social media platforms censor political viewpoints
- @FortniteGame

2. We can’t talk about the App Store controversy without talking about Epic Games, which is a super cool story in and of itself. First off, it’s not their first rodeo. They also took on Valve, which owns the Steam gaming platform.

For those who aren’t familiar with the gaming space, Steam is essentially the Apple App Store of games:

Countless challengers had tried and failed to overcome Valve’s primacy. Many carved niches for themselves, like GOG with DRM-free and classic games. But for the biggest PC releases? Why buy elsewhere if you already owned a few dozen (or few hundred) games through Steam? You wouldn’t.

Did Epic care? Not so much. They went after Valve with the same kind of swagger that they’re swinging around to take on Apple.

One interesting takeaway — without getting too much into the landscape of gaming — is that many believed that this was actually beneficial to Steam over the last two years. Funny how healthy competition works, huh?

A year ago, I wrote these words: “Epic Games is creating a Steam rival and Valve should be scared.” And for good reason: The Epic Games Store debuted December 6, 2018 and upended the PC gaming market. Valve, once unassailable, suddenly seemed very vulnerable. Flush with Fortnite money, Epic provided the first real competition to Steam since its inception 15 years prior. What company wouldn’t worry?
If I could amend that headline now though, I’d instead write “Epic Games is creating a Steam rival and Valve should be grateful.”

We can talk about the results a little bit, though — and you’ll soon see why. Here’s Matt Stoller:

I suspect that a legal approach was not what Epic wanted to do. Epic has experience using the marketplace to address monopoly power; a few years ago, Epic took on Valve, which owned a monopoly game store, Steam, by launching its rival Epic Games Store. And Epic has used Fortnite’s draw to force interoperability among PC, Mac, Xbox One, PS4 and mobile platforms.

It’s easy to get why this would be huge for users (gamers). Why was it great for Valve? Well, it forced them to up their game — literally. No longer able to rely on their platform dominance, they decided to improve their actual products. And they did exactly that. Steam, despite aggressive competition from Epic, is now better than ever.

Talk about a win-win-win situation. Goliath won, but so did David. And then users reaped the benefits.

But Epic’s battle with Apple will be fundamentally different in nature simply because of Apple’s complete dominance in their arena. It’s why Epic has taken the legal route — a strategy Greg has discussed and taken himself:

But because Apple’s monopoly power over the iPhone is so total, Epic chose to use the legal and political system to make an aggressive set of claims that, if accepted by a judge, would essentially destroy Apple’s control over the app ecosystem on the iPhone.

And according to Matt, Epic has a strong case (as well as a legendary attack dog):

These claims include the argument that Apple has tied its payments system to its app store, which is a fairly strong legal argument that judges have often (though not always) upheld. It also included an argument that the app store is what is called an ‘essential facility,’ and that Apple as a monopolist in control of such a facility has to share it with competitors or customers on reasonable terms. Judges have generally not upheld this kind of claim. So Epic is not only trying to take its dispute with Apple to the courts, it is also seeking to overturn court precedent, and encourage Congress to write statute overturning judge-made law.
As its head lawyer, Epic hired antitrust royalty, Christine Varney, who served as the Assistant Antitrust Attorney General under Obama, and before that worked for Netscape during the Microsoft battle in the 1990s. The suit draws heavily from the House Antitrust Subcommittee investigation and hearing, with multiple footnotes citing information unearthed by investigators, as well as a quote from Rep. Hank Johnson.

Let’s be clear, Epic is certainly on the attack. And as Matt notes, these are indeed peculiar times:

Typically large businesses reserve their political capital to limit government enforcers or regulators, but something strange started happening a few years ago, and it’s continuing today with this suit. Businesses began battling each other, and some started asking for an expansion of public power to structure markets. This suit is the latest on that score, the escalation of an ideological civil war within the business world, an attempt to undo deep changes in our society implemented forty years ago.

Hard to take your eyes off this one. Grab some popcorn.

Related:

- Epic Games Kicks Off the Civil War in American Business
- Epic Games Seeks to Form Coalition of Apple Critics
- ‘Fortnite’ Creator Says Apple Is Threatening to Curb Access to Software Tools

3. We also can’t talk about Epic without talking about its founder, Tim Sweeney. Because for Tim, this isn’t simply a battle for money — although it clearly is. But Tim is waging a philosophical crusade in order to create a digital world that’s less closed and centralized and more distributed and open and interoperable. (Sound familiar?)

Here are some highlights from a fantastic VentureBeat interview:

We talked about his vision for the Metaverse, the virtual world envisioned by Neal Stephenson in his 1992 novel Snow Crash. Sweeney thinks it’s possible to build such a world and the digital economy that goes with it, but he believes it shouldn’t be owned by a single company. He has been outspoken about ensuring openness in gaming in the past couple of years.

For one, he’s an outspoken critic of tech monopolists (which incidentally, came after a deep chat about blockchain and crypto):

GamesBeat: It’s a tool to pull power away from centralized authority, which is something you’ve worried about. Platform owners have proprietary technology.
Sweeney: And they use it to exert arbitrary control. They tilt the market in their favor. They censor people whose political views they disagree with. It’s a big and growing problem, the amount of power possessed by Google and Facebook. President Eisenhower said it about the military-industrial complex. They pose a grave threat to our democracy.

He talks about games and virtual worlds like we talk about GlobaliD Groups:

Sweeney: There’s real value in that. If you look at why people are paid to do things, it’s because they’re creating a good or delivering a service that’s valuable to somebody. There’s just as much potential for that in these virtual environments as there is in the real world. If, by playing a game or doing something in a virtual world, you’re making someone else’s life better, then you can be paid for that.
We need to start rethinking the way we structure these large-scale game economies, especially as they get bigger and more complex. They should not simply be a means for the developer to suck money out of the users. It should be a bi-directional thing where users participate. Some pay, some sell, some buy, and there’s a real economy.
If you look at what Valve has done with CS: GO and other games, they have a vibrant economy. Lots of people earn a living in those games. When you buy something, you’re not just buying it from Valve. You’re often buying it on an open marketplace from some other seller. It’s a very complex economy. There’s a lot of potential in that. We need to move toward a model like that in order to scale up to something like the metaverse. As long as it’s just the machination of a single company, there’s not going to be this bidirectional exchange of value. We need to keep that circulating.

He believes we’re on the cusp of a new chapter for our digital world — one that’s less centralized and closed, and increasingly distributed and open:

Sweeney: It’s really popular. That’s not even a closed economy. You’re not actually getting any incremental value for these tips. Imagine if you actually got additional benefits for doing that — recognition or power or something like that. I think that could go much further than it’s gone before.
These technology revolutions come in waves. The social network could have developed 10 years earlier than it did. You just needed a generation of people that were okay with putting all their personal information into these services. The limiting factor wasn’t technology. It was social expectations.
Now we’re approaching 10 years of the blockchain. As these technologies become more widely accepted, we’ll see a whole new generation of distributed systems, which really changes the game — away from closed to open. Instead of building a business that extracts money from a service, you build an economy in which everybody can be rewarded for participating in many different ways.

Tim’s been talking about this stuff for years. He’s been in the industry for decades — he started programming BASIC on an IBM PC at the age of 11.

What’s super cool, though, is that he’s putting his money where his mouth is. He’s on the frontlines, battling for a digital world that he truly believes in.

4. BONUS: What Tim Sweeney thinks about blockchain stuff.

Let’s just say, he’s a fan:

Sweeney: There are some incredibly powerful ideas there. I feel like they’ve only begun to be developed. There’s Bitcoin, but the more interesting one is Ethereum, which runs general programs in the blockchain. Instead of mining coins by verifying transactions, miners are running programs specified in the blockchain and computing the results of those. They can be anything from programs to effect a single transaction, exchanging money for a good, or they can be entire distributed autonomous organizations that execute governance rules designed by shareholders and can be extended over time by majority vote. You can implement these arbitrarily complex things.
You come to the realization that the blockchain is really a general mechanism for running programs, storing data, and verifiably carrying out transactions. It’s a superset of everything that exists in computing. We’ll eventually come to look at it as a computer that’s distributed and runs a billion times faster than the computer we have on our desktops, because it’s the combination of everyone’s computer.

5. Digital worlds, digital identities. Facebook is merging Oculus identity with Facebook’s.

/gregkidd:

The Oculus move (under the direction of Facebook) towards a single, unified, and somewhat portable identity, bespeaks the future to come.

Related:

- A Single Way to Log Into Oculus and Unlock Social Features
- Instagram launches QR codes globally, letting people open a profile from any camera app

6. Some quickies on TikTok and WeChat: Mark Zuckerberg lobbied hard for a TikTok ban, and Trump is privately telling businesses they can still use WeChat.

WSJ on Mark:

When Facebook Inc. Chief Executive Mark Zuckerberg delivered a speech about freedom of expression in Washington, D.C., last fall, there was also another agenda: to raise the alarm about the threat from Chinese tech companies and, more specifically, the popular video-sharing app TikTok.
Tucked into the speech was a line pointing to Facebook’s rising rival: Mr. Zuckerberg told Georgetown students that TikTok doesn’t share Facebook’s commitment to freedom of expression, and represents a risk to American values and technological supremacy.
That was a message Mr. Zuckerberg hammered behind the scenes in meetings with officials and lawmakers during the October trip and a separate visit to Washington weeks earlier, according to people familiar with the matter.
In a private dinner at the White House in late October, Mr. Zuckerberg made the case to President Trump that the rise of Chinese internet companies threatens American business, and should be a bigger concern than reining in Facebook, some of the people said.
Mr. Zuckerberg discussed TikTok specifically in meetings with several senators, according to people familiar with the meetings. In late October, Sen. Tom Cotton (R., Ark.) — who met with Mr. Zuckerberg in September — and Sen. Chuck Schumer (D., N.Y.) wrote a letter to intelligence officials demanding an inquiry into TikTok. The government began a national-security review of the company soon after, and by the spring, Mr. Trump began threatening to ban the app entirely. This month he signed an executive order demanding that TikTok’s Chinese owner, ByteDance Ltd., divest itself of its U.S. operations.
Few tech companies have as much to gain as Facebook from TikTok’s travails, and the social-media giant has taken an active role in raising concerns about the popular app and its Chinese owners.

A touch of irony here given the influence Facebook had in making TikTok a raving success, as we discussed a couple weeks ago.

On the WeChat front, there’s now a lawsuit. WSJ:

The lawsuit, filed in U.S. District Court in San Francisco, claims the executive order is unconstitutional. It was filed by U.S. WeChat Users Alliance, a nonprofit organization, as well as other plaintiffs including a small business and several individuals.
The plaintiffs aren’t connected with the popular app or its Chinese-based parent, Tencent Holdings Ltd., representatives said. They said the user alliance was formed by people who depend on WeChat in their business and personal lives.

(TikTok is also filing a lawsuit.)

In classic Trump fashion, the president is playing both sides:

The Trump administration is privately seeking to reassure U.S. companies including Apple Inc. that they can still do business with the WeChat messaging app in China, according to several people familiar with the matter, two weeks after President Donald Trump ordered a U.S. ban on the Chinese-owned service.
In recent days, senior administration officials have been reaching out to some companies, realizing that the impact of an all-out ban on the popular app, owned by China’s Tencent Holdings Ltd., could be devastating for U.S. technology, retail, gaming, telecommunications and other industries, people familiar with the discussions said.

Just another day, I guess.

Related:

- Facebook Faces Hate-Speech Questioning by Indian Lawmakers After Journal Article
- Mark Zuckerberg Questioned Under Oath in F.T.C. Antitrust Inquiry
- TikTok Poised for Deal to Avoid Millions in U.S. Privacy Damages
- Oracle enters race to buy TikTok’s US operations

7. Meanwhile, in the dollar-denominated world. Here’s a quick update from the latest Fed policy meeting:

Minutes from the Fed’s July policy meeting were released Wednesday and policymakers’ dour outlook suggests that more easing and stimulus could be on the way, strategists who closely watch the central bank say.

Why it matters: More liquidity from the Fed could mean more gains for stock and bond prices and further erosion of the dollar.

Background: The Fed has tapered off its quantitative easing bond-buying program and additions to its balance sheet in recent months as credit markets have smoothed and the S&P 500 has risen back to record highs.

But worries about a languishing economy, a lack of action from Congress and rising long-dated bond yields could spur action.

What they said: “Noting the increase in uncertainty about the economic outlook over the intermeeting period, several participants suggested that additional accommodation could be required,” the minutes noted.

Visually, kind of similar to Apple’s chart — but with very different implications.

8. On the other side of the (digital) coin, the race is heating up.

Axios:

China is moving ahead “rapidly” with its version of a central bank-issued digital currency and the Fed looks to be prioritizing development and moving forward with urgency to produce one in the U.S. as well.
Why it matters: Digital currencies would provide a number of new policy tools to help stimulate the economy, including allowing Congress to send money more quickly and efficiently to Americans or facilitating direct transmissions from the Fed to consumers.
What’s happening: Spending patterns by the unemployed and others who received funds through the CARES Act and private company progress on digital currencies “has intensified calls for [central bank digital currencies] CBDCs to maintain the sovereign currency as the anchor of the nation’s payment systems,” Fed governor Lael Brainard said in a speech earlier this month. “Moreover, China has moved ahead rapidly on its version of a CBDC.”

Watch this space: Separate House and Senate bills have emerged this year proposing the creation of digital currencies.

The intrigue: A digital currency could also help the Fed implement monetary policy by setting interest rates on consumers’ accounts holding the digital currency — analysts say this also could be a more efficient way for the Fed to institute negative interest rates to boost consumption and inflation.
Between the lines: China already is far ahead of the U.S. and much of the developed world in mobile payments, and a new digital currency could provide another avenue to challenge the dollar’s supremacy as the world’s funding currency.

What’s next: The Fed could provide some hints on the next steps in developing a digital currency at its annual Jackson Hole Symposium, which is set to kick off virtually on Thursday.

This new “Cold War” with China — whatever you want to call it — may very well have a silver lining. Sure, there’s going to be plenty of disruption — as we’re already seeing with the trade war, Huawei, TikTok, etc.

But it’s also creating real urgency for companies, central banks, and governments to deliver the goods. There’s nowhere to hide and everyone’s looking over their shoulder. That’s a tense environment, but it also portends potential forward progress.

Maybe a little competition is exactly what we need.

Via /pstav — https://twitter.com/MariaShen/status/1296225281779220480:
“China is starting to test the Digital RMB and educate its citizens. 🇨🇳 This video is circulating WeChat: China’s digital currency in action w/ the Agricultural Bank of China”
New Digital Realities; New Oversight Solutions | Shorenstein Center
A Global Look at Central Bank Digital Currencies | Full Research Report — The Block
The OCC’s Crypto Custody Letter Was Years in the Making — CoinDesk
Bond trading: technology finally disrupts a $50tn market
JPMorgan report reveals ‘dramatic’ Covid shift to electronic bond trading
APIs Will Decentralize CBDCs — CoinDesk

9. Stuff happens:

Exclusive: Joe Biden unlikely to push carbon tax as part of climate change plan
Via /jvs — Taking the Sovrin Foundation to a Higher Level: Introducing SSI as a Universal Service — Sovrin
A world divided into “cans” and “cannots”
Via /pstav — America’s College Towns Are Facing an Economic Reckoning
Dan Primack — Filecoin drama

The GiD Report#124 — Fortnite’s philosophical crusade for a more open, interoperable future was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

The Role of AI in Modern Business Intelligence Platforms

by Anne Bailey

Business intelligence (BI) platforms have become essential for enterprises with an exponential increase in data generation and usage. The next generation of BI platforms will expand on the volume of data analysis and the degree of control a business user has over the process. This Leadership Brief presents the several roles that artificial intelligence plays in BI, and what your organization can do to take full advantage of it.


Privacy and Consent Management

by Anne Bailey

This report provides an overview of the market for Privacy and Consent Management platforms and provides you with a compass to help you to find the solution that best meets your needs. We examine the market segment, vendor service functionality, relative market share, and innovative approaches to providing solutions that enable you to collect and manage consent in a compliant and privacy-centric manner.


SELFKEY

Undercollateralized Loans: The Future of DeFi?

In this article we discuss the latest service offering from the DeFi world, undercollateralized loans, and how it is touted as the future of DeFi.

The post Undercollateralized Loans: The Future of DeFi? appeared first on SelfKey.

Monday, 24. August 2020

Authenteq

Anti-spoofing tips and tools to keep you, and your company, safe

Those that work in the digital privacy space may feel that they aren’t at risk like other companies or individuals. We here at Authenteq are not those people. We know that protecting one’s privacy online extends far beyond just safeguarding the identity of our customers, we take the security of our own digital identities, and our shared systems, seriously. As the saying goes, sometimes you have to eat your own dog food. So, in the spirit of transparency, we thought we’d share how we handle things here to help anyone that may be looking for ways to better protect themselves.

The Cost of Spoofs

No one who uses the internet – and really, is there anyone who doesn’t? – is immune to cyberattacks. A simple check on sites like Have I Been Pwned can show you all the compromised sites where your email address has been used. Last year, a hacker gained access to Capital One’s systems, affecting 100 million Americans and 6 million Canadians and costing the company upwards of $100 million. Phishing and spoofing attacks have risen steadily every year, and according to Agari’s H2 2020 Email Fraud & Identity Deception Trends Report, 80 percent of FTSE 100 companies are vulnerable to brand impersonation online. TL;DR: anti-spoofing efforts should matter to everyone.

Email and Phone

Requesting some pretty basic information may seem obvious, but there are easy-to-implement steps for protecting your employees and customers that can go a long way.

At Authenteq, we’ve changed fields on our website so that only someone with a professional, or corporate, email address can schedule a demo. That means no Hotmail, no Gmail, no personal emails at all. It’s a good first step in preventing bad actors from entering your system in a worst-case scenario, but it’s also a way to prevent low-quality leads from taking up time from your workforce by flooding support with questions, booking demos of products they don’t intend to purchase or license, and generally wasting time and resources. If one of these email addresses does make its way into your system, flag it as spam and block it immediately.
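
As a rough sketch of what such a gate can look like in practice, a form handler might screen addresses before a demo request is accepted. The domain blocklist and function below are illustrative assumptions, not Authenteq’s actual implementation:

    # Minimal sketch of a corporate-email-only gate for a demo form.
    # The blocklist is illustrative, not exhaustive.
    FREE_MAIL_DOMAINS = {"gmail.com", "hotmail.com", "yahoo.com", "outlook.com"}

    def is_corporate_email(address: str) -> bool:
        """Accept only syntactically plausible, non-free-mail addresses."""
        local, sep, domain = address.strip().lower().rpartition("@")
        if not local or sep != "@" or "." not in domain:
            return False  # not a usable email address at all
        return domain not in FREE_MAIL_DOMAINS

    # Example: reject a personal address, accept a corporate one.
    assert not is_corporate_email("alice@gmail.com")
    assert is_corporate_email("alice@example-corp.com")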

Regular Training 

Keep your staff up to date by offering regular training around anti-spoofing measures. Just like any facet of life, trends come and go, and so it’s important to know the latest. When new scams emerge, ensure that you have measures in place to protect yourselves. Your employees should also be aware of any new scams in case customers get targeted and turn to their sales or account representative looking for guidance or assurance.

COVID, for example, brought a new wave of scams, particularly around robocalls, as a flood of new financial assistance funds were released. People were stuck at home, either by their own choice or by their government’s insistence, which meant companies or government agencies suddenly reaching out by phone didn’t feel at all out of place.

Furthermore, your employees themselves need to make sure they are screening their own inboxes for suspicious emails, know not to share passwords or private information via email, and always back up their work. All it takes is one employee opening a phishing email and your entire operation could be compromised. If an employee gets an email from their CEO that seems off, they should feel empowered to question it instead of opening it and clicking suspicious links.

Two-Factor Authentication

As companies grow, the security measures that worked for a team of, say, 10 employees will not work for a team of 100. One thing companies often fail to implement is increased security as this growth happens. Having to go back and adjust things retroactively is much harder than putting what may feel like an overly rigorous system in place before the need is felt. Two-factor authentication on emails, on sites where internal documents are kept like Jira or GitHub, and even on shared social media accounts is a great best practice to set up. At Authenteq, we require employees to set up two-factor authentication for our internal emails, and they must reverify their authentication every time a new device is used or if they’re in a new location.

In addition to 2FA, we use 1Password, a password manager, for our shared accounts. We also have three different VPNs. All employees’ pay slips are encrypted. And for physical onsite security, the Nuki smart lock and app is the only way to gain access inside the office, and we keep employee records, confidential documents, and any expensive hardware in a safe.
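
For teams adding a second factor to their own applications, the core of time-based one-time passwords (TOTP) is small. Below is a minimal sketch using the open-source pyotp library; the account names are placeholders, and a real deployment would provision and store per-user secrets securely rather than generating them inline:

    import pyotp

    # Each user gets a random base32 secret at enrollment, typically
    # delivered as a QR code that an authenticator app scans.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)
    uri = totp.provisioning_uri(name="alice@example-corp.com",
                                issuer_name="ExampleCorp")  # render as QR

    # At login, compare the user's six-digit code against the current window.
    user_code = totp.now()         # stand-in for the code the user types
    print(totp.verify(user_code))  # True within the 30-second window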

Planning

Having a plan in place to respond to any worst-case scenario is a good idea for some and crucial for others. Even if it’s not on your “need to have”, we can’t think of any situation in which this wouldn’t be a “nice to have.” If a hacker breached your internal IT system, consider what that would mean, what and who would be affected and most at risk, and the people on your team who could respond.

Liveness Checks 

Integrating a fully automated KYC solution into your onboarding process is the ultimate tactic to prevent spoofers from sneaking in, and identity verification tools with an accompanying liveness check leave almost no room for them.

Liveness checks can also prevent deepfakes, the latest in spoofing threats, from entering your system. Using neural network-powered artificial intelligence and machine learning, images and video footage can be fed into algorithms to create a “mask”, replacing or manipulating the original faces with lookalike data. Deepfakes do pose a threat to identity theft online, but they can also be prevented with sophisticated KYC tools and liveness checks.

If you’re looking to take your anti-spoofing efforts to the next level, schedule a demo with our experts today. 

The post Anti-spoofing tips and tools to keep you, and your company, safe appeared first on Identity Verification & KYC | Authenteq.


Ontology

In The Fast Lane: Ontology Accelerates Decentralized Identity for the Automotive Industry

Ontology Launches New Decentralized Identity For the Automotive Industry

Ontology is focused on building trust in the automotive industry through the development of a blockchain-based decentralized digital identity solution that allows for driving data to be recorded, stored, and verified in a safe and secure way, offering a range of benefits to drivers and the wider automotive industry.

How does it work? Meet Daphne, and see how her driving experience is made easier with Ontology

Ontology Records Daphne’s Driving Data Making Her Driving Experience Better

Daphne is a florist and vlogger. She begins her day driving to the flower shop, activating ONT ID, Ontology’s mobile digital ID application and decentralized framework. ONT ID allows users to securely manage their digital identity by storing their data locally on a phone or on trusted cloud storage, with a private key that grants access only to the verified user. When Daphne opens her smart-lock-enabled car, her driving data is recorded and stored in a safe and secure way.

Ontology Allows Daphne To Book Concert Tickets Quickly Using Ontology NFTs

While making her way to the flower shop, Daphne uses her built-in ONTO self-sovereign data wallet, a one-stop mobile application for the management of digital identity, data, and assets, to book concert tickets with her friends. She pays using ONT, Ontology’s utility token. Daphne books the tickets using built-in voice recognition, meaning she can safely use the application while driving. Daphne is then able to send the tickets directly to her friend through the Ontology network, which transfers the ticket in the form of an Ontology Non-Fungible Token (NFT).

Ontology Allows Daphne To Earn ONT By Renting Her Car To Other Verified Safe Drivers

Later in the day, Daphne goes online to an Ontology-powered marketplace where she can earn ONT by renting her car to other verified, safe drivers. Daphne is able to easily verify that the driver she rents her car to is a licensed driver with a good driving history according to their ONT Score, Ontology’s decentralized reputation scoring system.

Following A Minor Car Accident, Daphne Settles Her Insurance Payment Immediately Through An Ontology-powered Insurance Application

After work, Daphne is back in her car on the way to the concert when she gets into a minor road traffic accident. By using an Ontology-powered insurance platform, Daphne can settle her insurance payment immediately. Because the information is recorded on the Ontology blockchain, the process is fully verified, fast, and efficient. Now needing to fix her car, Daphne is able to easily order replacement car parts. Details of Daphne’s car model are recorded in a decentralized way, making the process simple and ensuring Daphne can purchase the right parts.

Throughout Daphne’s day, Ontology has helped make her driving experience better. By integrating Ontology’s decentralized identity solution, Ontology is bringing much-needed trust to the driving industry, not only for Daphne, but for insurers, car renters, and other drivers, too.

Want to find out more about Ontology’s Automotive Solution?

Visit https://ont.io/automotive or get your customized solution by emailing us at business@ont.io.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

In The Fast Lane: Ontology Accelerates Decentralized Identity for the Automotive Industry was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


procivis

The Digital State, Part 4

The symbol for the denarius, the reference coin in the Roman Empire

This is the fourth in the series on the paradigm shift to a digital state, this time focusing on new digital forms of government-issued money.

Technology is changing the building blocks of government. In Part 1 we saw that the trend to e-government is leaving no pillar untouched. Part 2 showed how the technology by which people make their political choices is changing, and Part 3 looked at how law-making is being shaped by new digital tools.

The rush to digital government is now evident everywhere. Recently, for example, we learnt about the chaos caused in the UK by a government algorithm that decided which students passed their university entry exams. [1] The point of this negative example is not to be negative, but merely to show that the state is now embracing digital tools big time.

And what can be bigger than government-issued digital currencies?

E-government and its money

The latest technology has always shaped the physical characteristics of money. We know that there was a time without coins – a pre-metallic monetary period. One form of money then was cowrie shells, which could not be counterfeited. Their monetary area encompassed the Indian Ocean coastal regions of Africa, Asia and Oceania. We know that when the shift to metallic money happened, coins started out as little blobs. As the technology got more sophisticated, the coins became rounder and flatter, with symbols stamped on them. We know that the metallic composition of coins was impacted by mining discoveries – also linked to new technology. We know that Marco Polo was hugely impressed when he discovered that the Chinese were using a new technology for money: paper. And we know that Australia introduced the first banknotes made from a rubber-like substance (polymer). So, we know that the composition of money reflected the technology and resources of the era.

Should we now be terribly surprised, therefore, that in our era the state is experimenting with new technology for the issuance, distribution, safe-keeping and exchange of money? No. And should we be surprised that the essence of this new money is something we cannot see or touch: digital stuff? No. And wouldn’t Marco Polo have been impressed? Yes!

So, what’s new about CBDCs?

While computers are not new and electronic payments are not new and central bank digital book entries are not new, there is something new about central bank digital currencies (CBDCs). The new feature is that technology is changing how and where they can be distributed and stored.  

A new digital highway, interspersed with cool digital gadgets, is creating a new monetary transmission mechanism directly from the central bank to those devices. The technology is not holding up its introduction; the missing legal framework is. Where is the law that governs the new digital wallets at the central bank? Where are the laws that govern the settlement of cross-border payments linked to these new wallets?

What we know from history is that where technology leads, society and the law inevitably follow. The new laws will also follow from the local driving forces. One such case is currently happening in Switzerland. The country’s stock exchange is building a trading platform for “digital assets”. The central bank needs to understand what that might mean for the payments system, counterparty risk and systemic stability. One potential outcome could be a “digital Swiss franc”, which would be used to settle payments between wholesale participants on that platform. Nothing has been decided yet. But if that were decided, it could happen as early as 2021.

Central banks all over the world are taking a closer look at what this new digital infrastructure might mean for their mandates. While the motivations and urgency differ from country to country, it appears at this point that the introductions of the first central bank digital currencies will not replace “cash” – they will merely be an additional pool of digital units for a newly defined purpose, distributed and stored via a new infrastructure. It also appears clear at this stage that central banks are not keen on the idea that retail clients should have digital wallets at the central bank.

Blockchain agnostic

Central bank digital currencies do not need blockchain. Recall that bitcoin was born as the antithesis of central bank fiat currency. Central banks want control over the amount of money they issue; they are not going to give up that control to a public blockchain. Central bank digital money is anti-bitcoin money; it has a central counterparty: the central bank. Some central banks are indeed experimenting with blockchain, and it is not inconceivable that some could even issue a currency on a private or permissioned blockchain, but that means they will still exercise control.

Concluding thoughts: What we can learn from the big Ӿ

The era of central bank digital currencies is dawning just as governments all over the world have issued war-time levels of debt and central banks have increased their balance sheets to previously unimaginable sizes. It is not unreasonable to ask whether central bank digital currencies stored in new digital wallets would make it easier to push more digital units through the system, or even to remove units via new negative interest rates. The answer to both questions is probably yes.

The algorithm problem in the UK cited above is also meant to illustrate the point that e-government also needs a superior governance framework. This is not just about downloading new software; e-government should also be smart government that serves the people more efficiently.

Similarly, the introduction of central bank digital currencies will need a superior governance framework to prevent bad outcomes. Here we perhaps need to be reminded about bad monetary outcomes in history. It is difficult to separate the role of the issuer of money, be it a mint in ancient Greece or a modern central bank, from the fiscal obligations of its owner, the sovereign.  These are two sides of the coin. Several writers who focused on the finances of the ancient world conclude that the unsustainable public finances of ancient Greece and Rome were ultimately the real cause of the collapse of those empires. [2]

In one of the largest recorded increases in the money supply of the Roman Empire – a tenfold rise in the circulation of mainly silver coinage between 157 BC and 50 BC – smart administration and structural changes initially held back inflation. The measures included more cash reserves held by the banks and the Roman treasury. But these supportive factors eventually reversed. The Roman coin, the denarius (which had its own symbol, Ӿ), was gradually debased from a practically pure silver coin to one merely washed in silver. Its value plummeted (figure 1).

Figure 1: The price of one pound of gold in denarii, log scale, 50 BC to 350 AD

Source: [3]

The wealthy were initially able to protect themselves from inflation with a diversified portfolio that included gold, silver and real estate. But when Rome eventually fell to the invaders, and the new capital moved to Ravenna, the rich could not take their land with them.

Rome was not built in a day and its money was not destroyed in a day either. We will need a smart new governance framework for the era of central bank digital currencies.

By Costa Vayenas

References

[1] https://www.engadget.com/uk-algorithm-a-levels-gcse-results-143503870.html

[2] Davies, Glyn. 1994. A History of Money: From Ancient Times to the Present Day. University of Wales Press, p. 95-96

[3] Davies, Glyn. 1994. A History of Money: From Ancient Times to the Present Day. University of Wales Press, p. 107.

The post The Digital State, Part 4 appeared first on Procivis.


PingTalk

Future Proofing Your Workforce

Identity security is transforming the way we work

This Article was first published as a commercial feature on bbc.com and was created by BBC StoryWorks, BBC Global News’s commercial content division, on behalf of Ping Identity.

Richard Bird, the Chief Customer Information Officer at Ping Identity, has been busy. Since stay-at-home orders relegated entire global corporations to the home office, he has been racing to help clients secure the digital identities of hundreds of thousands of employees, condensing projects that typically take months into mere days.


Smarter with Gartner - IT

6 Trends on the Gartner Hype Cycle for the Digital Workplace, 2020

When employees were sent home from their offices en masse amid the global onset of COVID-19, many businesses scrambled to adopt technology solutions to enable their teams to work remotely. IT leaders have now realized the urgency to scale up their digital workplace technology stacks to ensure the long-term resiliency of their business.

The pandemic rapidly elevated many digital workplace technologies from nice-to-have to must-have status

“According to the Gartner 2020 Digital Workplace Survey, 68% of respondents agreed that more C-level execs have expressed involvement in the digital workplace since COVID-19,” says Matt Cain, Distinguished VP Analyst, Gartner. “From meeting solution software, to enterprise chat platforms, to desktop-as-a-service, the pandemic rapidly elevated many digital workplace technologies from nice-to-have to must-have status.” 

The need for replacements for in-person activities is leading to heightened interest in the emerging technologies included in the Gartner Hype Cycle for the Digital Workplace, 2020. Here are the top digital workplace trends on the Hype Cycle that CIOs will be paying attention to in the years to come.

Learn more: About the Gartner Hype Cycle Methodology


Trend 1: The new work nucleus

The “new work nucleus” refers to a collection of SaaS-based personal productivity, collaboration and communication tools, combined into one cloud office product. It generally includes email, instant messaging, file sharing, conferencing, document management and editing, search and discovery, and collaboration. 

The new work nucleus has become a cornerstone of most organizations’ digital workplace infrastructures, particularly in today’s remote work environment. Reliance on cloud office technologies has increased due to a general preference for cloud, as well as the desire to reduce costs, drive simplicity and continuously provide more functionality to employees. Vendors are upgrading cloud services with new mobility, content discovery and artificial intelligence (AI) features.

Trend 2: Bring your own thing to the digital workplace

Individuals are beginning to use more personal Internet of Things (IoT) devices or wearables at work, in a trend known as bring your own thing (BYOT). BYOT involves a wide range of objects, such as fitness bands, smart lights, air filters, voice assistants, smart earbuds and consumer virtual reality (VR) headsets. In the future it will include highly sophisticated devices such as robots and drones.

“As homes and domestic technology become smarter and consumers acquire more IoT technology, a growing range of personal things will be brought into offices or used to support remote working,” says Cain.

Explore now: 5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020

Trend 3: The distance economy

In-person events and meetings were once the norm and virtual meetings the exception, but COVID-19 has flipped those scenarios. The pandemic influenced the emergence of the distance economy, or business activities that don’t rely on face-to-face activity. Organizations with operating models that depend on first-party or hosted events have switched quickly to virtual alternatives.

Simultaneously, as internal meetings, client interactions, new hire interviews and a variety of other business activities have gone virtual, the distance economy has given rise to a new generation of meeting solutions that attempt to more closely mimic an in-person meeting.

Read more: 3 Keys to Leading Large-Scale Virtual Meetings

Trend 4: Smart workspaces

A smart workspace leverages the growing digitalization of physical objects to deliver new ways of working and improve workforce efficiency. Examples of smart workspace technologies include IoT, digital signage, integrated workplace management systems, virtual workspaces, motion sensors and facial recognition. Any location where people work can be a smart workspace, such as office buildings, desk spaces, conference rooms and even home spaces. 

As employees return to work post-COVID-19, organizations will take full advantage of smart workspaces as they revisit design strategies to better understand how people participate in physical spaces or adhere to social distancing. Such insight can create new capabilities related to seating and room allocation, access management and wayfinding.

Trend 5: Desktop-as-a-service

Desktop as a service (DaaS) provides users with an on-demand, virtualized desktop experience delivered from a remotely hosted location. It includes provisioning, patching and maintenance of the management plane and resources to host workloads. 

COVID-19 highlighted the value and business continuity strength of DaaS

Organizations have long been interested in adopting virtual desktop infrastructure, but complexity and capital investment have made implementations difficult. COVID-19 highlighted the value and business continuity strength of DaaS in its ability to rapidly enable remote work where on-premises options have stalled. The pandemic is likely to accelerate adoption of DaaS, and it may even perpetuate as a delivery architecture when employees return to the office.

Trend 6: Democratized technology services

Technology services of the future will be assembled and composed by the people that actually use them. A few examples of democratized technology services include:

Citizen developers, or employees who create new business applications using development tools and runtime environments sanctioned by corporate IT. Citizen developers are empowered by the availability and power of low-code and “no code” development tools.
Citizen integrator tools, which enable expert business users with minimal IT skills to handle relatively simple application, data and process integration tasks by themselves through very intuitive, no-code development environments.
Citizen data science, an emerging set of capabilities that enable users to extract advanced analytic insights from data without the need for extensive data science expertise.

The post 6 Trends on the Gartner Hype Cycle for the Digital Workplace, 2020 appeared first on Smarter With Gartner.

Sunday, 23. August 2020

KuppingerCole

KuppingerCole Analyst Chat: Ephemeral Credentials

Alexei Balaganski and Matthias Reinwarth discuss the concept of ephemeral credentials and its benefits for privilege management, DevOps and beyond.



Friday, 21. August 2020

KuppingerCole

Oct 08, 2020: How to Hunt Threats Effectively With Network Detection & Response Solutions

The number of cyber-attacks globally continues to rise. Attacks are growing increasingly sophisticated. The tactics, techniques and procedures that were once only used by well-funded state actors are being commoditized by cybercriminals. State actors sometimes employ tools that were formerly mostly used by cybercriminals. The threat landscape evolves continuously.

Thursday, 20. August 2020

KuppingerCole

Paul Fisher: In the Future PAM will Become Embedded in the IT Stack

Paul Fisher will expand on his analysis of how Privileged Access Management platforms will develop support for DevOps and other key users. This will mean that certain PAM functions will be embedded within the technology stack, opening up password free and secure access paths and enable rapid task fulfilment.




Kari Nousiainen: Planning and Deploying Identity Federation based PAM Using Certificates

Metso Outotec have recently deployed PrivX from SSH.COM as a Privileged Access Management system to provide audited secure access to server administrators and developers. They have integrated the PrivX PAM solution into their existing workforce identity management solution to provide secure audited access using Just-In-Time certificate-based access rather than passwords.




John Ovali: Privileged Access Management – Motivation and Benefits

Why PAM is a must and how you can benefit from it: Many corporations need to comply with regulations which result in extended logging and monitoring of privileged activities. The presentation shows how to start a successful PAM implementation and how to benefit from it.




David Wishart: How to Solve the Top 5 Access Management Challenges in Hybrid Cloud Environments

SSH.COM polled 625 IT and application development professionals across the United States, United Kingdom, France, and Germany to find out more about their working practices. We found that cloud and hybrid access solutions, including privileged access management software, slow down daily work for IT and application development professionals. These hurdles encourage users to take risky shortcuts and workarounds that put corporate IT data at risk. 

Join SSH.COM’s David Wishart, VP Global Partnerships, to learn:

Why the user experience of the privileged access management (PAM) solution matters for IT admins and developers in hybrid cloud environments
How just-in-time access, without leaving behind credentials, helps towards that goal while improving operational speed


Interview with Dave Wishart




Alexander Koerner: Success Factors for PAM Projects

In the planning of PAM projects, I often saw that managers “only” planned the implementation of the tool. The installation was quite good, but the project was not successful. Here are some points I have learned for keeping the customer happy and bringing the project to success:

Right project planning with the right scope
The right strategy
Handover to the run phase
Documentation
Interfaces to other systems
Lessons learned


KuppingerCole Analyst Chat: NIST’s Zero Trust Architecture

John Tolbert and Matthias Reinwarth look at SP 800-207, the NIST special publication on Zero Trust architecture and discuss how it aligns with KuppingerCole's own vision of this topic (spoiler: it does align very well!)




Theresa Laager: How to Wreck Your PAM Project

A PAM project needs to be handled like a relationship: if you neglect it and don’t treat it well, it will fail.

Let me introduce you to some failsafe methods for ruining your PAM project.




Jens Bertel Nykjær: Implementing PAM, How Did We Get Support and Buy-In From the Organisation?




Pooja Agrawalla: Are You Doing Privileged Access Management Right?




Joseph Carson: Privileged Access Cloud Security: Insider Tips and Best Practices

As the adoption of cloud applications and services accelerates, organizations across the globe must understand and manage the challenges posed by privileged access from remote employees, third parties, and contractors. With 77% of cloud breaches due to compromised credentials, making sure your users get easy and secure access to the cloud should be a top priority.
Join Thycotic chief security scientist and author Joseph Carson as he explains a practical approach to help you define and implement privileged access cloud security best practices. He will also share how Thycotic’s new Access Control solutions can safeguard cloud access by both IT and business users.




Panel - Addressing Enterprise Security Challenges with PAM




Rohit Nambiar: Cloud PAM: Challenges, Considerations And Approach

As enterprises transition to IaaS, cloud security, and specifically IAM strategy and execution, becomes crucial. IAM controls for IaaS/public cloud need to identify, secure and monitor privileged assets while dealing with the inherent elasticity, scalability and agility of the public cloud. As such, a Privileged Access Management program for the cloud, i.e., Cloud PAM, is required to meet the increasingly stringent compliance and audit regulations and keep enterprises secure.




Vibhuti Sinha: Cloud PAM on the Rise: The Future is Now

The new normal demands that organizations enable a remote workplace in a rapid and secure way.

The new normal requires privileged asset owners to make intelligent, informed and right decisions even with a fragmented view of risk.

The new normal requires governance to be integrated and inherent in privileged access workflows, not an afterthought.

This session gives insights and best practices for creating a least-privileged model, minimizing the risks associated with standing privileges, and preparing enterprises to transform rapidly through secure digital transformation.




MedCreds

AB 2004 Passes Senate Appropriations 6-0

August 10, 2020

CA AB 2004 authorizes Verifiable Credentials for the transmission of medical test results from authorized medical authorities to patients. Creates a working group to develop best practices, and sets a standard providing the highest levels of consumer privacy and trust in this new technology.

By passing the Senate Appropriations Committee today, AB 2004 takes one step closer to becoming law in California.  Watch the video of the Committee below!

This follows the bill’s introduction to the California House Privacy Committee on May 6th, 2020. You can view a video of our Senior Medical Advisor, Raj Gupta of Vanderbilt University, providing testimony to the CA House Privacy Committee here.

Following the Privacy Committee, AB 2004 went to the House Appropriations Committee, where it passed unanimously, and then on to the House Floor. The House Floor hearing can be viewed here, where the bill was introduced by Assemblymember Ian Calderon. On August 8th, the Senate Business, Professions, and Economic Development Committee heard the bill, and the video can be seen here.

Next, the bill will go to the full Senate floor by August 31st. If the bill passes the Senate, the Governor’s office will have until September 30th to sign it into law.

For more information about MedCreds e-mail contact@medcreds.com or you can schedule a meeting with our team!


Trinsic (was streetcred)

Trinsic Basics: What Are SSI Digital Wallets?

The easiest way to explain self-sovereign identity (SSI) is to pull out an item that people carry on a daily basis—a wallet.¹ It’s strange to use a leather pouch to explain such a complex, nascent technology as SSI, but it works! And it works because most of the magic of SSI happens within and between digital wallets.

Going from leather to digital

Apart from money, the leather wallet you carry in your pocket or purse stores important identifying cards (i.e., your driver’s license, health insurance card, employee badge, etc.). We show these cards to people and organizations on a daily basis to receive access to services by proving we are who we say we are. A wallet makes it easy to access these cards when we need them, and it keeps the cards from being lost or stolen.²


Just like a leather wallet stores a variety of important physical cards, an SSI digital wallet stores your verifiable credentials, making it easy for you to show those credentials to others in a secure, private way. As the name implies, a digital wallet is digital and can exist locally (i.e., on your mobile device or desktop) or in the cloud. A more formal definition of digital wallets is below:

A digital wallet, in the context of self-sovereign identity, is a software application and encrypted database that stores credentials, keys, and other secrets necessary for self-sovereign identity.³

Each participant in the SSI ecosystem (the issuer, the holder, and the verifier) needs a digital wallet in order to issue, hold, and verify verifiable credentials.

What can you do with an SSI digital wallet?

Collect credentials

Just like the leather pouch in your pocket stores important documents, your digital wallet stores your verifiable credentials. When you are issued a credential, it will be stored in your wallet. The wallet makes it so that the credentials are:

Easily accessible: With a click of a button, you have access to all of your credentials, which you can seamlessly share with others.
Secure: Your credentials are stored only on your device. No one else has access to them—not even the wallet provider (e.g., Trinsic).
Private: You are in full control of who you send the credential to and what pieces of information on the credential you show them.

Share verifiable information

Here’s where the rubber meets the road. When you have a digital wallet, anybody can request information from you. The information is entirely under your control, so you can choose to share the information or reject their request. If you decide to respond, the wallet finds credentials with the relevant data and shares just what they need to know, without sharing everything. For example, you might share your address from your driver’s license credential without sharing your name or birthdate. This is possible through zero-knowledge proofs (ZKPs).


Best of all, the information you share is instantly verifiable, so you can receive access to services seamlessly, just like you do in the real world when you use your physical wallet.
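
To make the selective-disclosure flow concrete, below is a hedged sketch of the kind of proof request a verifier sends in Aries/Indy-style ecosystems. The credential definition ID and attribute names are hypothetical placeholders, and exact wire formats vary by framework and wallet:

    # Hypothetical Aries/Indy-style proof request: the verifier asks only
    # for "address" from a driver's license credential, nothing else.
    proof_request = {
        "name": "address-check",
        "version": "1.0",
        "requested_attributes": {
            "address_attr": {
                "name": "address",
                "restrictions": [
                    # Limits which credentials may satisfy the request;
                    # this cred_def_id is a made-up placeholder.
                    {"cred_def_id": "EXAMPLE:3:CL:1234:drivers-license"}
                ],
            }
        },
        "requested_predicates": {},
    }
    # The holder's wallet answers with a zero-knowledge proof that reveals
    # the address value while keeping name, birthdate, etc. undisclosed.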

Connect with others

Finally, if there are parties you want to interact with frequently over time, you can establish a super-private, securely-encrypted connection between you and the other party. You can share information, messages, or any other interaction through this channel with a high degree of trust. While connections are powerful, they’re not always needed; connectionless credential exchange is sometimes a faster, more convenient way to interact, especially when the relationship isn’t long-term.

Making a connection is as easy as scanning a QR code with your wallet, after which your connection will show up under your “Connections” tab in your wallet. You can also create your own QR code so others can connect with you.
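
As a small illustration of how low that barrier is, the sketch below turns a connection invitation into a scannable image with the open-source qrcode library. The invitation payload is a simplified, hypothetical stand-in for a real Aries invitation, which carries recipient keys and a service endpoint in a defined schema:

    import json
    import qrcode

    # Simplified, hypothetical connection invitation payload.
    invitation = {
        "label": "Alice",
        "serviceEndpoint": "https://agents.example.com/alice",
        "recipientKeys": ["did:key:z6Mk...example"],
    }

    # Encode the invitation as a QR image another party's wallet can scan.
    img = qrcode.make(json.dumps(invitation))
    img.save("connection-invitation.png")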

The Trinsic Wallet

The Trinsic Wallet is the world’s first SSI digital wallet to have proven interoperability and has become one of the most-used SSI wallets globally. As a cross-platform mobile application with a locally-stored wallet, the Trinsic Wallet is built to be ideal for developers while being accessible and simple enough for the average user. Also, when you use the Trinsic Wallet, you don’t need to worry about vendor lock-in because you can easily export your verifiable credentials and import them into another Aries-based digital wallet that supports import/export functionality. Download the Trinsic Wallet for free today or build your own wallet with Trinsic’s tools.

Where to start?

If you’re seeking to implement an SSI solution, you have two options:

Use an existing mobile wallet, like the popular Trinsic Wallet, Connect.Me, or Lissi.
Develop your own wallet, custom-made for your use case.

To determine which option is best for you, read our blog “Building SSI Digital Wallets: The 3 Major Decisions”. Read more about SSI digital wallets in our documentation.

Notes

1. Drummond Reed, an SSI expert, is famous for pulling out his wallet during presentations on SSI. Timothy Ruff wrote about this tactic in his blog “When Explaining SSI, Start with the Wallet”.
2. Of course, wallets are not foolproof. They can be stolen or lost, but overall they do a pretty good job of keeping all of our cards in one place that we can keep on us most times throughout the day. SSI digital wallets have their own security protections that are not listed out in detail here, as this is an intro post.
3. For simplicity’s sake, at Trinsic we call it all a “wallet”, although in more technical circles you’ll find the term “agent” used for the application that routes messages and decrypts the wallet, and “wallet” used for the storage layer of the agent.

The post Trinsic Basics: What Are SSI Digital Wallets? appeared first on Trinsic.


One World Identity

Nok Nok Labs: Passwordless Authentication at Scale


KuppingerCole

Access Governance & Intelligence

by Richard Hill

The Access Governance (AG) market is continuing to evolve through more intelligent features. This Leadership Compass will give an overview and insights into the AG market, providing you a compass to help you find the products that can meet the criteria necessary for successful AG deployments.


Oct 28, 2020: Using Deception for Early and Efficient Threat Detection

Most organizations are benefiting from the scalability, flexibility, and convenience of modern cloud services and new, highly distributed hybrid corporate networks. Unfortunately, many have also learned the hard way that the defense of these systems and the assets they contain continues to be prone to cyberattacks and other security risks.

Wednesday, 19. August 2020

Global ID

GlobaliD App: Introducing SEPA and crypto transfers to your Wallet

Move Bitcoin directly into your GlobaliD Wallet!

The GlobaliD App keeps getting better and better — now with SEPA and crypto transfers directly to your GlobaliD Wallet!

See also: GlobaliD App — Introducing the Wallet

As always, you can download it at your favorite app store — iOS or Android.

Here’s what we have for you in this latest release, which includes enhanced privacy controls, upgrades to messaging, and various bug fixes:

Manage Groups conversations from the Groups tab within messaging
Toggle your Profile between Public and Private modes
Find people more easily with improved search logic
Self-declare your legal information and taxpayer ID (SSN) with new verification apps
Move funds into your wallet via SEPA transfer or crypto accounts

As always, we want to give a big and warm shoutout to our design, product, and engineering teams working incredibly hard behind the scenes to provide you with the best version of GlobaliD with each new update.

But wait, there’s more!

The GlobaliD Web App has also gotten a major upgrade, which now includes Messages for Groups:

Check out the Web App by clicking GlobaliD Connect
Create or join a Group by logging into GlobaliD Groups

If you have any questions, feel free to contact us at: support@globalid.com

Further reading:

How GlobaliD verifications preserve your privacy
GlobaliD joins the Open Payments Coalition along with Ripple, Uphold, and Brave
GlobaliD app — Introducing groups, multi-person chat, and vouches

Join a growing trusted community and experience how digital identity works for you.

GlobaliD App: Introducing SEPA and crypto transfers to your Wallet was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology Weekly Report (August 11–18)

This week we kicked off our new dApp Incentive Plan, “Renaissance 2.0”, in order to boost the growth of Ontology’s DeFi ecosystem. As part of the program, we will be returning up to double the transaction fees generated by DeFi dApps on Ontology to their respective development teams. We also partnered with Waves to build a cross-chain communication infrastructure that would enable inter-chain DeFi solutions and dApps for the next generation of reliable Web 3.0 applications.

Back-end

- Ontology v.2.1.0 launched on MainNet

Product Development

ONTO v3.2.0 Released

- Upgraded the credential function, and added templates for different types of credentials

- Added support for Plaid financial data verification

- Optimized transaction fees for ETH transactions

- Integrated new financial services available within ONTO, including an investment product from Babel Finance.

- Updated several UIs

dApp

- 70 dApps live on Ontology

- 6,046,942 dApp-related transactions since genesis block

- 10,864 dApp-related transactions in the past week

Bounty Program

- Currently recruiting SDK community developers

- 1 new application for SDK bounty, 1 new application for technical document translation

- 1 new feedback response in “Need Your Idea” bounty from the Bounty Program

Community Growth

- We onboarded 715 new members across Ontology’s Korean, Sinhala, and Vietnamese communities.

Newly Released

- Ontology released the “Renaissance 2.0” DeFi dApp incentive plan with a three-month trial period, during which developers may get up to double the transaction fees returned if they integrate and use ONT ID in their dApp.

- Ontology partnered with Waves to build a cross-chain communication infrastructure for DeFi. Empowered by Gravity, Waves’ cross-chain oracle network, Ontology and Waves have joined forces to offer inter-chain DeFi solutions and dApps to build the next generation of reliable Web 3.0 applications.

Global Events

- On 11 August, Jun LI, Founder of Ontology, was invited to speak on a panel at the POWER 2020 Technology and Application Summit organized by Mars Blockchain. LI was joined by Peng DENG, Director of Mars Blockchain Industry Research Center, and Phillip FEI, Head of Chinese Business at Chainlink, along with other guests to engage in a discussion on “New Opportunities for Blockchain Developers”. During the discussion, LI said, “Blockchain is not here to overturn the internet but to build links and connections. Among these blockchain-enabled connections, the focus lies in the connection of assets, identity, and data.”

- On 13 August, Kendall MAO, Dean of the Ontology Research Institute, shared his insights on Ontology’s new staking model and distributed data infrastructure in an interview with WALI Finance. He emphasized that Ontology will continue to prioritize self-sovereign identity and data.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (August 11–18) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 18. August 2020

Global ID

The GiD Report#123 — Balkanization and what that means for the future of the internet

The GiD Report #123 — Balkanization and what that means for the future of the internet

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

Hello, hello!

What we have for you this week:

Why a more balkanized internet is a sign that we’re entering a new chapter of decentralization (according to Peter Thiel)
Real demand for new solutions from mega corporations and individuals alike
It’s full circle for Apple circa 1984–2020
The EU version of a balkanized internet and the unintended consequences
An open question about moderation
Closing remarks from Peter Thiel on the state of innovation and progress in the U.S.

1. The biggest fear of a balkanized internet is that it means we’ll be more closed off and less connected. But the fear itself doesn’t mean it will become a reality.

Image: University of Chicago

One reason we have this fear is that we’re looking at today’s events through the lens of today’s frameworks — which are grounded in a centralized approach.

China wants control, so it won’t let in centralized Big Tech. The U.S. doesn’t trust China, so ditto for state-connected enterprises like ByteDance and Tencent.

But hey, email still works.

What that means is that a balkanized internet will only accelerate the transition toward new approaches so that we can continue to connect even as the rules of the game shift.

Anyway, that’s what Peter Thiel is betting on. (Always the contrarian and often controversial, but never disappoints with his insights.)

Here’s Peter speaking at the Discovery Institute’s technology summit in July:

If we’d been assembled in 1969, the future of computers was going to be massive centralization. It was giant databases, giant AI-like computer intelligences that would run everything. IBM was HAL transposed in the movie 2001: A Space Odyssey, one letter off. It was one of the early Star Trek episodes. They come to the planet Beta, where thousands of years earlier, somebody had unified the planet and left a computer program that ran the whole planet and all the people were peaceful, but very docile — nothing ever happened. But the future of the computer age circa 1969 was centralization — a few large companies, a few large governments, a few large computers that controlled everything.
Fast forward to 1999, the future of the computer age was going to be massive decentralization. It was sort of libertarian and anarchist. It was the corollary to the end of the Soviet Union, that information had this decentralizing tendency, and the internet was going to fragment things and it was going to be this anarchic libertarian place.
If we fast forward to 2019, the consensus view of the future today, I would submit, is that the pendulum has somehow swung back all the way to 1969. And the consensus view again is that it is about large centralization — Google-like governments that control all of the world’s information in this super-centralized way.
The Life After Google thesis that I agree with and endorse is that if we look at this past — and people got it terribly wrong in ‘69 — and things were going to go to decentralization in ’99, it actually started going back the other way from the point of view of 2019. Even if I’m hesitant to talk about the absolute future and where this all ends, ultimately, perhaps the contrarian thing is to say maybe the pendulum can swing back towards more decentralization, more privacy, and things like that. This is what seems to be at least contrarian and at least something we should always take more seriously.
If you want to frame it in terms of the buzzwords of the day, if it were in terms of crypto and AI, it is easily understood by people. It’s always understood that crypto is somehow vaguely libertarian, but we never are willing to say the opposite, which is that AI is communist. It’s because it’s centralized. The computer knows more about you than you know about yourself. It’s totalitarian.
Communist China loves AI and dislikes crypto. We should at least consider the possibility that Silicon Valley is probably way too enamored of AI, not just for technological reasons, but also because it expresses this sort of left wing centralized zeitgeist.
The first sort of contrarian idea I have is that perhaps it’s time for the pendulum to swing back and Life After Google, at its core, means that we are going to go back from this very centralized world today towards a more decentralized one. That seems to me to be the correct thing to bet on.

Going by Peter’s timeline, 1999 might have felt like the height of the decentralization movement but it was also the beginning of the cycle for the rise of Google, Amazon, Apple (again), and later Facebook.

Those companies are now among the most valuable on earth.

So even if today feels like peak centralization, it’s reasonable — particularly if you’re feeling contrarian — to think this only means the pendulum is about to swing the other way. The forces are already in motion.

- The human cost of a WeChat ban: severing a hundred million ties
- Nationalism and authoritarianism threaten the internet’s universality
- Via /mg: Who wields the power online? | DW Documentary
- Competition Bureau seeks input from market participants to inform an ongoing investigation of Amazon — Canada.ca
- Trump says looking at pressuring other Chinese companies after Bytedance
- The Clean Network — United States Department of State
- A WeChat Ban Should Be the Moment for Decentralized Tech. But It’s Not. — CoinDesk

2. It’s unclear that things will play out as cleanly as Peter Thiel’s proposed logic. What’s clear is that there’s suddenly ample demand for new solutions.

The WSJ reports:

More than a dozen major U.S. multinational companies raised concerns in a call with White House officials Tuesday about the potentially broad scope and impact of Mr. Trump’s executive order targeting WeChat, set to take effect late next month.
Apple Inc., Ford Motor Co., Walmart Inc. and Walt Disney Co. were among those participating in the call, according to people familiar with the situation.
“For those who don’t live in China, they don’t understand how vast the implications are if American companies aren’t allowed to use it,” said Craig Allen, president of the U.S.-China Business Council. “They are going to be held at a severe disadvantage to every competitor,” he added.
Other participants in the call Tuesday included Procter & Gamble Co., Intel Corp., MetLife Inc., Goldman Sachs Group Inc., Morgan Stanley, United Parcel Service Inc., Merck & Co. Inc. and Cargill Inc., according to the people.

That’s the long game.

There’s ample demand from individuals, too, BTW:

Americans using digital services would gladly switch to companies that are more committed to data privacy, according to survey results shared exclusively with Axios’ Kyle Daly.
Why it matters: It’s the latest sign people are frustrated with the digital status quo, even as companies make efforts to give users more control over how their data gets collected, stored and used.
Details: The survey of 1,018 Americans, conducted in June, found people want more control over what happens with their personal information and think existing tools seem outdated and should be easier to use.
93% of Americans would switch to a company that prioritizes their data privacy.
91% would prefer to buy from companies that always guarantee them access to their data.
88% are frustrated that they don’t have more control over their data.
73% reported finding the process of downloading their data from a company “outdated”; 32%, “hard”; 32%, “confusing.”
About two-thirds said they want to be able to choose what data companies can and can’t collect.

They’re also just waiting for an alternative.

3. And just to point out the swings of the pendulum — there’s Apple, complaining about the WeChat ban. Back in 1984, Apple released that iconic Super Bowl ad about IBM referencing George Orwell.

Now they’re getting a taste of their own medicine with Epic’s Fortnite spoof of that very ad.

Of course, Facebook sees an opportunity:

Between the lines: Facebook is trying to position itself as friendlier to small businesses than Apple, which also faces a lawsuit from Fortnite maker Epic Games over its commission and in-app payment restrictions.
What’s happening: Facebook said Friday that it will launch “Paid Online Events” for small businesses in 20 countries around the world to charge Facebook users to attend their classes, instructions and other events.
The feature could be useful for any small business or individual offering a service, such as a preacher, musician, yoga teacher or cooking instructor.
Facebook asked Apple to either waive its 30% cut or let Facebook go around it and process event payments via Facebook Pay, in either case letting event hosts keep all the revenue they generate. Apple declined, according to Facebook.
- Epic sues Apple, Google as Fortnite is pulled from app stores
- Apple, Epic, and the App Store
- Pro-tech advocacy group backed by Facebook and other silent investors releases its first ad
- Via /ales: Fortnite May Have Just Laid the Perfect Antitrust Trap for Apple — and They Fell For It [Another Update: Google Just Kicked Fortnite Out of Its App Store, Too]
- Facebook goes after Apple’s in-app purchases fee in wake of Epic lawsuit

4. The other thing about technological balkanization is that it’s not always going to be black and white. A lot of the time it’s gray. That’s a good way to describe what’s been happening in Europe in terms of how they’ve dealt with the influence of American Big Tech. Rather than ban anything outright, they just force companies to play by their rules.

That’s well and good in theory. In practice, it’s really messy with a lot of unintended consequences.

For instance:

Some businesses fear growing liability while others worry that small and mid-sized firms will get hurt as the U.S. and Europe begin work to replace Privacy Shield, the pact that let thousands of firms transfer data across the Atlantic without breaking EU privacy rules, Axios’ Ashley Gold reports.
Why it matters: Without a replacement in place after the EU’s high court struck Privacy Shield down last month, thousands of businesses will be stuck complying with an agreement that no longer applies in the EU while scrambling to figure out how to get data over from Europe without exposing themselves to legal risks.
What’s new: This week, the Department of Commerce and European Commission announced they have started discussions to come up with a new framework to govern data transfers between the EU and the U.S.

Related:

- Joint Press Statement from U.S. Secretary of Commerce Wilbur Ross and European Commissioner for Justice Didier Reynders
- A Princess Is Making Google Forget Her Drunken Rant About Killing Muslims
- Strong Customer Authentication, a Requirement to Combat Payments Fraud, Might Lead to Many Customers Abandoning Purchases at Checkout

5. We’ve talked about balkanization. We’ve touched on regulation. One of the missing pieces, and an understandably controversial question of the day, is moderation. If you listened at all to the recent antitrust hearings, then you know that Republicans have their collective pitchforks out about platform censorship.

And it’s easy to think about moderation negatively, particularly with what’s going on in China.

But just as not every habit needs to be labeled an addiction — there are good habits, after all — not all moderation should be labeled censorship.

Good moderation is helpful in creating the right incentives that align with a community’s core values — whether that’s a small group of friends or an entire country.

And good moderation will mean different things for different people.

A big part of why there’s such a negative slant to the idea of moderation with today’s status quo isn’t moderation itself, it’s the issue of centralized power and the lack of transparency and accountability.

Here’s the New Yorker (via /gregkidd):

In a 2018 article published in the Harvard Law Review, “The New Governors: The People, Rules, and Processes Governing Online Speech,” Kate Klonick, who is now a professor at St. John’s University Law School, tallies the sometimes conflicting factors that have shaped the moderation policies at Twitter, Facebook, and YouTube. The companies, she writes, have been influenced by a fundamental belief in American “free speech norms,” a sense of corporate responsibility, and user expectations. They’ve also reacted to government requests, media scrutiny, pressure from users or public figures, and the demands of third-party civil-society groups, such as the Anti-Defamation League. They have sometimes instituted new rules in response to individual incidents. There are downsides to this kind of improvisational responsiveness: a lack of transparency and accountability creates conditions ripe for preferential treatment and double standards.

It’s also the type of problem inherent to huge centralized organizations and platforms. The more decentralized, transparent, and flexible moderation is, the less oppressive moderation feels.

I mean I get it. Why should Mark Zuckerberg decide the rules for billions of people? Who made him king? (And I get Mark’s side, too: his own employees protest when he won’t take down content they don’t like.)

It’s like that thing that Aristotle said: That dude that has a lotta friends basically has no friends.

Mark gets it, too. That’s why he created an independent (supposedly) subcommittee to decide that stuff.

As a solution, though, it’s still kind of a bandaid. When you’re as big and as powerful as Facebook, any solution to the problem will be a bit cumbersome.

When you don’t have a system built on real trust and real identities, it’s impossible to delegate that responsibility to your users or your communities. If you never trusted them in the first place, how can you provide them agency?

It’s a point that we touched upon a couple weeks ago.

Vitalik Buterin:

If you don’t have any notion of persistent identity, then you can’t have positive incentives because there’s no persistent thing for those incentives to hook onto. There’re a lot of ways you can have persistent identity, persistent reputation — even in ways that are very strongly privacy preserving.

Anyway, this was more of an open question. Something to think about.

Related:

- To Head Off Regulators, Google Makes Certain Words Taboo — The Markup

6. OK. I’ll leave you with a little more Peter Thiel on the state of innovation and progress in the U.S. from that same talk. (Sorry, Chamath is pushed back again.)
If we look at the rate of progress in Silicon Valley, it was charismatic because it was the one place where things were still happening relative to the rest of the U.S., and it’s become a lot less charismatic in the last five years.
If we think about the vibe in 2014, it was this place where the future was being built. In 2019, the big tech companies are probably as self-hating in some ways as the big banks were in 2009.
There’s a sense of, “uh, it’s not quite working.”
If you pick on Google a little bit here, the Google propaganda of the future was that it was all going to be more automation. The story in 2014 was things like Google Glass, so you could identify anybody you looked at at any time. It was the self-driving car. I would say these aren’t that big of a set of innovations — a self-driving car is a step up from a car, but not as big a step as a car was from a horse.
So we can debate quite how big these things are and how to quantify them. But that was still the narrative that was very intact in 2014.
When you fast forward to 2019, it’s striking how there’s absolutely no narrative of the future left. Google doesn’t even talk about the self-driving car very much. There’s a sense that it may still happen, but it’s further in the future. The expected time seems to be getting further away.
So there’s sort of the sense that perhaps there’s this danger that we have slowed progress even in tech, even in the world of information technology.
….
Somehow Silicon Valley has consolidated into larger companies. It’s gotten harder for new companies to break through, and it’s gotten harder because new companies or small companies are good at doing new things and people are doing fewer new things. Then the big companies are more dominant.
….
The stagnation idea is that at the end of the day, technology is about people. It’s not about inanimate forces. It’s not some kind of Marxist historicism about the way things are inevitably going to happen. The stress is always on individuals, small teams that start companies, new projects that do new things.
It’s a question of human agency. It’s not deterministic. We have every possibility to do these things, but at the end of the day, it is up to us to make it happen. And it’s not set in stone that it’s going to happen one way or another.
In conclusion, I think one other gloss in Life After Google is that perhaps you should have the title with “life” being italicized or stressed or put in bold.
The critical thing is there is life, life goes on and in particular, human life, humanity goes on. Even though the dominant narrative in tech is about inanimate forces or Marxist historicism — it really is, at its core, about human beings and we should always bet on the indomitability of the human spirit.
7. Stuff happens

- Instagram Faces Lawsuit Over Illegal Harvesting of Biometrics
- Why cash has been piling up during the pandemic
- To Help the World’s Poor, Put Money on Their Phones
- Via /jvs: Google rolls out virtual visiting card in India — TechCrunch
- Via /vs: Announcing FedNow Service features and functionality
- Kenya’s M-Pesa sees consumer, business services opportunities in COVID-19
- Swapping cash for online payments in Rwanda
- WhatsApp Pay: Awaiting Supreme Court verdict to roll out UPI-based payment service in India — TechnoSports
- Facebook Financial Formed to Pursue Company’s Payments Plans
- What’s new in Verity? — Evernym
- This Is a Mostly True Story About… Business Cards
- MicroStrategy Buys $250M in Bitcoin, Calling the Crypto ‘Superior to Cash’ — CoinDesk

The GiD Report#123 — Balkanization and what that means for the future of the internet was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Recommendations for aligning ITSM with IAM/IGA


by Warwick Ashford

The versatility of modern IT Service Management systems (ITSM) is leading many organizations to configure these systems to deal with all employee service requests, including those related to IAM/IGA. But this is a risky strategy from a maintenance and compliance point of view. This Leadership Brief outlines the key reasons for aligning ITSM with IAM/IGA systems, and how this is best achieved.


AI-Powered Data for All – Informatica's Acquisition of GreenBay Technologies


by Anne Bailey

Informatica has just announced that they have made another acquisition this summer: GreenBay Technologies, a startup focused on AI and machine learning. Read about their July 2020 acquisition here.

GreenBay Technologies brings CloudMatcher to Informatica’s Intelligent Data Platform (IDP). CloudMatcher uses machine learning to automate entity matching and schema matching tasks with high accuracy. This impacts several key data management capabilities such as master data management, data cataloging, data quality, governance, and data integration.

This acquisition adds to the core capabilities of Informatica’s CLAIRE® engine. Informatica has previously collaborated with and invested in GreenBay Technologies, with some elements of CloudMatcher technology already embedded in Informatica products. Founders and team members of GreenBay Technologies will join Informatica as full-time employees. This history of collaboration and investment, and the mutual aim to provide a more complete view and understanding of enterprise data, bode well for a successful acquisition.

Empowering the Business User with AI-Powered Data Management Tools

CloudMatcher further strengthens Informatica’s approach to no-code data management. It brings innovations to both schema matching and entity matching. While schema matching determines that two columns in two tables, such as “ename” and “emp name”, are semantically the same, entity matching determines that two records, such as (David Smith, Acme Company) and (Dave M. Smith, Acme), are the same real-world entity.

Historically, entity matching has remained a complex problem because solutions have been rule-based, requiring skilled developers and a significant amount of time. GreenBay Technologies has created a blend of “declarative rules” and “AI rules” for match classification. Many existing solutions to schema and entity matching use hand-crafted rules. With this acquisition, Informatica adds a powerful machine learning model that can capture complex and powerful matching rules of a kind that users cannot create manually. The supervised learning used in this approach requires relatively little effort from the user to train the system. The active learning stage presents the business user with interactive labeling exercises for tuple pairs from two separate tables. As the system receives more user feedback and curation, it continuously improves over time. This workflow goes through multiple iterations and yields a high degree of accuracy. CloudMatcher is also highly amenable to distributed and parallel processing, allowing the solution to scale to very large data sets and dramatically reduce the manual data stewardship required.
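To make that workflow concrete, here is a minimal sketch of supervised entity matching with an active-learning step, assuming simple string-similarity features and a scikit-learn classifier. All names and fields here are ours for illustration; this is not CloudMatcher’s actual interface, and a real system would add blocking, richer features, and distributed execution.

```python
# Minimal ML-based entity matching with an active-learning loop (illustrative).
from difflib import SequenceMatcher
from sklearn.linear_model import LogisticRegression

def features(a, b):
    """Similarity features for a candidate record pair."""
    name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
    org_sim = SequenceMatcher(None, a["org"], b["org"]).ratio()
    return [name_sim, org_sim]

# Labeled tuple pairs from the business user (1 = same real-world entity),
# e.g. the (David Smith, Acme Company) / (Dave M. Smith, Acme) case above.
labeled = [
    ({"name": "David Smith", "org": "Acme Company"},
     {"name": "Dave M. Smith", "org": "Acme"}, 1),
    ({"name": "David Smith", "org": "Acme Company"},
     {"name": "Diana Smythe", "org": "Apex Corp"}, 0),
    ({"name": "J. Doe", "org": "Globex"},
     {"name": "Jane Doe", "org": "Globex Inc."}, 1),
    ({"name": "Jane Doe", "org": "Globex Inc."},
     {"name": "John Roe", "org": "Initech"}, 0),
]

model = LogisticRegression().fit(
    [features(a, b) for a, b, _ in labeled],
    [y for _, _, y in labeled],
)

def most_uncertain(candidates):
    """Active learning: pick the unlabeled pair the model is least sure about,
    to be shown to a business user for interactive labeling next."""
    def margin(pair):
        p = model.predict_proba([features(*pair)])[0][1]
        return abs(p - 0.5)  # small margin = model is unsure
    return min(candidates, key=margin)
```

Each labeling round retrains the model on the growing set of user-curated pairs, which is why the approach needs relatively little manual effort to reach high accuracy.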

This opens the door for many data management services to be “hands-off”, meaning easily usable by business users without the need for coding or a developer background. As data sets are getting larger and larger, the improved match rate offered by CloudMatcher will help reduce the false positives. By applying a crowdsourcing approach with business users labeling training data, Informatica can configure a sophisticated data matching system that adapts to an organization’s data landscape in a few hours as opposed to days or weeks. This technology also helps in extending matching beyond identity data by matching product, supplier, location, and other types of data domains with even higher accuracy.

AI on Demand: A Continuation of the Platform Economy

We see the trend of harnessing AI to make complex tasks more accessible to non-technical users rising in many different fields. Conversational AI building platforms are one such area, with many vendors providing the general technology and language capabilities, plus support to train the chatbot on a customer’s specific industry and corporate knowledge. Another is cybersecurity, where AI capabilities come already integrated into the security solution and can be trained in a context-specific environment rather than through a “build it yourself” approach. This may shift in the future though, as methodologies for presenting the AI development process (data preparation, model selection, training, validating, implementation, maintenance, and retirement) to non-AI experts improve. Some tech companies (read about some examples here and here) are offering AI capabilities irrespective of industry; they aim to provide the infrastructure, data pipeline management, automated support to determine appropriate learning models, and governance for the AI lifecycle.

The data management use cases that we see Informatica addressing have the potential to unlock more data-based insights at a lower cost – be it through reducing time-to-value or by enabling subject matter experts to work with their data as a data scientist or developer would. Enabling more data-driven business will shape the processes, internal decisions, production goals, and much more. But this is only possible with strong data management practice, including the correct labeling of entities. User-friendly AI to manage this step is key to seeing more value from data management products.


Trinsic (was streetcred)

Trinsic Leads SSI Digital Wallet Portability


“Portable” is one of the 10 principles of self-sovereign identity (SSI). In order to achieve portability or self-sovereignty, an individual must be able to control where their identity information and credentials are stored. They must be able to leave their current provider and move to a new provider and never be trapped in vendor lock-in.

 

Wallet portability for individuals has always been an aspiration of wallet providers but, until today, had never been achieved. We’re proud to announce that Trinsic has achieved interoperable wallet portability with two other SSI wallet vendors—Lissi and esatus AG. For the first time, an individual can “fire their wallet”¹ and use a new one.

What is wallet portability?

When we talk about wallet portability with other vendors, we mean you can seamlessly transfer your SSI wallet between these different applications by exporting from one wallet and importing into another (“import/export”). With import/export functionality, you are no longer tied to or reliant on a single technology provider.

 

“Portability between Aries wallets is a long time coming. It’s a huge win for the vision of SSI,” said Trinsic CEO Riley Hughes. “Import/export should be standard in any SSI wallet—because if a wallet doesn’t support portability, it’s not really self-sovereign.”

 

This initial implementation among three vendors is the first step in what we hope will be a larger community effort to converge on best practices for the interoperability of SSI wallets.

Wallet portability in action

The Trinsic Wallet, the Lissi app, and the esatus Wallet are three separate SSI wallets that support the import/export feature. The video below shows Trinsic CEO Riley Hughes easily moving his wallet from Trinsic to Lissi and then back again, all while receiving and verifying credentials with no interruption:

Below are the step-by-step directions of what the video shows.

Transfer credentials from the Trinsic Wallet to the Lissi app:

 

1. In the Trinsic Wallet, go to Settings and click the Export Wallet tab.
2. Copy down the recovery phrase.
3. Click on the Export Wallet button.
4. Save the wallet to wherever you’d like to store it.
5. Open the Lissi app (this can be done with the esatus Wallet as well), go to Settings, and click on the Recover Account tab.
6. Click on Open File and choose the Trinsic Wallet file.
7. Paste in the recovery phrase that was copied from the Trinsic Wallet and click Recover Backup.

Transfer credentials from the Lissi app to the Trinsic Wallet:

1. In the Lissi app, go to Settings and click on the Create Export/Backup tab.
2. Write down the recovery phrase and click Continue.
3. Verify the recovery phrase by clicking on the words in the correct order.
4. Name your backup file and click Export.
5. Unzip the exported file.
6. Open the Trinsic Wallet, click on Settings, and click on the Backup and Restore tab.
7. Click on the Restore Wallet button, select Local Backup, and choose the Lissi wallet file.
8. Type in the recovery phrase and click the Recover Wallet button.
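Under the hood, the export step produces an encrypted duplicate of the wallet keyed to the recovery phrase (as noted under “Future plans” below), which is why the phrase must be re-entered on import. As a rough mental model only—not the actual Aries Framework .NET implementation—the flow looks something like this:

```python
# Conceptual sketch: serialize the wallet, encrypt it under a key derived
# from the recovery phrase, and write one portable file. Illustrative only;
# requires the third-party `cryptography` package.
import base64
import json
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_phrase(phrase: str, salt: bytes) -> bytes:
    """Stretch the human-readable recovery phrase into a symmetric key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=200_000)
    return base64.urlsafe_b64encode(kdf.derive(phrase.encode()))

def export_wallet(wallet: dict, phrase: str, path: str) -> None:
    salt = os.urandom(16)
    token = Fernet(key_from_phrase(phrase, salt)).encrypt(
        json.dumps(wallet).encode())
    with open(path, "wb") as f:
        f.write(salt + token)  # the salt travels with the ciphertext

def import_wallet(path: str, phrase: str) -> dict:
    with open(path, "rb") as f:
        blob = f.read()
    salt, token = blob[:16], blob[16:]
    return json.loads(Fernet(key_from_phrase(phrase, salt)).decrypt(token))

# Round trip: any app that knows the file format and the phrase can import.
export_wallet({"credentials": ["example-vc"]}, "my recovery phrase", "wallet.backup")
assert import_wallet("wallet.backup", "my recovery phrase")["credentials"] == ["example-vc"]
```

The key design point is that the backup file alone is useless: without the recovery phrase, no other app (or attacker) can decrypt the wallet contents.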

“We celebrate with the community in the achievement of SSI wallet portability and are proud that Lissi Mobile Wallet is one of the first truly portable wallets,” said Helge Michael, the Program Manager at Main Incubator who runs Lissi. “Self-sovereign identity has made incredible progress in the last couple of years, and we can now add wallet portability to the list.”

Our commitment

The Trinsic team is driven by the goal to make self-sovereign identity more accessible to the world, and wallet portability is an important part of that. We’ve developed the world’s most powerful verifiable credential & digital wallet platform that powers hundreds of companies and developers, all while keeping interoperability and portability as a priority.


With Trinsic’s fully-loaded package of 3 APIs, a front-end Studio, robust documentation, and SDKs in popular languages, your team is equipped with the flexibility and functionality needed to build something extraordinary.

- Trinsic Studio: An easy-to-use web interface for managing credential exchange with no code. Also serves as the mechanism to acquire API keys and manage billing for paid plans. Try it for yourself completely free, and issue a credential in less than 5 minutes!
- Provider API: Our newest API enables developers to programmatically provision issuer and verifier cloud agents. Learn more about the provider API in the recent launch announcement.
- Credentials API: Our core API gives developers a turnkey way to issue, verify, and manage verifiable credentials on any Hyperledger Indy network. Check out our documentation or one of our reference applications to get started.
- Wallet API: An API for creating and managing cloud wallets on behalf of credential holders. It’s the backend of our Mobile SDK, which you can read more about in our recent post about building your own SSI wallets. Get started with the API by checking out the documentation.
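For orientation, a call to a hosted credentials API of this kind typically looks like the sketch below. Every route, header, and field name here is a placeholder we invented for illustration; consult Trinsic’s actual documentation for the real endpoints and schemas.

```python
# Hypothetical sketch only: placeholder host, route, and payload fields,
# showing the general shape of issuing a credential via a hosted API.
import requests

API_KEY = "key-from-trinsic-studio"          # obtained via the Studio
BASE_URL = "https://credentials.example.com"  # placeholder host

def issue_credential(definition_id: str, connection_id: str, values: dict) -> dict:
    """Ask a cloud agent to issue a verifiable credential to a connection."""
    resp = requests.post(
        f"{BASE_URL}/credentials",  # hypothetical route
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "definitionId": definition_id,  # credential definition to use
            "connectionId": connection_id,  # holder to issue to
            "values": values,               # the attribute values
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```

Future plans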

Our approach to wallet portability shouldn’t be viewed as the end-all be-all. Further iterations and evolution will happen over time as the SSI community continues to converge on standards and protocols relating to wallet portability. We do commit, however, to being involved in that effort. If you’d like to get involved, here are some things we’d love your contributions on:

 

- Currently, connections are lost through the import/export process. Connections are a persistent 1:1 relationship between two parties and rely on endpoints provided by mediator agents to ensure messages are routed to the correct wallet. Because the approach to mediator agents isn’t completely standardized between vendors, connections do not transfer seamlessly across different vendors.
- The export function creates an encrypted, duplicate wallet which you can use as a backup. This is extremely useful if you lose your device and want to restore your wallet. But it also enables you to upload your wallet to multiple devices. This is useful if you want a tablet and a mobile wallet to be in sync, but can be problematic if you duplicate your wallet onto other people’s devices, allowing them to present your credentials. Ensuring the integrity of credentials across the import/export process is the subject of future work.
- All the mobile wallets that support portability are based on Aries Framework .NET. Further testing will be required to determine whether an exported wallet can be opened seamlessly in other Aries frameworks or whether additional limitations will arise.
- The import/export implementation is specifically geared toward local wallets and applications (i.e., mobile wallets). Enterprise wallets and cloud wallets are out of scope.

Our ask

- If you are a developer or business that has created its own Aries-based wallet and would like to implement import/export functionality, join the Aries Working Group where these ongoing interoperability efforts are happening.
- If you’re a developer or business who is interested in utilizing Aries or digital wallets, we encourage you to check out the Trinsic platform.
- If you are an individual, give portability a try by exporting your verifiable credentials to and from one of the wallets that supports interoperability.

As always, feel free to contact us if you have any questions—we’d love to chat.

Notes

¹ This is a phrase coined by Timothy Ruff, a longtime advocate for portability.

The post Trinsic Leads SSI Digital Wallet Portability appeared first on Trinsic.


Ontology

Leading Blockchain Projects Unveil Poly Network, Offering Unprecedented Interoperability Benefits


Poly Network will permit interoperability between multiple chains, offering a range of new advantages

In a bid to lay the groundwork for a more collaborative and transparent decentralized ecosystem, three leading global blockchain projects, Ontology, Neo, and Switcheo, have announced the launch of Poly Network, a heterogeneous interoperability protocol alliance.

In what is a major milestone for the Distributed Ledger Technology (DLT) industry, Poly Network will permit cross-platform interoperability, greatly increasing transparency and accessibility. Enterprises leveraging divergent systems can connect to Poly Network and collaborate and interact with each other through an open, transparent admission mechanism.

Andy Ji, Co-founder of Ontology, the high-performance, open-source blockchain specializing in digital identity and data, said, “By combining the highest standards of expertise and platform capabilities, Poly Network will have unprecedented benefits for blockchain developers, propelling the building of the decentralized ecosystem through a range of platform offerings that make the user experience easier.

“For a time now, the Ontology community has enjoyed the benefits of cross-chain functionalities with Ethereum. Now, through Poly Network, an enterprise leveraging the Ontology blockchain will be able to seamlessly interact with an enterprise leveraging Ethereum, Cosmos, or Neo, helping these platforms overcome challenges to scalability, mainstream adoption, and collaboration. To make it happen, Ontology has invested high levels of technical research, development, and financial support.”

Prioritizing key characteristics of efficiency, value transfer, privacy, and security, Poly Network’s core function is to enable cross-platform interoperability and atomic cross-chain transactions, such as the swapping of digital currency for other tokenized assets (a minimal sketch of the atomic-swap idea follows the list below). Poly Network will:

- Provide DeFi developers with new infrastructure for creating cross-chain compatible dApps
- Support both heterogeneous and homogeneous chains, including Ethereum, Neo, Ontology, and Cosmos, along with each platform’s homogeneous chain forks, with future plans to support the Bitcoin network
- Permit easy, eco-friendly access to cross-chain transactions by requiring fewer smart contracts
- Increase the scalability of cross-chain transactions by enabling the transfer of both assets and information
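For readers new to atomic swaps, the sketch below shows the classic hash time-lock pattern that makes a two-chain swap all-or-nothing: either both claims succeed or both sides can be refunded. This is a toy illustration of the general technique only, not Poly Network’s protocol, which additionally involves relayers and cross-chain state verification.

```python
# Toy hash time-locked contract (HTLC), the building block behind many
# atomic cross-chain swaps. Both locks share one hash, so claiming one
# side reveals the secret needed to claim the other.
import hashlib
import time

class HTLC:
    """A minimal hash time-locked 'contract' escrowing one asset."""

    def __init__(self, asset: str, hashlock: bytes, timeout_s: float, recipient: str):
        self.asset = asset
        self.hashlock = hashlock
        self.expires_at = time.time() + timeout_s
        self.recipient = recipient

    def claim(self, preimage: bytes, caller: str) -> str:
        # The asset moves only if the secret matches before expiry.
        assert caller == self.recipient, "wrong recipient"
        assert time.time() < self.expires_at, "expired; sender may refund"
        assert hashlib.sha256(preimage).digest() == self.hashlock, "bad secret"
        return self.asset

# Alice picks a secret and locks ONT for Bob on chain A...
secret = b"only alice knows this"
lock = hashlib.sha256(secret).digest()
on_chain_a = HTLC("100 ONT", lock, timeout_s=3600, recipient="bob")
# ...and Bob locks ETH for Alice on chain B under the SAME hash,
# with a shorter timeout so Alice must move first.
on_chain_b = HTLC("1 ETH", lock, timeout_s=1800, recipient="alice")

# Claiming on chain B reveals the secret, letting Bob reuse it on chain A.
eth = on_chain_b.claim(secret, caller="alice")
ont = on_chain_a.claim(secret, caller="bob")
print(eth, ont)  # 1 ETH 100 ONT
```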

Da Hongfei, Founder of Neo, an open-source, community driven platform, said, “I firmly believe that interoperability is the future of the blockchain industry. Within our current paradigm, both traditional and blockchain platforms exist as effectively isolated data silos — users and developers alike are restricted by each platform’s capabilities and limitations. With PolyNetwork, we are linking disparate and heterogeneous platforms to build a global cross-chain platform with the aim of realizing blockchain’s potential while overcoming its challenges — together. Moving forward, I encourage all interested projects and companies to join us in building the foundation for NGI.”

Ji concluded, “Cross-chain interoperability is becoming increasingly important as we focus on moving away from a siloed way of working. Ontology is committed to fostering an open and transparent blockchain ecosystem and we are proud to partner with Neo and Switcheo to deliver true cross-blockchain interoperability. We look forward to playing our part through this groundbreaking alliance.”

For more information on Poly Network and how to join, visit https://www.poly.network/

About Poly Network

Poly Network is built to implement interoperability between multiple chains in order to build the next generation internet infrastructure. Authorized homogeneous and heterogeneous public blockchains can connect to Poly Network through an open, transparent admission mechanism and communicate with other blockchains. Popular blockchain networks such as Bitcoin, Ethereum, Neo, Ontology, and Cosmos are already a part of Poly Network. More institutions and organizations are welcome to join Poly Network and build the next generation internet with us.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement /