Last Update 8:51 PM February 28, 2021 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Sunday, 28. February 2021

Identosphere Identity Highlights

Identosphere #21 • Aries Graduates to 'Active' • Dizme, Linux, & Sovrin Foundations + Algorand • Are Vaccine Passports Inevitable?

Join us for another issue filled with upcoming events, sovereign identity and personal data news, and working group discussions.
Thanks to our Patrons

We got a new patron this week! We are grateful for your support!

If you haven’t had the chance to sign up, yet, we greatly appreciate your consideration!

Your contributions help us to see that our efforts are appreciated, and go towards recovering our investment in the curation and distribution of these weekly updates.

We are also open to corporate subscriptions, for organizations that want all of their members to stay current with the latest developments in decentralized identity.

Upcoming

Open call to kick off the Wallet Security WG at DIF March 1st

Bastian, Paul writes:

I will present motivation, goals and a first roadmap.

Very short summary:

 - standardized wallet security is necessary for sensitive credentials like id-cards, payment credentials or more

 - create a specification and interface to communicate about wallet capabilities, security, regulation-conformance and other points of security-relevant interoperability

 - define mechanism to enable wallet security assertions, certification and ways to prove them

 - define specifications about wallet user authentication, ways how to ensure them and how to communicate them to issuers/verifiers

Calendar Invite • Wallet Security WG Charter • Wallet Security Mailing list

Engage with S&T and DHS Industry Liaisons March 2

The Insights Outreach webinar series will include Office of Industry Partnerships (OIP) program spotlights, engagement and contracting workshops, innovative technology demonstrations, and commercialization success stories from the community working with S&T.

Interoperability Between ACA-Py & the Trinsic Platform March 3

Join us to see how two codebases, Aries Cloud Agent Python (ACA-Py) and Trinsic (based on Aries Framework .NET), interact interoperably in an enterprise environment.

Thoughtful Biometrics Workshop March 8, 10, 12

Data Ownership and Self-Sovereign Identity by Cryptonics March 10th

The session covers everything from how it works from a technical point of view to the different real use cases in the world we are living in. Timestamping, digital signature, e-voting, authentication, and private contracts are some of the key elements that will be explained during the session.

Internet Identity Workshop XXXII (#32) April 20-22

Identiverse 2021 June 21-23 • Denver, CO

Catch up with peers, meet experts and share best practices and insights from the last year in identity. Join us for a unique hybrid event as we change the future of the industry together.

News

Linux Foundation Announces DizmeID Foundation to Develop and Enable a Self-Sovereign Identity Credential Network

Blindsided by this news!!

The DizmeID Foundation and technical project will define and allow for implementation of Dizme features on top of Sovrin public identity utility. The Dizme ecosystem is expected to include various technological components leveraging Hyperledger stack and adding a monetization layer based on Algorand blockchain protocol, which will enable the exchange of verifiable credentials and the development of new vertical applications. 

Hyperledger Aries Graduates To Active Status; Joins Indy As “Production Ready”

“This approval is further evidence that Hyperledger Aries is a breakout success for the Hyperledger community,” said Brian Behlendorf, General Manager for Blockchain, Healthcare and Identity at the Linux Foundation. “Convergence on common libraries for the exchange of credentials will help speed the development of urgently-needed solutions and systems, ranging from education to finance to the fight against the pandemic. Aries is key to that convergence.” 

New versions of Multibase and Multihash published

Manu Writes:

Multibase is just a date bump to signal that the specification is still alive and being used. There are no new base-encoding mechanisms that have been added to this release. This community uses multibase to encode a number of values, like signature values and did:key values, in Verifiable Credentials and Decentralized Identifier Documents.

Multihash adds the KangarooTwelve hash function to the list of available hash functions. KangarooTwelve's strongest draw is that it's fun to say out loud... and is a KECCAK-based highly parallel hash function, if you're into that sort of thing. This community uses multihash in some DID Methods (Veres One uses it for hash values for operations and blocks), and the DID Core specification uses multihash for Hashlinks (to provide integrity protection for DID Documents).

Fairly boring stuff, but nevertheless, foundational building blocks upon which many of us have built our implementations.
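
As a quick illustration (not part of the announcement above): a multibase value carries its encoding in its first character, so a consumer can decode the remainder without out-of-band knowledge. A minimal JavaScript sketch, using a hypothetical did:key-style value with the 'z' (base58btc) prefix:

// Hypothetical example value; the leading character identifies the encoding.
var multibaseValue = 'z6MkhaXgBZDvotDkL5257faiztiGiC2QtKLGpbnnEGta2doK';
var encodingPrefix = multibaseValue.charAt(0);   // 'z' signals base58btc
var encodedPayload = multibaseValue.slice(1);    // the base58btc-encoded bytes follow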

eSSIF-Lab Infrastructure-oriented Open Call • Help us populate the eSSIF-Lab Self-Sovereign Identity Framework!

The 2nd deadline has passed, but the Infrastructure-oriented Open Call remains open. We are accepting applications until the final deadline on 30 June 2021 at 13:00 (Brussels local time).

Up to €155K funding for innovators to develop technical enhancements and extensions of eSSIF-Lab's Self-Sovereign Identity (SSI) Framework 

Decentralized Identity and DIACC PCTF Authentication

The Authentication component of the DIACC Pan-Canadian Trust Framework™ specifies processes and conformance criteria for service providers. Authentication and credential management services may be assessed against these criteria.

Identity Masters Podcast

At Authenteq, we spend a lot of time thinking and talking about how to make the internet a safer space. It’s also what drives our product roadmap and why we got into identity verification and e-KYC in the first place. As we work to democratize the knowledge we have access to, we know that for it to be truly accessible, we have to work with different formats. This is why we’re very excited to introduce the brand spanking new Identity Masters podcast now available on Spotify!

US Education Department promotes putting student records on blockchain

The COVID-19 pandemic has exposed flaws across various sectors. As a result, a number of government departments are evaluating blockchain-based systems as possible solutions for challenges involving multiparty workflows, record-keeping, transparency and more. 

For example, the United States Department of Education recently provided funding for the launch of the “Education Blockchain Initiative.” Referred to as the EBI, this project is led by the American Council on Education — an organization that helps the higher education community shape effective public policy — and is designed to identify ways that blockchain can improve data flow between academic institutions and potential employers.

Innovative Startups join forces to create a trusted training record and build a secure and safe waste data tracking system

Digital Catapult and Sellafield Ltd have selected two innovative startups, Condatis and Jitsuin, to implement advanced digital technology solutions that support the nuclear industry in continuing to monitor skills within the sector, and that provide a trusted and secure record for tracking hazardous waste and materials.

Benefits of SSI and Blockchain in Digital Identity • good-id

The needs and experiences of citizens establish how digital identity networks should preserve the freedoms and rights of users over the needs of the network. Transparency is explicitly mentioned as part of SSI, and it places a high emphasis on the importance of the public’s trust.

As we look to the future of digital identity, SSI principles with blockchain have already proven to be successful by bringing together stakeholders to create a mutually beneficial network.

Covid 19 Digital Vaccination Certificates -- Here Be Dragons!

This is a thread to keep an eye on. >> Anil John writes: 

Because I believe that this is an important conversation, I figured I would put together some high-level slideware that synthesizes and shares the answers I have provided directly to those who have asked. I am not in the hearts and minds business, so consider this in the spirit of the quote from Bruce Lee: "Absorb what is useful, Discard what is not, Add what is uniquely your own."

Happy to chat to share our mistakes, so that you don't need to repeat them, with those who have a public interest focus in this area.

The inevitable vaccine passports: Or, are they actually inevitable?

Until digital vaccination records are simple enough not to require a second thought around wallet, app, or credential format, we have a long way to go before they are inevitable.

CCI Knowledge Base

If you haven’t already, you might want to check out this Google sheet.

As our community continues to grow and the pandemic situation keeps evolving, this CCI Knowledge Base serves as a repository of ongoing COVID-19-related news, topics, research, and resources deemed relevant to our community and digital identity technology. It aims to provide an up-to-date database for our CCI members to access relevant information quickly in one place whenever they need it, e.g. when doing market research, developing their projects, or simply keeping themselves updated on the news.

Submit relevant news or articles for the database

Vaccine passports prove an ethical minefield

Any Covid-19 vaccine passport scheme set up in the UK could easily turn out to be discriminatory and invasive, and open the door to worse abuses of privacy in future, say security experts and campaigners.

Literature

Self-Sovereign Identity as the Basis for Universally Applicable Digital Identities

This paper addresses the role of digital identities for a functioning digital economy and outlines requirements for their management. [...] The concept of Self-Sovereign Identities (SSI) and the associated standards “Verifiable Credentials” and “Decentralized Identifiers” is a promising approach to improve the situation. They allow the flexible exchange of tamper-proof digital proofs between users and systems. Therefore, they form the foundation for building trust relationships in the digital space. This paper introduces the SSI paradigm and discusses the barriers that prevent the wide-scale adoption of this concept. 

Beyond SSI

DHS S&T Awards $198,600 to Develop Security and Privacy Testing of COVID-19 Contact Tracing Apps

The award went to AppCensus, a start-up based in El Cerrito, California, to develop testing and validation services for digital contact tracing applications (apps). The phase 1 award was made under S&T’s Silicon Valley Innovation Program (SVIP) Emerging Needs: COVID-19 Response & Future Mitigation solicitation, which addressed multiple near-term use cases in response to the pandemic and prepares DHS for future mitigation. AppCensus is the first of six start-ups to receive a phase 1 award.

“AppCensus AppSearch analyzes free publicly-available Android apps and reports the private and personally identifying information that different apps access and share with other parties over the Internet. We collect our results using a technique called dynamic analysis.”

Making your digital life easier with Government Sign-In by Verified.Me

Our recent rebranding from SecureKey Concierge to Government Sign-In by Verified.Me* was an extensive process that was done to benefit users even more. Streamlining who we are, what we look like and how we talk about the tools in our digital identity suite means we can better communicate with you – the user. Making your digital life easier and more secure is our goal in everything we do at SecureKey.

Letter to Attorney General Becerra Re: FinCEN Proposed Rule • Privacy concerns

Our concerns with the consumer privacy implications of this proposed rule are twofold:

First, the proposed rule’s requirement that MSBs collect identifying information associated with wallet addresses will create reporting that extends well beyond the intent of the rule or the transaction.

Call for Review – WebAuthn: An API for accessing Public Key Credentials Level 2 is a W3C Proposed Recommendation

Comments welcome through March 26

The Web Authentication Working Group has published a Proposed Recommendation of Web Authentication: An API for accessing Public Key Credentials Level 2. This specification defines an API enabling the creation and use of strong, attested, scoped, public key-based credentials by web applications, for the purpose of strongly authenticating users.
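
As a rough orientation (this sketch is not from the specification text above), a web application invokes the API from browser JavaScript along these lines; the relying party name, user details, and challenge bytes are illustrative placeholders, and a real deployment would use server-generated random values:

// Minimal sketch: ask the authenticator to create a new scoped, public key-based credential.
async function registerCredential() {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: new Uint8Array(32),           // placeholder; use random bytes from the server
      rp: { name: "Example RP" },              // placeholder relying party
      user: {
        id: new Uint8Array(16),                // placeholder opaque user handle
        name: "alice@example.com",
        displayName: "Alice"
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }]  // -7 = ES256
    }
  });
  return credential;  // credential.response holds the attestation for server-side verification
}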

MyData

Attitudes To Personal Data Management

In recent years, personal data has been an increasingly popular topic of conversation for marketers, data analysts, regulators, and privacy warriors. Individuals have learnt that recent regulatory updates have given them more rights over how that data is used. Are these two forces aligned?

We distributed a survey and received over 400 responses from both individuals and organisations answering questions about the management of personal data. How aligned are the two points of view? This infographic shows a summary of key questions and responses.

Thanks for reading. See you next week!

If you find this publication to be valuable for your efforts in digital identity, support its creation by contributing on Patreon.

Saturday, 27. February 2021

Aergo

AERGO Community Questions

In the past week we’ve been collecting questions and frequently asked items that were brought up in our Telegram channel. The AERGO team is all ears, and we’ve gone through the process of addressing queries that range from technical, the inspiring, the strange, to the burning 🔥 Let’s dive right in!

Q: Which oracle will be used by AERGO, or will there be just a default for dapps like Chainlink?
A: There is currently no official oracle as there has not been a need for one within the use cases of the enterprises that are operating on our chain. Necessary information for each customer is simply uploaded onto the chain.

Q: Are alternate protocols like Polkadot or Cardano possible solutions to the high cost of using ERC-20 liquidity tokens?
A: In the long term, we are looking to utilize and trade our mainnet tokens as a gas solution. Currently we have no plans to use other protocols.

Q: How will KDAC custodial efforts affect the AERGO token? Will tokens be used in these solutions? Will BitGo be using any AERGO products?
A: BitGo is not using any AERGO products. We are working to integrate the KDAC custody solution with the AERGO platform.

Q: Have people print stickers and paste them around their city, send you a picture, and the most-liked one gets a prize in AERGO. Spread the news of the winner. Give someone 5k AERGO or something for the most creative one.
A: We definitely do not want our community committing any illegal vandalism-type campaigns in their respective cities. I believe there are other ways to spread news about AERGO.

Q: Partnerships with several YouTubers, known and upcoming ones. (It might cost some money but at this point, it's a bull market. Spread the word; why are we missing this chance? Even if the projects aren't done, spread the NEWS of the upcoming projects and then deliver.)
A: We are working with our marketing agency to set up interviews and coverage by crypto influencers.

Q: Where is the updated Roadmap, please provide a concrete answer here.
A: We will be publishing our official roadmap within a week.

Q: Where are you on marketing, are you hiring someone to handle the different channels and keep twitter, reddit, facebook, website constantly updated with new and re-used information?
A: We have a great marketing agency that oversees our communication with the community. We will be doing more in the future with our website and will create additional transparency around the project.

Q: What happened with the Samsung Partnership, any new developments?
A: We are in talks; however, large companies such as Samsung are neither flexible nor quick on decisions. We are leveraging our relationship as best as we can within their guidelines.

Q: What happened with the Hyundai partnership, any new developments?
A: There are many divisions of Hyundai. Hyundai Mobis, an SI company, works with Blocko to service Hyundai Group companies with blockchain solutions. Hyundai Motors is a customer of Blocko that utilizes the AERGO Enterprise platform for a used-car-part inventory tracking solution.

Q: How is the AERGO Hub being received? Any feedback received?
A: We have just opened our beta testing program and just started to collect feedback. We will be sharing the feedback with the community soon.

Q: Would a buyback option be available in the future?
A: This is an option that can be considered in the future.

Q: Would you buy ad slots on Coinmarketcap to promote AERGO.
A: At AERGO mainnet launch we did purchase ads on the Coinmarketcap website; however, the return on investment was not high. We may consider it again for future major events. We do, however, send regular signals on Blockfolio.

Q: Can the website be updated with current news and information?
A: Current news and information are always posted on the website. We are looking into making it more easily visible.

Q: How do you feel about doing promotions on Twitter to spread the recognition of this coin, giveaway or something.
A: We would be open to ideas. We have done giveaways on AMAs in the past that were promoted on Twitter.

Q: New listings, we know you mentioned Coinbase, did you apply? What happened there? What about other exchanges, US consumers have no way to buy this coin.
A: Exchanges have very strict NDAs regarding listings. All we can say at this point is that one of our goals is to list on a US-based exchange.

Q: Thoughts on partnerships with other coins?
A: Yes, this is something we are brainstorming. If you have any ideas, we would definitely be open to hearing about them. We do not want to just announce a partnership for the sake of pumping the token price. We would like to do a real partnership where it can scale in the future and sustain a high token price.

Q: Have partnerships also promote AERGO, as it stands most partnerships just mention Blocko but never AERGO.
A: Press releases with Blocko’s customers are very tricky. Most of the high profile customers go through their PR department reviews. Mentioning themselves, and their SI partner is easy, but adding a 3rd party company into press releases is difficult.

Q: More community involvement and most of all ACTIVE social media handles. (Not a post here and then 2 or 3 weeks later another post)
A: We have AERGO Knights that we support to do posts for us, as well as our marketing agency. From time to time, AERGO team members participate in the official channels.

Q: Are you able to hire a company to handle the social media side who is able to be ACTIVE and constantly remind old users and new users alike of what is being worked on and what has been done already.
A: We are actively doing this, but we could do this more frequently.

Q: The United Nations bought 2.8M LTO coins (similar project to AERGO) and is going to use it for a digital land registry in Afghanistan. That is proof that huge organizations and governments are not afraid of blockchain tech any more. The AERGO team can use it as an argument in negotiations with companies. AERGO was going to do some land registry as well. What is the progress? Will it / does it use AERGO mainnet?
A: Every country has different laws regarding certification of documents. Just because one country uses blockchain one way does not mean another country will adopt its policy. The Korean government does not yet recognize certification through public blockchain; however, it does recognize private blockchain usage with certain licenses in place. Blocko was able to install AERGO Enterprise for a Korean land registry to certify land deeds and thwart forged deeds. Until the laws in Korea are changed (and we currently see changes in the right direction), we can only serve customers according to their current needs. But the good news is that Blocko and AERGO are positioned with our customers for the day public blockchain usage becomes certified.

Q: AERGO knows that they are partners of Samsung. But do Samsung know that they are partners of AERGO? Is AERGO really a partner of Samsung or is it just a marketing thingy?
A: Yes, Samsung knows AERGO is a partner of their Samsung Blockchain Wallet. Contracts were signed to be listed on the wallet. We continue to discuss programs that we can implement to promote both AERGO and their Blockchain wallet.

Q: Forbes is mentioning Samsung and its Nexledger on the Blockchain 50 list each year. Is there some way we could be mentioned instead/ next to Nexledger in the list?
A: I believe the answer is no, as Nexledger is Samsung SDS’ product.

Q: What is the strategy for AERGO in Korea? Won-Beom Kim shared there is one. Why do most Koreans not know about AERGO in its own region?
A: We have stepped up our marketing in Korea to the crypto investor community. AERGO is more widely known in the Enterprise segment versus consumer segment.

Q: Why not do live AMAs with big YouTube influencers like Boxmining or Ivan on Tech?
A: We can look into doing AMAs with influencers; however, we are mindful of not spending our marketing budget on everything. Influencers are expensive and we need to be strategic with our budgets.


Q: A weekly or bi-weekly summary of what’s been worked on would be useful for keeping the community updated on progress: for example, a timeline for all the projects that AERGO is working on, like ISDB, DTT, and CCCV.
A: We will consider making a dashboard to show progress of our projects/timelines.

Q: Where is the updated roadmap?
A: It will be published before the end of February.

Q: Why no marketing is being done?
A: Marketing is being done regularly, as we have an agency that helps us with marketing AERGO.

Q: What happened to the Samsung partnership?
A: We have a great relationship and will continue to find joint projects.

Q: Request to reduce supply to 50m from 500m in the ratio of 10:1 like algorand
A: Could you supply a compelling reason for this technically, and not just for token price pumping?

Q: What is AERGO’s plan to increase community engagement and draw in developers to build on mainnet and drive network effect
A: We are hoping to do this with our AERGO Hub product. We will also create programs that incubate compelling developers/projects on AERGO.


Q: Are there any plans for community events?
A: We are looking into online events.

Q: What is this year's roadmap? We were told the team is working on releasing it soon
A: The roadmap will be published within the month of February.

Q: Have u thought about renewing the brand? New website, Logo etc.?
A: We are looking to refresh the brand with an easier view of relevant information.

Q: What date will the roadmap be released? The team has been saying we’ll let you know for weeks. It’s been almost 2 months since the new year and halfway into the quarter. We need a date.
A: The roadmap will be published in the month of February.

Q: What’s happening with Samsung DeFi partnership? There was meant to be a competition to promote dapps being built on AERGO. What is the update ?
A: We are currently working on the fine details.

Q: What do you plan specifically in terms of marketing? Blog on website, reddit, etc.?
A: We have published a steady stream of blogs and updates. We have not published on Reddit; however, we are looking into assigning someone to this. We are also looking into frequent micro-promotions.

Q: Jae Shin mentioned that the team hopes for Korean government to start to recognize crypto currencies as a security. What is the reason for this?
A: Currently, cryptocurrencies are not recognized in Korea as a form of security. This prevents banks from using crypto as a monetary instrument and means there is no recognized need for custody. Because many countries do not recognize crypto as a security, many services are held back from using crypto as a replacement for fiat.

Q: How many tokens are currently staked?
A: Currently there are 48 million AERGO tokens staked, which is about 9.6% of circulating supply.

Q: What separates AERGO from other leading enterprise blockchains and platforms?
A: Unlike other enterprise blockchains, AERGO has entered the real business market with a partner called BLOCKO and has the ability to expand further. BLOCKO supports easy interworking between private and public blockchains through solutions such as AERGO Enterprise Manager.

Q: Is AERGO/Blocko looking at implementing DeFi? To clarify, I'm aware of the KDAC/Kobit move. I'm talking about yield farming within AERGO, or if AERGO would look at lending, loaning, insurance provision or being an aggregator?
A: AERGO/BLOCKO has no plans to directly develop DeFi and issue tokens. However, AERGO is preparing for the DeFi/Dapp Contest and Incubation, and is looking forward to creating a new business based on the AERGO mainnet.


Q: Are there any plans to list AERGO tokens on Uniswap? If yes, what is the status and when can we expect to see AERGO on Uniswap?
A: The AERGO Foundation does not provide liquidity directly to Uniswap. The community has listed AERGO pairs. You can check by looking up the latest contract address (0x91Af0fBB28ABA7E31403Cb457106Ce79397FD4E6).

We appreciate all the questions you have asked the team and understand your concerns. 2021 has plenty in store for all of us; we will be releasing our updated roadmap soon, so keep an eye out for that! In the meantime…

Join our Community!

Discord | Telegram | Telegram RU | Telegram ANN | GitHub | Twitter | Medium | Reddit | KakaoTalk | Weibo (CN)|

AERGO Community Questions was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 26. February 2021

Secure Key

Making your digital life easier with Government Sign-In by Verified.Me

The post Making your digital life easier with Government Sign-In by Verified.Me appeared first on SecureKey Technologies Inc..

Coinfirm

Drawing Paths to Risk Sources in the AML Platform

When generating Risk Reports, Coinfirm’s AML Platform checks over 270 algorithms and risk analysis scenarios. Some are direct risks (such as addresses belonging to a hacker or addresses present on sanction lists) and some are indirect. What are indirect risks? These indirect risks appear when a blockchain address that Coinfirm is investigating has transactions with other...

Cozy Cloud

Product update

This article is also available in French.

To make your Cozy, your best ally for 2021: this is the main objective of our team.

What's new in Cozy?

💶 Cozy Banks

New banks and banking institutions have been added to your banking aggregator, Cozy Banks (Revolut, N26, Natixis Interépargne (Epargne Salariale), etc.).

More than 90 banking institutions available and to be connected to your Cozy

The Banks application is also available on iOS and Android.

🗒 Cozy Notes

While waiting for the possibility to add images in Notes, our team has implemented commands to enrich your notes with version 1.14.0, which includes:

✨ an improvement of the tables: cell merging, choice of column size, coloring of boxes
✨ adding a date
✨ the addition of blocks for code

Cozy Drive - web

✅ Guaranteeing you certified documents

Interface of documents certified by your Cozy

Your Cozy acts as a trusted third party and certifies that the document conforms to the one uploaded and has not been modified. Each unmodified document appears today in your Drive folder with a certification marker (see the CAF example above).

It is a big job for our team. In the future, when you present an invoice as proof of address (for example, when registering for your children's nursery or applying for a mortgage loan), your Cozy will guarantee to your contacts that it is indeed the original document and that it is trustworthy. This functionality will be enhanced to guarantee you more "certified" documents; this is not yet the case today.
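
Cozy does not detail its certification mechanism in this post. As a general, hypothetical illustration of how a service can attest that a file is unmodified, a content-hash comparison in browser JavaScript might look like this (both the function and parameter names are invented for the sketch):

// Sketch only: compare a file's current SHA-256 digest against the digest
// recorded when the document was first uploaded.
async function isUnmodified(fileBytes, digestRecordedAtUpload) {
  const digest = await crypto.subtle.digest('SHA-256', fileBytes);
  const hex = Array.from(new Uint8Array(digest))
    .map(function (b) { return b.toString(16).padStart(2, '0'); })
    .join('');
  return hex === digestRecordedAtUpload;
}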

✅ Making it easier for you to share folders with Drive and Notes

The collaborative sharing functionality in Drive is continuously improved.
For example, by creating a new folder in your Drive, you can:

 - create a note (here: 20210216 Mathematics courses, with Pythagoras Theorem as subject)
 - add a link to the definition of the Pythagorean theorem from the Wikipedia site
 - add a link from another site to complete the folder you are going to share

Shared folder interface with bookmarks

The Drive application is also available on iOS and Android (and also on apk😎)

🧠 How to contribute to the project

Many of you have asked how you can contribute to Cozy and help us democratize the personal cloud 🚀🚀

How to do it concretely?

👉 share a piece of code when developing an application or connector that automatically retrieves your data from an online service for our platform
👉 report your experiences with Cozy on our forum or to Claude
👉 if you are a developer, join our team! We are currently recruiting on the front-end side

Thank you for reading this article!

👉 If you're a fan of Cozy, talk about it around you 🙌.

📚 List of our latest articles:

 - 2021: what now?
 - 2021: why you should use a password manager
 - Customer support: you'll know all about Claude
 - Tech: how we have improved the performance of our applications thanks to data indexing
 - The Social Dilemma: a documentary that will get you off social networks (for good!)
 - Artificial Intelligence: choosing decentralized machine learning to respond to the health crisis

IBM Blockchain

Blockchain Newsletter for February: Sustainability, COVID-19, crypto and digital events

Of the 96 companies that have been included in the Forbes Blockchain 50 since 2019, only 12 have made the list for all three years — including IBM. Three other 2021 listees have partnered with IBM for their claims to Forbes fame. A.P. Moller-Maersk made the list for TradeLens, the international shipping platform co-created with […]

The post Blockchain Newsletter for February: Sustainability, COVID-19, crypto and digital events appeared first on Blockchain Pulse: IBM Blockchain Blog.


Dark Matter Labs

Building Mental Wealth

Rebuilding our foundational infrastructure

We need to step into a phase of rebuilding. Rebuilding not towards what we had but towards an increasingly antifragile and empathetic society. This transition will require governments, institutions, companies, and associations to rethink and reconstruct the contextual settings [FIGURE 1] in which people operate: society’s platform for the future. It is increasingly clear how the micro-molecular violences of late-20th-century society (e.g. air pollution, noise pollution, light pollution, household stresses, nutrient decline, etc.) foundationally impact our mental health and wellbeing and thereby limit the collective capacity of our society.

Remaking our institutional, environmental and cultural contexts will be essential to the development of people’s cognitive, emotional, and social capacities. In a time of complexity and uncertainty, these capacities provide foundational infrastructure for our 21st- and 22nd-century society and economy. They are of the highest priority during the nascent rebuilding phase.

Figure 1: Overview of the contextual settings that drive mental wealth

But how do we work towards rebuilding people’s cognitive, emotional, and social capacities when there are increases in depression rates, suicides, domestic violence, and unemployment? How do we rebuild public good and the ability to imagine another future when existential hope seems lost?

Drawing from our experiences across Sweden, Canada, and South Korea, we have identified five vital areas of focus during the rebuilding phase. This blog post examines key lessons that have been learned.

Five key take-aways

1. Language and metrics for a systemic policy architecture

Through conversations and research it has become clear that the taxonomy of mental health is largely associated with treatment, failure to treat, and the clinical space of operation. This reality has structurally limited our capacity to systemically perceive and allocate resources to prevention and, perhaps more critically, thriving. (We define thriving as the substantial improvement of mental health at a societal scale; an achievement equivalent to the 20th-century feat of doubling life expectancy.)

Further, if we are to rebuild a societal-scale and entangled infrastructure, we need to start by describing the same thing, using a shared language. Currently, our taxonomy landscape is not only varied, but also a driver of different objectives: either it encourages the legitimate need to increase funding for treatment and support services, or it encourages genuine investment in deep prevention and thriving. A plurality of perspectives is visible across contexts: New Zealand, Wellbeing Budget; the World Health Organisation, Mental Health; the UK Government, Mental Capital and Wellbeing; and the Swedish Government, Psychological Health.

We believe we currently don’t have the descriptors necessary to make a case for this entangled infrastructure, and we are thereby operating in a state of ambiguity and challenge, trying to chart a shared way forward. Therefore, our next step needs to be a shared language and taxonomy, taking into account the nuanced cultural, historical, and contextual differences and the added complexity they generate.

In software engineering, ubiquitous language describes the idea of domain specific languages. While this is not directly comparable, it is possible to learn how shared language among actors in a single domain can enable rigorous and reliable communication between them, unlocking the potential for collective problem-solving. In the absence of existing terminology, we will refer to a shared language for society’s cognitive, emotional, and social capacities as mental wealth.

In addition to providing clarity and direction, shared language can further encourage the design of shared metrics. These metrics help us understand the current mental wealth of a specific context, its evolution over time, and how it compares to other contexts. What is learned from data collection and analysis can inform the development of new policies that consider the entangled challenge of rebuilding our mental wealth.

There are already some metrics in place for mental health using zoning systems. However, these are based on the individual scale, while mental wealth is concerned with the societal scale. Once again, language becomes important. As mental wealth involves multiple actors and domains, we need to be able to make sense at scale, as well as across sectors and departments. Without this common ground, it becomes difficult to write cross-contextual policies. This seems to be one of the reasons why a systemic policy architecture for mental wealth is nowhere to be seen.

2. Investing in an entangled infrastructure

We believe that mental wealth cannot be seen as a cost. Instead, it needs to be seen as an infrastructure enabled through societal-scale investments. By nature, mental wealth is a systemic problem, driven by multiple factors [FIGURE 2]. Investing in it requires us to move beyond short return cycles and single-point investments. We need to co-curate portfolios of investments that are able to reach into the future, span sectors and departments, and unlock innovation.

Figure 2: Drivers of mental health and wellbeing and the investment classes required to build mental wealth

Building a mental wealth portfolio requires us to focus on improving the contextual settings in which people operate. These settings create the foundation on which society’s cognitive, emotional, and social capacities can develop. This could include:

 - Increasing a community’s nature-based areas, for example by introducing urban forests.
 - Reducing nighttime light pollution, for example by adopting dark-sky solutions.
 - Supporting the availability of sports and other activities, for example through initiatives like Sparks Generation.
 - Developing non-polluting public transport that provides citizens with access to their surroundings, as exemplified by Mobility as a Service.

Each intervention of this type has spillover effects across sectors and departments. They require us to build the capacities and the elements that coordinate, measure, transact, pool, and aggregate the co-beneficiary results, their spillover effects, and their beneficiaries [FIGURE 3]. Without the ability to do this, we will struggle to invest in the entangled infrastructure of mental wealth.

Figure 3: Co-benefits, spillover effects, and beneficiaries of a community urban forest

3. An accounting framework for the future

It is often said that we make what we measure. If this is the case, there is a risk that investment in mental wealth could be under-prioritised as a result of missing metrics. Without them, we lack the data to build a case. More importantly, we are deprived of the ability to account for, drive, and value the co-beneficial results and spillover effects of a mental wealth portfolio.

For example, healthcare institutions know well how to account for the cost of treatment. They have done so for generations, with balance sheets remaining within health departments. However, when it comes to health budgets and accounting for prevention, it is unclear how they account, individually or collectively, for the costs and values created, especially as these often relate to future risks and the liabilities of not addressing the risks.

There is, then, a case for introducing an accounting framework that spans across silos and is able to account for the long-term effects of our mental wealth infrastructure, including benefits, risks, and liabilities. Values which might otherwise have disappeared from the public good become the very reason for investment in the first place.

4. Capacity for horizontal collaboration

Unless we are able to build distributed capacity to collaborate and drive a co-beneficial thesis, we risk becoming permanently stuck in silos. Today, many of us are trained to optimise one silo at a time; a reality where the boundary of a problem is defined by the name of the department. However, mental wealth is a wicked problem that ignores and spans boundaries. It is necessary, therefore, that we build the human capacity to (re)organise for an entangled and dynamic system.

Governments need to be able to collaborate across multiple domains and actors, creating a shared portfolio of interventions to invest in. These should enable:

 - Horizontal innovation across whole value chains.
 - Shared understanding of the portfolio created.
 - A foundation to make collective decisions.
 - Distributed agency to move different interventions forward.

This approach can move us beyond departmental and sectoral divisions and address the reality of a dynamic system.

Figure 4: The reinforcing feedback loop between horizontal collaboration and mental wealth

Building the human capacity to collaborate horizontally is, at the same time, dependent on our mental wealth. Without developing people’s social and emotional capacity, it becomes difficult to unlock complex collaboration. It is necessary, therefore, to design capacity-building processes that incorporate social and emotional training. Without it, we will struggle to implement co-beneficial solutions where the human is no longer optimised to be a ‘bad robot’ but, rather, a highly cognitive, emotional, and social being.

Horizontal collaboration further requires enabling cross-departmental and sectoral budgeting. The healthcare sector is facing a treatment crisis and the education sector a learning crisis. Asking these sectors to move significant budgets from doctors, nurses, and teachers in order to invest in an urban forest becomes problematic, even in cases where it could prevent more people from showing up at socially distanced waiting rooms or from drifting away during an online class owing to a neighbourhood's poor air quality. Therefore, we need to build the elements that allow us to pool funds horizontally, allowing health and education sectors to use seed funding as a directional tool and inviting other investment-strong departments to follow and provide the societal-scale investments required.

Once the horizontal budget is on its way, we need to talk about procurement. Procurement is the most common tool for investing in infrastructures, especially in relation to the public sector. It has many upsides and downsides, laid out well by our colleague Alastair Parvin in ‘After the Crisis’. The most significant challenge regarding mental wealth is the ability to procure for co-beneficial outcomes across departments. The European Commission refers to public-public cooperation as an enabler of contracts between departments without having to call for tender. However, it is clear that these cooperations are still rare cases and do not allow the involvement of private bodies. This reality prompts the question: how do we increase procurement for co-beneficial outcomes within and beyond the public sector, for the public good?

We do not have all the answers to this question. However, we have shared initial thoughts about it in our ‘Bridging the Investment’ series and look forward to continuing to unpack how to enable horizontal collaboration for mental wealth.

5. Insuring societal scale innovation

Today, large investment goes towards treatment-related innovation. However, the societal-scale innovation required to rebuild an entangled infrastructure goes beyond treatment and sits between sectors and departments. Traditional financing models for grants and public funding struggle to provide the speed or quantity of capital required to bridge this innovation gap. Impact investing also falls short, as risk models cannot price the scale or nature of the societal solutions required. As such, innovation for mental wealth tends to fall outside eligibility criteria.

Therefore, we ask ourselves: how do we fund the necessary innovation at the speed and scale required? The answer may not lie in innovation itself, but in the ability to dynamically match innovation to future, societal-scale liabilities. In an age of complexity and uncertainty, the matching of these liabilities with potential hedges or solutions is a structural issue for innovation financing. This is what is called a settlement risk, and unless we bridge the risk between liability holders and innovators, we might not be able to ensure the societal-scale innovation required to rebuild our mental wealth.

Figure 5: Financing ecosystems for multi-actor innovation capacity

There are still many questions related to how this could be done. Currently we are exploring the potential of creating financing ecosystems for multi-actor innovation capacity and a contract that can transfer the settlement risk to underwriters. At the centre of such an ecosystem, a Public Interest Trust operates to 1) pool the liabilities of mental illness and 2) aggregate the investment required to fund innovation for mental wealth. The high investment risk would be underwritten by grant and philanthropic funds, rather than being provided by them [FIGURE 5]. Thereby, larger investments can be attracted, creating a financial hedge against the settlement risk. Over time this hedge would be transformed into a natural hedge, as the innovation becomes the insurer of the liabilities.

Due to their size and density, cities are currently the main liability holders of mental wealth, and innovators are the ones providing the solutions able to hedge these risks. Therefore, we see cities (including public, private, and civic actors) and innovators (public, private, and amateur) as the main actors in this ecosystem, which we argue could help us bridge the innovation gap and support the systemic rebuilding of our mental wealth [FIGURE 6].

Figure 6: Enabling the rebuilding of Psychological Wealth

Moving towards rebuilding

As we look forward to a society that no longer favours precariousness and starts rebuilding an antifragile foundation for mental wealth, we recognise the need for a “boring revolution”: a revolution focused on delivering a co-beneficial economy that transitions our contextual settings; a revolution of how we account, how we bridge silos, and how we invest in societal infrastructures.

Our mental wealth is the foundational entangled infrastructure that enables society to build back better. It is an infrastructure in need of a revolution to manifest, and without it, we will not be able to create the inclusive societal capacities that unlock the full potential of our 21st- and 22nd-century reality.

This is a revolution necessary for the future, equivalent to the late-19th- and 20th-century revolution of public health that yielded the 2nd industrial revolution.

We are grateful to have had the chance to explore these questions with passionate people. Today we would like to give special thanks to: Fredrik Lindencrona, Camilla Evensson, Caroline Steinersted, Dan Hill, Gabriella Dahlberg, Sebastian Meijer, Tomas Bokström, and the team at Dark Matter Labs: Anna Rosero, Eunji Kang, Hyojeong Lee, Indy Johar, Linnéa Rönnquist and Pam Sethi.

Text by Linnéa Rönnquist, Indy Johar and Richard Martin. Graphics by Linnéa Rönnquist and Hyojeong Lee

Building Mental Wealth was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


KYC Chain

Compliance Challenges for VASPs and Automated KYC Solutions

Expanding global AML regulations are driving compliance costs up for financial service providers, including crypto companies. This article explores how Automated KYC Solutions can help reduce financial pressure on startups while ensuring they attain compliance through robust AML protocols.

The post Compliance Challenges for VASPs and Automated KYC Solutions appeared first on KYC-Chain.

auth0

Build a User Signup Counter with Arduino, Part 1

Learn how to build a physical user signup counter for your Auth0 tenant with Arduino.

Okta

Unity WebGL + PlayFab Authorization in 20 Minutes

As game creators, we hold a fundamental responsibility to protect and secure any and all data that our players entrust to us. Historically, security in games has not been high on the priority list because games were enjoyed anonymously and offline. But with our society becoming ever more connected, demand for social gaming experiences and online gaming features has increased exponentially. This has led to the development of cloud-based gaming platforms, such as PlayFab, that handle everything from multiplayer server hosting, leaderboard tracking, virtual economy, data/analytics, and much more. Even games that are mostly considered “offline” are seeing increased demand for some connected functionality. As more games are developed in the cloud (or developed with cloud-based platforms) and track an increasingly large amount of user data, security is becoming one of the highest priorities for the games community.

In their 2020 State of the Internet / Security report, Akamai tracked 100 billion credential stuffing attacks from June 2018 to June 2020 and found that 10 billion of these attacks were targeted at gamers. It is not just up to the player or the distribution platform to take security seriously. Players expect that the game companies producing the products they entrust their data to will also keep them secure.

In Identity Security for Games in C# with Unity, we took a look at the extreme basics of storing and authenticating users in a Unity project. That example is a great starting point for the fundamentals, providing a “hello world” concept within the Unity editor. I want to expand upon how the design can change depending on the build target and how to make use of a player’s authorization by passing it into other back-end platforms. To do this, we will be building a WebGL application in Unity that authenticates players and authorizes them to Azure PlayFab, where their player data will be stored. Here is a conceptual overview of what this will look like:

Prerequisites

Sign up for a free Azure PlayFab account.

Sign up for a free Azure account.

Download and install Unity.

Create a new Unity project from Unity Hub with the 3D template.

Download and configure the PlayFab Unity SDK and Editor Extensions.

Build Target and Design

For this project, I will be targeting Unity WebGL. The build target is important because different platforms will require different means of facilitating user interaction.

In Identity Security for Games in C# with Unity, I described both native and OAuth design concepts. While building out a user interface for authentication natively in the engine might seem like the best approach because of user experience, it is not the preferred approach for security, and it typically adds much more effort for the developer. This is because it requires the developer to build out logic supporting the entire authentication state machine: securely handling every event, every MFA state, registration, MFA enrollment, user self-service (account unlock and password reset), etc.

Prebuilt widgets can simplify the amount of effort in this regard. However, developers would still be asking a player to enter their password into an untrusted UI that the developer is responsible for and typically not able to fully keep secure. This is why OAuth is considered the best-practice standard for authorization within the security industry, and utilizing a security platform’s hosted login experience is considered the most secure way to handle authentication. Players will be entering their credentials into a trusted browser window and a trustworthy security platform. Best of all, developers can share the responsibility of keeping the login experience secure with the security platform, which is much easier and more secure than attempting to do it all themselves.

Ok, great. But how does this relate to the build target? The browser. OAuth relies on a browser to facilitate user authentication, which must validate the user identity before providing authorization. Every platform will have a different, platform-specific way of presenting the user with a browser to interact with. For WebGL, the game is already running in a browser, so a simple popup is all that is needed. But if the build target were Android, Chrome Custom Tabs would instead be needed to embed the browser into the app. Similarly, if the build target were iOS, Safari View Controller would be needed to embed the browser into the app. For full-screen games on a PC, it would be better to use a device code concept, similar to a TV or IoT device, or even to authenticate the player before launching the game client, from an external launcher application interacting with the default browser on the operating system. The browser interaction and design will change for every platform and device. When designing the authentication experience, the build target will heavily influence the rest of the design.

The good news is that once the target is defined and the browser interaction decided, the rest of the code should be similar across all platforms. This is thanks to the shared responsibility with a security platform’s hosted login experience and a standardized authorization specification. The job of the developer is to securely provide a platform-preferred browser experience for the user and direct them to the OAuth authorize endpoint. Once the user has authenticated, the browser is closed, and an auth code is captured in client code to be exchanged for tokens behind the scenes. With tokens in hand, user authorization can be passed to all other backend cloud platforms, providing a centralized and secure identity design that drastically simplifies the implementation effort.
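
As a concrete sketch of that last step, the code-for-tokens exchange could look like the following in JavaScript; the token endpoint, client ID, and redirect URI are placeholders rather than values from this tutorial, and a public client like this one would normally also use PKCE:

// Exchange the captured auth code for tokens at the authorization server.
async function exchangeCodeForTokens(authCode, codeVerifier) {
  const response = await fetch('https://example.okta.com/oauth2/default/v1/token', {  // placeholder endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type: 'authorization_code',
      code: authCode,
      client_id: 'YOUR_CLIENT_ID',                        // placeholder
      redirect_uri: 'https://example.com/callback.html',  // placeholder
      code_verifier: codeVerifier                         // PKCE verifier created before the redirect
    })
  });
  return response.json();  // on success: access_token, id_token, etc.
}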

With a build target in mind, let's look at how to code this in Unity.

Create the Project

First, if you have not already done so, create a new project from Unity Hub with the 3D template.

Next, configure the Playfab Unity SDK and Editor Extensions using the Unity Quickstart provided by PlayFab.

Create a new scene by clicking File, New Scene and save it in the project's Assets > Scenes folder.

Change the project to target WebGL by clicking File, Build Settings, WebGL, and then clicking Switch Platform.

Note: If you did not install WebGL when you installed this copy of the editor, you will be prompted to do so now.

Make sure that your newly created scene is checked in the Scenes In Build menu, and then close the Build Settings UI.

Note: If you do not see your scene, click the Add Open Scenes button in the bottom right of the Scenes In Build menu.

Right-click in the Scene Hierarchy panel and click UI, Button.

This will be the button that triggers the authentication request.

Right-click the button in the Hierarchy panel, click Rename, and name the button Sign In.

Expand the Sign In button in the Hierarchy panel to expose its text object and select it.

Change the text of the button in the Inspector panel to Sign In.

Interact With the Browser

With the project ready, the next thing needed is a way for the WebGL object to interact with the browser HTML that is holding it. This is because the C# in Unity will be called when the user clicks the Sign In button and that C# code needs to instruct the browser to render a popup window for the user to complete authentication.

To do this, a jslib is used to act as the middleman between the C# in Unity and the JavaScript in the WebGL object's parent page.

Open up a preferred text editor, create a new file called OAuth.jslib, and save it in your project’s Assets > Plugins folder.

Add the following to OAuth.jslib and save it.

mergeInto(LibraryManager.library, {
  startAuthentication: function (utf8String) {
    var authorizationRequest = UTF8ToString(utf8String);
    startAuth(authorizationRequest);
  }
});

This code very simply takes an OAuth authorization request URI, passed to it from the calling C# in Unity, and passes it to a corresponding startAuth() function in the parent page's JavaScript. This startAuth() function will instruct the browser to open a popup and redirect to the OAuth authorization URI passed to it by the C# in Unity. The WebGL object calls the jslib, which runs in the browser, allowing it to interact with the code running in the browser where the WebGL object can't.

Next, create the startAuth() function in the JavaScript within the WebGL’s parent page so that the C# can interact with it via the OAuth.jslib. To do this, start by building the WebGL project in Unity, which will generate an index.html that renders the WebGL object.

Click File, Build Settings and then click Build.

Select a preferred output folder and then click Select Folder.

Once the build process is complete, open the output folder and then open the index.html file in a preferred text editor. The contents of index.html may vary depending on whether you are on Windows or Mac and on which version of the editor you are using. The important thing to note in what Unity generates is that there should be a <script> element towards the top of the HTML that contains the following:

<script>
  var unityInstance = UnityLoader.instantiate("unityContainer", "Build/WebGL2.json", {onProgress: UnityProgress});
</script>

unityInstance is the container that holds the WebGL object. This is how the page’s JavaScript will be able to call back to the C# code in Unity using:

unityInstance.SendMessage(objectName, methodName, value);

Find the <body> tags in the HTML and locate the closing </body> tag. Right before the closing tag, at the end of body, add the following:

<script>
  function startAuth(authorizationRequest) {
    // Listen for the message posted back by the callback.html popup.
    window.addEventListener('message', function (e) {
      unityInstance.SendMessage("OAuth2", "getAuthResults", e.data);
    }, false);

    // Open the authorization request in a popup window.
    var childwin;
    const childname = "Authenticate";
    childwin = window.open(authorizationRequest, childname, 'height=300px, width=500px');
    childwin.focus();
  }
</script>

This is the JavaScript function that the Sign In button in the WebGL object will trigger. The button does this by first calling a function in C# on its click event. That function then calls the OAuth.jslib, which in turn calls this startAuth() function in the page JavaScript. This is the best way for a WebGL build to provide browser interaction to the player from within the WebGL object.

window.addEventListener('message', function (e) {
  unityInstance.SendMessage("OAuth2", "getAuthResults", e.data);
}, false);

window.addEventListener() creates an event listener that will be called by the child popup window when authentication is complete. The message payload, e.data, carries the resulting Auth Code back to the C# in the WebGL object via unityInstance.SendMessage().

var childwin;
const childname = "Authenticate";
childwin = window.open(authorizationRequest, childname, 'height=300px, width=500px');

With the event listener ready, window.open() is used to create a popup window to the OAuth authorization request URI.

Finally, childwin.focus(); brings the popup into focus for the user to interact with.

Now that the WebGL's parent page can initiate the authentication process and wait for a response, the next thing needed is a separate callback.html page for the OAuth authorization server to redirect the user back to after authentication. Because the authentication process happens in the parent page's popup window, the callback.html page will be rendered in that popup once authentication succeeds. This allows the callback.html page to receive the resulting Auth Code and send it back to the parent page via the event listener.

In the same folder as index.html create a new file called callback.html using a preferred text editor. The HTML in this page should be designed based on project needs, but it typically will not be rendered long enough for a player to see it. Add the following and customize the HTML as desired:

<!DOCTYPE html>
<html lang="en-us">
<head>
  <script>
    // Parse query string helper function
    function getParameterByName(name, url = window.location.href) {
      name = name.replace(/[\[\]]/g, '\\$&');
      var regex = new RegExp('[?&]' + name + '(=([^&#]*)|&|#|$)'),
          results = regex.exec(url);
      if (!results) { return null; }
      if (!results[2]) { return ''; }
      return decodeURIComponent(results[2].replace(/\+/g, ' '));
    }

    // Get auth code from query string
    var code = getParameterByName('code');
    // Get state from query string
    var state = getParameterByName('state');

    // Send both back to the parent window's event listener, then close the popup.
    window.opener.postMessage(code + "," + state, "*");
    try { window.close(); } catch (e) { console.log(e); }
    try { self.close(); } catch (e) { console.log(e); }
  </script>
</head>
<body>
</body>
</html>

getParameterByName() is a helper function that will parse the Auth Code and OAuth State from the query string.

window.opener.postMessage() sends the State and Auth Code to the parent window’s event listener so that it can send them back to a callback function in the WebGL’s C#.

Finally, the popup window is closed, putting the player back on the parent page where the WebGL object is waiting for authentication to complete.

With the browser portion of the project complete, the next step is to create the OAuth code in Unity that will be called by the Sign In button’s on click event.

Note: Now that customizations have been made to index.html, it will need to be backed up and replaced every time the project is built because Unity will overwrite it each time.
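One way to avoid that manual step is a small editor script using Unity's IPostprocessBuildWithReport callback to copy your customized pages over the generated ones after every WebGL build. This is a sketch only, assuming your backups live in a hypothetical Assets/HtmlBackup folder; the script itself must be placed in an Assets/Editor folder.

using System.IO;
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;

public class RestoreCustomHtml : IPostprocessBuildWithReport
{
    public int callbackOrder => 0;

    public void OnPostprocessBuild(BuildReport report)
    {
        if (report.summary.platform != BuildTarget.WebGL) return;

        string outputPath = report.summary.outputPath;
        // Overwrite the freshly generated pages with the customized backups
        // (Assets/HtmlBackup is an assumed location, not part of the tutorial).
        File.Copy("Assets/HtmlBackup/index.html", Path.Combine(outputPath, "index.html"), true);
        File.Copy("Assets/HtmlBackup/callback.html", Path.Combine(outputPath, "callback.html"), true);
    }
}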

Adding OAuth

Configure an Authorization Server

Many OAuth authorization solutions exist today. Okta is the industry-leading, best-of-breed identity security platform on the market and allows for creating an unlimited number of OAuth authorization servers quickly and easily.

Okta provides the easy-to-use Okta CLI, which will simplify the registration and setup process. First, install the Okta CLI for your preferred operating system. With the CLI installed, run the following command:

okta register

Follow the instructions to register for a new Okta account; the CLI will handle the Okta configuration automatically. If you already have an Okta account, use the following command instead:

okta login

Note: okta login requires a valid Okta API Token to be entered. This will be stored in the CLI’s okta.yaml file locally.

Next, run the following command to set up an OAuth app.

okta apps create

Give the app a name, and select option 2, Single Page App, for the application type.

The Redirect URI needs to point to where the callback.html page will be hosted. Use the following URI format, replacing appName with a name that will be used later when deploying the Unity project to Azure App Services.

https://appName.azurewebsites.net/callback.html

The Post Logout Redirect URI will need to point to where the index.html page will be hosted. Use the following URI format, again replacing appName with a name used later when deploying the Unity project to Azure App Services.

https://appName.azurewebsites.net/index.html

Finally, select the default authorization server if prompted.

The CLI will create the application and return an Issuer along with a Client ID. Make sure to take note of both, as they will be used when creating the OAuth script.

Create the OAuth Script

In Unity, install Newtonsoft.Json for Unity using the instructions outlined in the library repo.

Next, create a new folder in the Assets folder and name it Scripts.

Right-click in the Scripts folder, click Create, and then select C# Script. Name the script OAuth2 (the file name must match the MonoBehaviour class name defined below so Unity can attach it to a game object) and double-click to open it in Visual Studio.

Start by referencing the following libraries:

using System.Collections.Generic;
using UnityEngine;
using System;
using System.Text;
using System.Security.Cryptography;
using System.Runtime.InteropServices;
using Newtonsoft.Json;
using UnityEngine.Networking;

Next, create a new class called OAuth2 and declare the instance variables needed to hold the client's OAuth configuration.

Note: The instance variables could pull from a preferred configuration solution, or be exposed to the editor for configuration if desired.

public class OAuth2 : MonoBehaviour
{
    [DllImport("__Internal")]
    private static extern void startAuthentication(string authRequest);

    // OAuth2 Client Configuration
    private const string clientID = "{clientId}";
    private const string authorizationEndpoint = "https://{yourOktaDomain}/oauth2/default/v1/authorize";
    private const string tokenEndpoint = "https://{yourOktaDomain}/oauth2/default/v1/token";
    private const string userInfoEndpoint = "https://{yourOktaDomain}/oauth2/default/v1/userinfo";
    private const string redirectURI = "https://appName.azurewebsites.net/callback.html";

    private string state;
    private string code_verifier;
}

Notice the DllImport attribute at the start of the class referencing the startAuthentication() function from the jslib. This imports the jslib function so that it can be called from the script.

The instance variables define the OAuth endpoints to be used, along with the URI to the callback.html page.

Replace {yourOktaDomain} in the code above with the Okta domain for your Okta org. Change appName in the redirectURI variable to match the app name used when configuring the approved redirect URI in Okta. Remember, this app name will be used later when hosting the project on Azure App Services.
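As the earlier note suggests, the same values could instead be exposed to the Unity Inspector rather than hard-coded as constants. A minimal sketch of that alternative follows; the class and field names here are illustrative, not part of the tutorial's code.

using UnityEngine;

public class OAuth2Settings : MonoBehaviour
{
    // Editable per environment in the Inspector instead of const fields.
    [SerializeField] private string clientID = "{clientId}";
    [SerializeField] private string oktaDomain = "https://{yourOktaDomain}";
    [SerializeField] private string redirectURI = "https://appName.azurewebsites.net/callback.html";

    // Endpoints derived from the configured domain.
    public string AuthorizationEndpoint => oktaDomain + "/oauth2/default/v1/authorize";
    public string TokenEndpoint => oktaDomain + "/oauth2/default/v1/token";
}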

Create a new function called Authenticate() to be called by the Sign In button’s on click event.

public void Authenticate()
{
    // Generates state and PKCE values.
    state = randomDataBase64url(32);
    code_verifier = randomDataBase64url(32);
    string code_challenge = base64urlencodeNoPadding(sha256(code_verifier));
    const string code_challenge_method = "S256";

    // Creates the OAuth 2.0 authorization request.
    string authorizationRequest = string.Format(
        "{0}?response_type=code&scope=openid%20profile&redirect_uri={1}&client_id={2}&state={3}&code_challenge={4}&code_challenge_method={5}",
        authorizationEndpoint,
        System.Uri.EscapeDataString(redirectURI),
        clientID,
        state,
        code_challenge,
        code_challenge_method);

    startAuthentication(authorizationRequest);
}

The first thing that Authenticate() does is prepare the OAuth Authorization request. This is the request sent to the jslib, and then to the JavaScript running in the browser. The player will be redirected to this URI in a browser popup.

Authorization Code grant with PKCE is the most secure and preferred OAuth grant to use. Historically, OAuth has required a client secret to be passed in the authorization request from the client. This design worked fine for back-end services that could securely store the client secret but is flawed in the case of SPA-type applications that do not have a back end. Proof Key for Code Exchange (PKCE) was created as a way to remove the need for the client secret altogether. A code_verifier is randomly generated by Authenticate() and then hashed with SHA-256 to create a code_challenge. The code challenge is what replaces the client secret and the code verifier is what will be used to validate the request for tokens after authentication.
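To make the PKCE relationship concrete, the check the authorization server performs when the code is later redeemed amounts to recomputing the challenge from the presented verifier. The following is an illustrative sketch of that check, not Okta's actual implementation:

using System;
using System.Security.Cryptography;
using System.Text;

public static class PkceCheck
{
    // The token request succeeds only if the verifier presented now hashes to
    // the code_challenge that accompanied the original authorization request.
    public static bool Matches(string codeVerifier, string codeChallenge)
    {
        using (var sha256 = SHA256.Create())
        {
            byte[] hash = sha256.ComputeHash(Encoding.ASCII.GetBytes(codeVerifier));
            string computed = Convert.ToBase64String(hash)
                .Replace("+", "-")
                .Replace("/", "_")
                .Replace("=", ""); // base64url, no padding
            return computed == codeChallenge;
        }
    }
}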

Next, Authenticate() builds the authorization request URI with all of the required parameters - including the code challenge, a randomly generated state string, and a redirect URI pointing to the callback.html - and passes it to the jslib startAuthentication() function so that it can be passed to the browser for the popup to navigate to.

Once the popup is displayed, the user authenticates and is redirected back to the callback.html page with an OAuth Auth Code and an exact copy of the State string that was included in the authorization request. The callback.html uses window.opener.postMessage(code + "," + state, "*") to send the Auth Code and State back to the parent page holding the WebGL object, which triggers the event listener to call unityInstance.SendMessage("OAuth2", "getAuthResults", e.data);. SendMessage() calls a function in the OAuth2 class called getAuthResults(), which will process the authorization results and exchange the Auth Code for tokens. Add the following code to OAuth2.cs:

public void getAuthResults(string authResult)
{
    string[] s = authResult.Split(',');
    string code = s[0]; // Auth code from javascript
    string incoming_state = s[1]; // State from javascript
    Debug.Log(code);
    Debug.Log(state);
    if (incoming_state != state)
    {
        Debug.Log(String.Format("Received request with invalid state ({0})", incoming_state));
        return;
    }
    performCodeExchange(code, code_verifier, redirectURI);
}

This function simply receives the results from the authorization request and parses them. The next step is to exchange the Auth Code for an OAuth Access Token and ID Token.

void performCodeExchange(string code, string code_verifier, string redirectURI)
{
    Debug.Log("Exchanging code for tokens...");

    // Generate token request
    string tokenRequestBody = string.Format(
        "code={0}&redirect_uri={1}&client_id={2}&code_verifier={3}&scope=openid&grant_type=authorization_code",
        code,
        System.Uri.EscapeDataString(redirectURI),
        clientID,
        code_verifier);

    UnityWebRequest uwr = new UnityWebRequest(tokenEndpoint, "POST");
    var contentBytes = new UTF8Encoding().GetBytes(tokenRequestBody);
    uwr.uploadHandler = new UploadHandlerRaw(contentBytes);
    uwr.downloadHandler = new DownloadHandlerBuffer();
    uwr.SetRequestHeader("content-type", "application/x-www-form-urlencoded");
    Debug.Log("TOKEN REQUEST BODY: " + tokenRequestBody);

    UnityWebRequestAsyncOperation async = uwr.SendWebRequest();
    async.completed += (AsyncOperation op) => { GetTokenExchangeResponse(async); };
}

A token request is created using the auth code and code verifier. UnityWebRequest() is used to handle the token request, and the response is sent to a function called GetTokenExchangeResponse() once complete. Create GetTokenExchangeResponse() by adding the following code:

private void GetTokenExchangeResponse(UnityWebRequestAsyncOperation op)
{
    Debug.Log("Got Token Response");
    string responseText = op.webRequest.downloadHandler.text;
    Dictionary<string, string> tokenEndpointDecoded = JsonConvert.DeserializeObject<Dictionary<string, string>>(responseText);
    string access_token = tokenEndpointDecoded["access_token"];
    string idToken = tokenEndpointDecoded["id_token"];
    PlayFabInterface.Instance.LoginWithOpenIdConnect(idToken);
}

GetTokenExchangeResponse() simply parses the OAuth Access Token and OAuth ID Token from the response to the token request made by performCodeExchange(). The last line of the function sends the OAuth ID Token to PlayFabInterface.Instance.LoginWithOpenIdConnect() so that it can be used to log the player into PlayFab, which is the last step in this project before it can be deployed.

Finally, Authenticate() referenced a few helper functions for generating the code verifier/challenge, hashing, and encoding.

/// <summary>
/// Returns URI-safe data with a given input length.
/// </summary>
/// <param name="length">Input length (nb. output will be longer)</param>
/// <returns></returns>
public static string randomDataBase64url(uint length)
{
    RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
    byte[] bytes = new byte[length];
    rng.GetBytes(bytes);
    return base64urlencodeNoPadding(bytes);
}

/// <summary>
/// Returns the SHA256 hash of the input string.
/// </summary>
/// <param name="inputString"></param>
/// <returns></returns>
public static byte[] sha256(string inputString)
{
    byte[] bytes = Encoding.ASCII.GetBytes(inputString);
    SHA256Managed sha256 = new SHA256Managed();
    return sha256.ComputeHash(bytes);
}

/// <summary>
/// Base64url no-padding encodes the given input buffer.
/// </summary>
/// <param name="buffer"></param>
/// <returns></returns>
public static string base64urlencodeNoPadding(byte[] buffer)
{
    string base64 = Convert.ToBase64String(buffer);
    // Converts base64 to base64url.
    base64 = base64.Replace("+", "-");
    base64 = base64.Replace("/", "_");
    // Strips padding.
    base64 = base64.Replace("=", "");
    return base64;
}

Log Into PlayFab

Configure PlayFab OIDC

Before PlayFab can accept authorization tokens from Okta, a new OpenID Connect connection will need to be created. This is the connection that gives PlayFab the information needed to validate the player's ID Token.

Log into your PlayFab portal and click on the title for your project. If you do not have a Studio or Title created, create them now; they were needed for the PlayFab Unity SDK setup.

Next, click the gear next to the title name in the top left and select Title Settings.

Click the Open ID Connect tab and select New Connection.

In the New Connection menu, name the Connection ID Okta and enter the Client ID and Issuer from the app set up by the Okta CLI. A Client Secret is not used because of PKCE, so enter any random string into the Client Secret box in PlayFab. Finally, PlayFab is just validating an ID Token from an external authorization server, so a nonce is not needed here. Check Ignore nonce and click Save Connection.

PlayFab is now able to validate ID Tokens from the OAuth Authorization Server in Okta.

Create the PlayFab Interface Script

Now that the player has successfully authenticated, the last step is to authorize the player to PlayFab using the ID token received after authentication.

In Unity, right-click in the Assets > Scripts folder and select Create, C# Script. Name the script PlayFabInterface and double-click to open the script in Visual Studio.

Start the script with the following using statements:

using UnityEngine; using PlayFab; using PlayFab.ClientModels;

Next, create a new class called PlayFabInterface. Because OAuth2.cs calls PlayFabInterface.Instance, the class includes a simple singleton reference:

public class PlayFabInterface : MonoBehaviour
{
    // Simple singleton reference; OAuth2.cs calls PlayFabInterface.Instance.
    public static PlayFabInterface Instance;

    void Awake()
    {
        Instance = this;
    }
}

Create the LoginWithOpenIdConnect() function:

public void LoginWithOpenIdConnect(string _idToken)
{
    Debug.Log("LoginWithOpenIDConnect Start!"); // avoid logging the raw token
    var request = new LoginWithOpenIdConnectRequest();
    request.ConnectionId = "Okta";
    request.IdToken = _idToken;
    request.CreateAccount = true;
    PlayFabClientAPI.LoginWithOpenIdConnect(request, OIDCLoginSuccess, OIDCLoginFailure, _idToken);
}

Finally, handle the results:

public void OIDCLoginSuccess(LoginResult _result)
{
    Debug.Log("Login With OIDC Success");
}

public void OIDCLoginFailure(PlayFabError _error)
{
    Debug.LogError("Login With OIDC Failure");
    Debug.LogError(_error.GenerateErrorReport());
}

The user will now log into PlayFab using their ID Token from the OAuth Authorization Server in Okta.

Wire It Up

Now that the authentication logic exists, it just needs to be triggered. The Sign In button created earlier will need to call the Authenticate() function from its on-click event. The OAuth2.cs and PlayFabInterface.cs scripts will also need to be added to a game object in the scene.

In Unity, right-click in the scene Hierarchy panel and then click Create Empty. Rename the game object to OAuth2 so that it matches the target of the unityInstance.SendMessage() calls in the page JavaScript.

Drag the OAuth2.cs and PlayFabInterface.cs scripts from the Scripts folder onto the empty game object in the Hierarchy panel.

In the Hierarchy panel, expand the Canvas object holding your Sign In button, and then select the Sign In button.

In the Inspector panel, find the On Click() box and drag the OAuth2 game object onto it. Select the function dropdown, click OAuth2, and then click Authenticate().

The button will now call the Authenticate() function in OAuth2.cs when the player clicks it.
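If you prefer wiring the event in code rather than through the Inspector, the equivalent hookup can be done at runtime. A small sketch, assuming this component sits on the same game object as OAuth2 and the Sign In button is assigned in the Inspector:

using UnityEngine;
using UnityEngine.UI;

public class SignInWiring : MonoBehaviour
{
    [SerializeField] private Button signInButton; // assign the Sign In button in the Inspector

    void Start()
    {
        // Equivalent to adding OAuth2.Authenticate() to the button's On Click() list by hand.
        OAuth2 oauth = GetComponent<OAuth2>();
        signInButton.onClick.AddListener(oauth.Authenticate);
    }
}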

Deploy to Azure App Services

Because of the jslib and browser code’s complexity, this project cannot be run from the editor directly. Azure App Services makes the deployment of the entire project easy.

First, back up your index.html and callback.html files.

In Unity, click File, Build Settings and then click Build to rebuild the project. Select the desired output folder and click Select Folder.

Navigate to the output folder selected and overwrite the index.html and callback.html files with the ones you backed up previously.

Open a preferred text editor, create a new file in the output folder called web.config, and add the following:

<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".unityweb" mimeType="application/octet-stream" />
    </staticContent>
  </system.webServer>
</configuration>

This web.config file makes sure that the WebGL object's .unityweb build files are served with a valid MIME type when hosted, so the WebGL object loads properly.

Next, open Visual Studio and click Continue without code from the startup wizard.

Click File, Open, Web Site and select the output folder where the index.html and callback.html are located.

Next, click Build, Publish Web App, and then click Start on the Publish tab. Select Azure and click Next.

Select Azure App Service and click Next.

Click Create New Azure App Service.

Give the app the name used when setting up the RedirectURI in the Okta CLI where the placeholder was appName. This example used https://webgloauth.azurewebsites.net/, so the app name to use in Azure is WebGLOAuth.

Select a valid subscription, click New to create a new resource group, click New to create a new hosting plan, and then click Create.

With the new app selected in the publish menu, click Finish.

Note: Before clicking Publish, make sure that the Site URL in the Summary section matches the DNS that was defined in the Okta CLI during the app setup for the callback URI. This example used https://webgloauth.azurewebsites.net/ and https://webgloauth.azurewebsites.net/callback.html which matches the Site URL in Visual Studio. This same callback URI also needs to be correctly defined in the instance variables in the OAuth.cs script in Unity.

Finally, click Publish to deploy the application to Azure App Services.

Once publishing is complete, click the Site URL to navigate to the application and test it out.

Conclusion

Player security is quickly becoming one of the most critical aspects of game development today. More development is being done in the cloud, and more player data is being collected than ever before. While the best practice for authentication and authorization is to use OAuth and a trusted security platform's hosted login experience, the design to do this changes from one target platform to another. For Unity WebGL, you were able to interact with the WebGL object's parent page to render a popup that facilitated user authentication via your trusted cloud security platform, Okta. The resulting auth code was exchanged for tokens that can authorize the player to the backend cloud platforms your project is using, in this case Azure PlayFab. Finally, you were able to host the project entirely on Azure App Services. The OAuth tokens used with PlayFab can authorize the user to other backend cloud platforms as needed, allowing for a convenient, centralized security design. The method used to facilitate browser interaction for WebGL will change with other target platforms, but the rest of the design would remain similar across all platforms.

Learn More About OAuth and Security in Unity

To keep learning, check out some of our other great guides and tips:

Identity Security for Games in C# with Unity

An Illustrated Guide to OAuth and OpenID Connect

Implement the OAuth 2.0 Authorization Code with PKCE Flow

What the Heck is OAuth?

Nobody Cares About OAuth or OpenID Connect

Comment below if you have any questions. Be sure to follow @oktadev on Twitter, follow us on Twitch, like us on Facebook and subscribe to our YouTube Channel for more!

Thursday, 25. February 2021

1Kosmos BlockID

Why the SolarWinds attack worked...

When Brad Smith, Microsoft's President, talks about the SolarWinds attack he certainly doesn't sugarcoat what is now known as the months-long hacking campaign that affected US government agencies and cybersecurity vendors: "I think from a software engineering perspective, it's probably fair to say that this is the largest and most sophisticated attack the world has ever seen..."



Indicio

Start simple to scale decentralized identity


The market for decentralized identity is growing—and the key to that growth is to start simple, Indicio.tech’s CTO Ken Ebert tells KuppingerCole Analysts.

When beginning the journey toward decentralized identity deployment, “The simpler the system you make, the more likely you are to be successful.”

So said Indicio.tech Chief Technology Officer Ken Ebert in a recent presentation to the leading European information security firm KuppingerCole Analysts.

In a simple deployment of decentralized identity, the relationship between credential issuer, credential verifier, and credential holder is already established. Because participants are determined by these relationships, we call this decentralized identity deployment a closed ecosystem.  This allows for an iterative implementation in a controlled credential environment versus the chaos that can ensue with a larger scale roll-out.

Oftentimes the implementation of decentralized identity is viewed solely as a purely technology solution, but in reality, the technology solution is just one leg of a three-legged stool:

One leg is the technology solution: the credential issuer, verifier, and holder, and the ability to write cryptographic messages to a digital ledger and retrieve them.

Another is the business solution: what is the problem that decentralized identity is working to solve? Who are the users, and what is the benefit? How does it get funded?

Finally, there is the governance solution: how do users get and use their credentials? How does a digital identity deployment comply with the ever-expanding global regulations on data stewardship?

In the case of a company call center, for example, credentials could be issued to a firm’s customers to speed transactions. If customer service can verify customer credentials in five seconds instead of 30, then that savings can be used to fund the continued adoption of decentralized identity. The technology serves to solve the business problem, and the governance is the closed ecosystem of the firm and its customers.

Not to be overlooked, Ebert says, is the role of marketing, both internal and external, to ensure success. This includes, at the most basic level, communicating to users the pain points that decentralized identity will alleviate and hence its value proposition.

The bigger picture, said Ebert, is that marketing colleagues can help document messaging, synchronize press releases and announcements, and educate across the organization the benefits of decentralized identity.

During the course of an initial decentralized identity roll-out, new and expanded applications of the technology and process improvements will emerge. But again, Ebert stressed that expansion should proceed incrementally. The potential steps here are, going back to our call center example, increasing verifiers, issuers and credential holders to encompass more participants in the firm’s supply chain to decrease friction in performing routine transactions.

Decentralized identity is such a transformative technology it’s easy to get lost in building a new world, rather than building something small, fixing one problem at a time, and learning how to expand from that experience. Simple deployments mean you get to fix actual problems in a way that drives buy-in from stakeholders. Try to boil the ocean of decentralized identity, and you risk getting burned.

Indicio.tech is committed to being a resource hub for decentralized identity, providing enterprise-grade open source tools to get our clients, and the community, building solutions today. By providing Private Networks, the Indicio TestNet, and a variety of customizable training programs, companies and organizations from diverse industries around the world rely on Indicio.tech for expertise in building and scaling their decentralized identity solutions.

The post Start simple to scale decentralized identity appeared first on Indicio Tech.


IDnow

German e-government initiative and its effects on eID


As we settle into 2021, we are facing a new outlook. We now know that it is possible to manage most of our lives remotely thanks to digitalisation. The German government has been running a digitalisation initiative for some time now to achieve exactly that and enable citizens to take digital trips to the authorities from the comfort of their homes. The declared goal is to conduct many public procedures digitally by the end of 2022, as stated by the Online Access Law (Onlinezugangsgesetz).

Digital trip to the authorities – but only with identity verification

Identification plays a key role in conducting all business from home, as applicants will need to prove their identity reliably during any public procedures. The necessary technology is already available; since 2017 the German electronic ID has been issued with the online functionality activated by default. Owners can use two-factor authentication to identify themselves digitally, and nowadays most smartphones are equipped with the required Near Field Communication (NFC) technology to do exactly that, turning the smartphone into a card reader. Just recently, the German government announced that they are also looking into implementation of a digital identity card (the so-called smart eID, read more about it here), further paving the road for e-government.

A step in the right direction for Financial Services and other industries

Recent news aside, the technological groundwork has already been established and the digitalization initiative by the German government will be a trailblazer for the broader adoption of the electronic Identity Card. While the eID plays an important role for any authority relations, a lot of usage potential lies in other industries, where reliable identification is a core requirement for onboarding new clients. Financial Services, Gaming, Telecommunications and many more might embrace this simplified method of identity verification.

Freedom of choice for your customers – with IDnow eID

With IDnow, you don’t have to wait till 2022! Working with Trust Service Providers, we can offer an eID product that identifies your customers based on their German electronic ID – as an automated process, without waiting times or the need for a service agent. Through our platform approach, we enable you to offer your customers the freedom of choice between different identification methods through just one integration: VideoIdent, eID and – coming soon – BankIdent. Would you like to know more about eID? Take a look at our webpage!


HYPR


HYPR is now certified by VMware and Citrix marketplaces.

The HYPR Cloud Platform is now certified by two top virtualization leaders, VMware and Citrix, to help organizations take a true passwordless approach to secure today’s distributed workforce and improve productivity.

The partner integrations make remote login easier and truly passwordless across Windows and Mac computers running Virtual Desktop Infrastructure (VDI). The user experience is simple and flexible:

The user logs into their VDI environment, such as VMware Horizon or Citrix StoreFront, by entering their username.

Then, they’re redirected to authenticate to the HYPR Server via their single sign-on (SSO) provider.

A push notification is sent to their HYPR mobile app, which requests their FIDO-based authentication such as face or fingerprint.

Once successful, they gain access to their VDI environment. If available, they also have the option to authenticate with Windows Hello or TouchID embedded on their computer.

This login example redirects HYPR users through VMware’s True SSO.

Request a Demo

In today’s security landscape, passwords, even when used alongside traditional MFA, still leave users vulnerable to phishing and credential reuse. They also create a poor user experience that strains remote work and lowers productivity. By freeing people from passwords and password-based MFA, people can better focus on their work rather than fumbling their way across apps and resources.

At HYPR, we understand how important VDI is for today’s more distributed global workforces. VDI gives people access to their digital workspaces with minimal management and maintenance by the organization’s IT team. That’s why we teamed up with VMware and Citrix to further bolster the remote experience.

HYPR & VMware

True Passwordless MFA is available via the VMware Marketplace and integrates with VMware Horizon and Workspace ONE using SAML. Employees authenticate once to their Horizon portal and gain secure access to workstations and published resources. Passwords are not needed.

HYPR & Citrix

HYPR is also on the Citrix Ready Marketplace and integrates with Citrix StoreFront on any existing identity provider (IdP) using SAML. The passwordless MFA experience is no different across virtual apps and desktop environments. There are no complex passwords nor one-time passcodes (OTP) to type in. Users simply authenticate once with the HYPR mobile app and gain access to all of their downstream applications.

Key Takeaways

HYPR’s certified integration offers remote workers login flexibility across Windows and Mac computers, as well as Android and iOS tablets and mobile devices.

True Passwordless MFA enables organizations to scale their MFA infrastructure along with their growing distributed workforce.

A truly passwordless approach across the enterprise enables organizations to better secure their workforce and increase productivity, wherever work is done.

Contact us to learn more about how you can improve the remote and roaming experience for your organization. Not using VDI for your distributed workforce? Check out our demo video on passwordless login for domain-joined computers:


My Life Digital

GDPR Market Research – Take a look at the Businesses getting it right


#ThrowbackThursday

MyLife Digital has been at the forefront of people-centric solutions since we started. This article was first published in July 2018 and is still highly relevant today. At that time, we had recently surveyed consumers about their data and rights. 

We have completed another survey, this time with both consumers and organisations, again asking about attitudes to data. We can see that consumers haven’t changed much in what they need, transparency about data practices has always been important. Have organisations given consumers what they want? Sign up here to receive our latest research once published.  

It’s easy to see the EU’s new data protection rules as a box-ticking exercise. Yet going a step further and explaining how personal data is used and the benefits to data subjects can reward companies with stronger customer relationships. 

Many companies believe they are now well on their way to compliance with the new European data protection regulation, according to recent GDPR market research[1]. But the GDPR survey also suggests a mismatch between those companies’ perceptions of their progress and their understanding of what’s required of them. This means they could be missing out on important opportunities to reinvigorate customer relationships. 

GDPR, enforceable in the UK under the new Data Protection Act 2018, significantly bolsters people’s rights over their personal data and what companies do with it. At its introduction in May, more than a quarter of organisations claimed to be ‘very well prepared’ for the EU General Data Protection Regulation and 61 percent ‘somewhat’ prepared. Yet it’s hard to imagine that these businesses have transformed their approach to customer data as part of those preparations. 

In many cases, companies approached the new rules as a compliance exercise – not a broader review of how the controls could work best for the business and its clients. Yet with GDPR, companies have a unique chance to develop a more sustainable plan around customer data that will contribute positively to their brand and business development. 

Poor vs best practice 

It’s easy to spot the companies that see GDPR compliance as a box-ticking exercise. The first sign is a lack of clarity in their customer communications: a big green button on their website, accompanied by generic policy wording pasted from a template. The main aim here is to secure quick confirmation of people’s data permissions, so they can get on and buy something. Anyone clicking through for more detail is faced with pages of small-print. 

This is at odds with the spirit of GDPR, which is about empowering customers (‘data subjects’) to make more informed decisions. The key to this is transparency and full disclosure. The Information Commissioner’s Office (ICO) recommends using plain language and information layering. This puts the individual more in control, enabling them to drill down quickly to specific areas they want to clarify. 

Sainsbury’s has clearly thought this through for shoppers registering for Nectar, its partner-based loyalty scheme. The online registration page explains exactly what each party wants to do with subscribers’ data and what customers can expect in return – relevant and personalised offers, for example. By explaining everything clearly and setting out the value for the data subject, Sainsbury’s is demonstrating fairness and integrity, in line with GDPR’s core principles.

Juro, a London startup that automates the creation and management of companies’ sales contracts using AI, has taken a similar approach. Click through to find out more about its cookies and there is a full breakdown of what Juro and its partners would like to do with personal data, and why, with their permission. Click through to its full privacy notice and you’ll be rewarded with a step-by-step walk-through of what will happen at every point along the customer journey – the information you will be asked to give and what you will get in return. Juro also spells out customers’ data rights and how to exercise them. It’s a great example of an experience that has clearly been designed – by default – from the client’s perspective. 

Embracing the spirit as well as the rules of data protection 

It is as important to embrace the ‘spirit’ of GDPR as it is to follow the rules. Here’s where organisations need to pay special attention if they want to convince people that they have their best interests at heart and will treat their data with respect. 

Companies can engender trust by managing customers’ data permissions using a dedicated central platform specifically designed to address the more intricate detail of GDPR. If every aspect of consent is managed and tracked in a single place over time – a resource that can be quickly consulted and linked to a range of business systems (such as customer-relationship-management systems) – organisations can show that they are embracing data protection by design and default. 

Taking this a step further, companies could open up such a central permissions hub for self-service access by customers – allowing them to securely access, review and edit their own permissions at any time. A recent GDPR customer survey by MyLife Digital found that two-thirds of consumers would welcome the chance to be able to view all of the data a company holds on them in one place. 

Significantly, the same survey also confirmed that, far from wanting to withhold their data, people are largely happy to give it – as long as there is a clear benefit to them in doing so. When asked “Would you like to make money (or get other benefits) from your data?”, 64 percent of UK respondents and 77 percent of US consumers said yes. 

Ultimately, true GDPR compliance is a mindset, and a way of doing business that offers lasting rewards. Operationalising GDPR using purpose-built compliance software isn’t hard to do and offers a chance to increase engagement with customers, drive loyalty, and ultimately revenue. 

[1] GDPR Impact Series Research 2018, DataIQ in association with Experian: https://www.edq.com/uk/resources/papers/gdpr-impact-research-2018/ 

The post GDPR Market Research – Take a look at the Businesses getting it right appeared first on MyLife Digital.


KuppingerCole

Buyer’s Compass: Unified Endpoint Management


by Richard Hill

Unified Endpoint Management refers to comprehensive solutions with capabilities that support a range of endpoint types. This KuppingerCole Buyer’s Compass will provide you with questions to ask vendors, criteria to select your vendor, and requirements for successful deployments. This document will help prepare your organization to conduct RFIs and RFPs for Unified Endpoint Management.


KEYLESS

Tapping into the potential of multi-modal biometrics

How Keyless is advancing biometrics to provide fast, secure, and reliable authentication

Advancements in biometrics have reached a point where the technology can reliably replace passwords, eliminate threats, and enhance the customer experience all at once. It’s time for enterprises to take a multi-modal approach to biometric authentication and identity management.

Biometrics today, a single-modal approach

Biometrics are fast and easy, and they eliminate our reliance on passwords and PINs. However, despite their growing popularity and benefits, biometrics have their own unique challenges when it comes to widespread adoption, mainly around reliability and privacy.

Most solutions on the market today only offer single-modal biometrics to authenticate users — this could be one of the reasons that adoption of biometrics has been slow, particularly at the enterprise level in industries that would benefit most from the technology in terms of user experience, but require greater assurance that their users or employees are who they claim to be.

The reliability of single-modal biometric systems plays a part in this. Single-biometric solutions are less likely to withstand advanced spoofing threats and are also less likely to correctly match biometric templates as a user’s physical characteristics change, whether due to the natural aging process or suddenly as a result of an accident or disability. As a result, the reliability of single-modal biometrics is often questioned.

Multi-modal biometrics offer a breakthrough solution to these challenges — empowering businesses to finally part ways with passwords and other archaic authentication methods.

Why multi-modal?

Multi-modal solutions that combine physical and behavioral biometrics enable companies to enhance security, without disturbing the user experience.

Physical Biometrics vs Behavioural Biometrics

What are physical biometrics?

The most popular biometric solutions leverage a user’s unique physical traits to authenticate them. Biometric templates are created by mapping a user’s unique physical traits. For example, at Keyless we use Neural Networks, a deep learning technique to extract high-dimensional feature vectors from sequences of images of the user’s face.

What are behavioral biometrics?

Behavioral biometrics, on the other hand, leverage deep learning to recognize unique patterns in how a user interacts with a device over time. These patterns could be based on how the user taps, swipes, or holds their device. Once patterns have been recognized, they can be transformed into biometric templates that can be used to monitor access to a user’s device in real-time.

Behavioral biometrics offer enhanced usability as they require no conscious effort from the user, allowing for continuous, frictionless authentication, enabling companies to frequently verify that a user is who they say they are, without disrupting the user. Physical biometrics, on the other hand, are better suited to one-time authentication, as they typically require conscious effort from the user.

Frictionless security powered by who the user is, not what they know

Because biometrics are inherently user-friendly, a multi-modal approach to authentication can directly solve the trade-off between convenience and security.

Multi-modal solutions give companies greater control over managing remote access to their private systems and data. For example, if an employee logs in but then the system recognizes unusual behavior (such as different keystrokes or tapping patterns), then it can log the user out.

The Keyless approach to biometric authentication

At Keyless we take a multi-modal approach to biometrics, leveraging both physical and behavioral biometrics to offer a seamless authentication solution that allows companies to reliably identify users. Keyless uses a combination of possession and inherence factors to provide a fast, seamless and secure authentication experience.

The Keyless protocol allows for several physical biometric modalities including facial, fingerprint, voice, and iris recognition; as well as behavioral modalities like keystroke and swipe recognition.

Using deep learning to ensure high-accuracy

To offer behavioral biometrics, we must first learn how users interact with their devices. We do this by feeding deep-learning algorithms unique, user-generated data captured through sensors on the user’s mobile device.

Each input can be gauged on its reliability in authenticating an individual user. Once it’s considered reliable enough for identifying the user with high accuracy, the feature can be added to the user’s biometric templates.

Ensuring privacy compliance

Since biometrics are uniquely linked to the individual, it’s important that companies processing biometric data take extra measures to protect it against both new and emerging threats.

Biometrics have the potential to make authentication dramatically faster, easier, and more secure than traditional passwords, but companies need to be careful about the biometric data they collect. — Maria Korolov

Keyless ensures that the biometric data we capture is never at risk of being stolen, compromised, or lost by combining multi-modal biometrics with privacy-enhancing technologies. The combination of privacy-enhancing technologies, deep-learning, and multi-modal biometrics uniquely addresses the greatest barriers facing the adoption of biometric authentication solutions.

Read these pieces to gain a better understanding of how we leverage privacy-enhancing technologies to protect biometric data: Keyless and Zero-Knowledge Proofs, Keyless and Secure Multiparty Computation, Keyless and Shamir’s Secret Sharing.

Closing the gap between security and convenience

Multi-modal biometric solutions enable a high level of security without causing undue disruptions to the authentication experience. Rather, multi-modal solutions can enhance the authentication experience by offering frequent, unconscious authentication that can be used to detect advanced threats.

By presenting multiple biometric challenges to users, multi-modal solutions protect private systems and data in the rare event of successful spoofing attempts or account takeovers that take place after the point of authentication.

Thus, multi-modal biometric solutions can help companies close the gap between security and convenience, allowing for more seamless authentication and identity management experiences that enhance security and privacy compliance, without sacrificing the authentication experience or productivity.

Request a Free Demo of Keyless

Keyless™ authentication can help deliver secure and seamless digital experiences for your end-users and for your increasingly remote workforce.

Head to our website to learn more about our biometric authentication and identity management solutions.

www.keyless.io

Alternatively, you can email us directly at info@keyless.io

Tapping into the potential of multi-modal biometrics was originally published in KeylessTech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Finicity

Finicity Mortgage Verification Service: The Smart, Simple Mortgage Verification Experience


The mortgage application process should be easier. It should be more accurate. It should involve less risk and less fraud. It shouldn’t be a slog for borrowers or for lenders. It should be as convenient and streamlined as we’ve come to expect from other modernized, digital experiences. 

Transforming the entire underwriting process is a massive undertaking. And while Finicity already provides solutions across all the primary segments of mortgage lending, today we’re reaching another milestone by streamlining the verification of assets, income, and employment into a one-touch, GSE-accepted experience. I’m excited to introduce Finicity Lend’s Mortgage Verification Service (MVS), the faster, more accurate, more empowering verification experience for both lenders and borrowers.

What Is MVS?

Mortgage lending underwent a historic transformation in 2020. Problems that had been minor cracks in the mortgage lending experience became chasms as lenders had to rapidly adapt to physically-distanced workflows. The COVID-19 pandemic, the ensuing economic fallout, and record-breaking volume certainly accelerated the need for a new mortgage experience, but that need was already apparent; high volume may temporarily obscure the cracks, but it does not eliminate them.

Paper-based mortgage processes take more time—something many lenders are already lacking with today’s high volume—and they’re more prone to fraud. Slower, less streamlined solutions also reduce organizational agility, preventing lenders from keeping pace both when the market is booming and when the market again normalizes. And the high-friction paper-chases are annoying for borrowers that are already acclimated to fast, convenient, digital solutions. 

We wanted to deliver a mortgage lending experience that exceeds the expectations of today’s borrowers while also enhancing outcomes and agility for lenders and their stakeholders. That’s why we designed MVS to deliver a one-touch, GSE-accepted digital verification of assets, income, and employment. Now you can complete all necessary verifications in one seamless process. It’s a fast, secure, anytime-anywhere experience that gives the borrower control over their financial data while also providing the lender with a real-time, accurate picture of the borrower’s financial health. 

MVS is powered by Finicity’s open-banking platform. This means that mortgage lenders get access to extended lengths of real-time data, analyzed and categorized thanks to advanced data intelligence. We also assure the most accurate data and keep the consumer at the center of the data-sharing experience with clear transparency and the ability to dispute reports. Access to reliable, real-time, multi-sourced, and even cross-verified data enables the most accurate verifications, setting you on your way to get rep and warranty validation from GSEs and investors. 

And because every lending use case and process is unique, we’ve designed MVS to be flexible and accommodate everything from refinancing to new purchases, including both qualified and non-qualified mortgages. We’ve also made it easy for mortgage lenders to integrate MVS into their workflow with several flexible integration options.

All of these features come together to build a consumer-centric lending experience that improves ROI for lenders.

Why Should Lenders Use MVS?

MVS is more than a product, it’s a partnership with Finicity that enables lenders to benefit from our open banking platform and our market-leading, secure connections to financial institutions. Through those connections, lenders can get the accurate data necessary to verify assets, income, and employment, and enhance their overall decisioning and underwriting processes. And with GSEs tightening their rep and warranty relief policies due to COVID-19’s impact on consumer income and employment, lenders will need the most reliable data from the most reliable sources.

MVS enables a digital mortgage experience, allowing lenders to reap the benefits that come from digital streamlining. In fact, validating assets, income, and employment digitally can cut up to 12 days off the origination process. MVS takes digital streamlining even further by completing these verifications with only a single borrower interaction. You can then refresh those verifications at close at no cost and without reengaging the borrower. With MVS, you complete more originations in less time—time that’s crucial for lenders to remain agile in a crazy, high-volume year like this. More time opens room for more originations and more commission.

The convenience of digital verifications and the simple, streamlined consumer permissioning process also enhances the lending experience for borrowers and helps them leave more satisfied and more likely to refer their lender to friends and family. MVS’s seamless and customer-centric digital experience enables lenders to distinguish themselves, especially against digital laggards, and gain a competitive edge.

We’ll also set you up for success with Finicity’s Adoption Best Practices training so you can hit the ground running and start reaping the rewards of a streamlined, digital mortgage process. 

With MVS, everybody wins. 

Don’t settle for yesterday’s mortgage lending experience. You deserve better. And so do your borrowers. Use Finicity Lend’s Mortgage Verification Service to build the foundation of your enhanced mortgage lending experience. Find out how to integrate MVS into your mortgage lending process and to learn more about how Finicity provides other mortgage solutions in prequalification, underwriting, funding enablement, secondary quality control, and servicing.

The post Finicity Mortgage Verification Service: The Smart, Simple Mortgage Verification Experience appeared first on Finicity.


Finicity Releases Comprehensive Mortgage Verification Service for Simpler, Faster Borrowing Experience


One-touch, GSE-accepted verification of assets, income and employment reduces loan process by up to 12 days

SALT LAKE CITY, Utah – February 25, 2021 – Finicity, a Mastercard company and leading provider of open banking solutions, today announced its one-touch Mortgage Verification Service (MVS), enabling lenders to provide the simple, easy experience that today’s consumers and lenders are looking for in mortgage origination. The solution allows consumers to permission data, quickly and easily, so lenders can verify assets, income and employment in a single interaction with borrowers that takes seconds or minutes instead of days or weeks. The verification is accepted by both Freddie Mac and Fannie Mae in place of cumbersome manual loan documentation.

While mortgage lending has rapidly moved toward a digital experience, the verification process has largely remained a manual, paper-driven process. By reducing the burden of manual methods of documentation, Finicity may help shave 8-12 days off the origination process for rapid loan closing while also increasing accuracy, improving profitability, and creating a better experience for both lenders and borrowers.

Through Finicity Lend’s Mortgage Verification Service, Finicity’s open banking platform leverages high-value data available from financial institutions and payroll processors to provide accurate, real-time insights into a borrower’s current assets, income and employment. The solution offers flexible flows for different mortgage lending use cases — from refinancing to new purchases, and from qualified to non-qualified mortgages.

This innovative service creates a simple, fast, FCRA-compliant verification experience that empowers consumers to digitally permission use of their financial data with one touch, through Finicity Connect, to rapidly validate key financial suitability requirements of a mortgage application. This is perfectly aligned with Finicity’s mission to empower both lenders and consumers while helping consumers benefit more from their own financial data. 
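The release doesn’t document the integration itself, but the flow it describes (generate a consumer-permissioning session through Finicity Connect, then pull a verification report once the borrower has consented) can be sketched roughly. The endpoint paths, header names, and response fields below are illustrative assumptions, not Finicity’s documented API:

```python
import requests

BASE = "https://api.finicity.com"  # assumed base URL; consult Finicity's API docs
HEADERS = {
    "Finicity-App-Key": "<app-key>",        # assumed auth scheme, for illustration
    "Finicity-App-Token": "<session-token>",
}

def generate_connect_url(partner_id: str, customer_id: str) -> str:
    """Create a Connect session the borrower uses to permission their accounts.
    The path and payload shape are illustrative, not the documented endpoint."""
    resp = requests.post(f"{BASE}/connect/v2/generate", headers=HEADERS,
                         json={"partnerId": partner_id, "customerId": customer_id})
    resp.raise_for_status()
    return resp.json()["link"]

def request_asset_report(customer_id: str) -> dict:
    """Once accounts are permissioned, request a verification-of-assets report
    (again, an assumed path for illustration only)."""
    resp = requests.post(f"{BASE}/decisioning/v2/customers/{customer_id}/voa",
                         headers=HEADERS)
    resp.raise_for_status()
    return resp.json()
```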

“We are streamlining mortgage lending significantly, reducing costs and shortening the time needed for the overall loan origination process,” said Finicity CEO and Co-founder Steve Smith. “With Finicity Lend, our ultimate goals are to help mitigate risk for lenders, create an improved consumer experience, and ultimately increase overall financial inclusion by helping borrowers better prove their creditworthiness.”

GSE Accepted

The Finicity Lend Mortgage Verification Service is accepted by both GSEs as a valid demonstration of a borrower’s assets, income and employment. Lenders are able to use Finicity verification reports for automated assessment and receive representation and warranty relief using Freddie Mac Loan Product Advisor® asset and income modeler (AIM).  Fannie Mae accepts Finicity mortgage verification reports for automated assessment within Desktop Underwriter® validation service through Day 1 Certainty®. 

“Freddie Mac has been at the forefront of advancing the digital mortgage experience that today’s borrowers have come to expect,” said Rick Lang, Single-Family Vice President of Strategy and Integration at Freddie Mac. “Our data-driven strategy helps produce safer loans and reduces the paper chase so our clients can speed up underwriting and bring borrowers to the closing table sooner.”  

“At Fannie Mae, we’ve been pioneering the digital technologies that will make the borrowing experience faster and easier for borrowers,” said Chuck Walker, Vice President Digital Alliances and Distribution at Fannie Mae. “Manually providing verification documents is a time-consuming and stressful process, so addressing asset, income and employment verification is central to moving the industry toward the ideal digital mortgage.”

What the Industry is Saying

Finicity clients and partners are already weighing in on Finicity’s Mortgage Verification Service (MVS):

“At Sierra Pacific Mortgage, we were excited to be a participant in the rollout of this enhancement to the lending process,” said Gary D. Clark, Chief Operating Officer at Sierra Pacific Mortgage. “Continued improvement of the lending process is an important initiative at Sierra Pacific Mortgage, and one that elevates the consumer experience is a win-win for everyone.”

“As we began utilizing this new service, it quickly became clear how much it would improve the lending process for both our loan officers and borrowers,” said Patrick Gardner, Principal of Vellum Mortgage. “Digitizing the mortgage process not only creates a faster, smoother experience for the borrower, but we’re also seeing significant cost savings and an increased volume of loans that we’re able to close.”

“For consumers, our focus is on delivering a fully mobile, fully seamless homeownership journey that’s centralized in one connected platform.  For lenders, SimpleNexus promises flexible efficiency that doesn’t get in the way of doing business.  Our integration with Finicity Lend’s Mortgage Verification Service delivers on both fronts with GSE-accepted verification of assets, income and employment in one easy interaction,” said SimpleNexus Chief Product Officer Shane Westra.

See MVS in Action

Finicity Lend’s Mortgage Verification Service will be available across multiple LOS/POS platforms. See our MVS solution live at the following events:

- SimpleNexus SNUG 2020: February 22-24, 2021
- HousingWire Spring Summit: March 4, 2021
- ICE Mortgage Technology Experience 2021: March 8-26, 2021
- Mortgage Banker Association’s Spring Conference & Expo: April 20-21, 2021

To learn more about Finicity and its commitment to fast, reliable and high-quality data, visit www.finicity.com.

About Finicity

Finicity, a Mastercard company, helps individuals, families, and organizations make smarter financial decisions through safe and secure access to fast, high-quality data. The company provides a proven and trusted open banking platform that puts consumers in control of their financial data, transforming the way we experience money for everything from budgeting and payments to investing and lending. Finicity partners with influential financial institutions and disruptive fintech providers alike to give consumers a leg up in a complicated financial world, helping to improve financial literacy, expand financial inclusion, and ultimately lead to better financial outcomes. Finicity is headquartered in Salt Lake City, Utah. To learn more or test drive its API, visit www.finicity.com.

The post Finicity Releases Comprehensive Mortgage Verification Service for Simpler, Faster Borrowing Experience appeared first on Finicity.


digi.me

Digi.me joins Good Health Pass Collaborative to help build a safe travelling future


Building a path to restore safe international travel and kickstart the global economy is a challenge so broad that no company or country can solve it alone.

Cross-border movement is essential to the economy, and the key to building confidence, both for those who want to be able to travel safely, and for governments who want to protect the health of their citizens, is through digital health credentials which display vaccination or test certificates.

Continue reading Digi.me joins Good Health Pass Collaborative to help build a safe travelling future at Digi.me.


KuppingerCole

Database and Big Data Security


by Alexei Balaganski

This Leadership Compass provides an overview of the market for database and big data security solutions, along with guidance and recommendations for choosing the products for protecting and managing sensitive data that best meet your requirements. We examine the broad spectrum of technologies involved, the vendors’ product and service functionality, their relative market shares, and innovative approaches to implementing consistent, comprehensive data protection across your enterprise.


Infocert

IMPULSE: European alliance to facilitate online identification in public services


Security, privacy, and ease-of-use will be the central concepts of IMPULSE (Identity Management in PUbLic SErvices), a European Horizon 2020 project that kicks off this month. Coordinated by Gradiant, a team of 16 entities from 9 different countries will use Artificial Intelligence and Blockchain to improve online identification processes.

www.twitter.com/Impulse_EU

Digital processes are increasingly used by citizens. The current health crisis around the world has limited personal interactions, pushing all kinds of procedures to digitise. This brings a number of problems, not only for those unfamiliar with the complex identification systems on the internet, but also for those who are not comfortable providing their personal data online. IMPULSE seeks to be a tool that simplifies the digitisation of the European public sector, built on a comprehensive assessment of how disruptive technologies may impact society from technical, socio-economic, legal, ethical, policy, and standardisation standpoints.

IMPULSE will work on the concept of Electronic Identity (eID) – the way users can identify themselves (and be identified) through the network – and its implications in multiple contexts. The initiative has a budget of around €4 million.

The 16 partners that make up this multidisciplinary consortium will create a tool that meets all the ethical, legal, and social requirements, so that it can be adopted for public services in all the countries of the European Union, regardless of their culture and degree of digitisation. IMPULSE aims to be a clear example of efficient, useful, and responsible modernisation, combining Artificial Intelligence and Blockchain to offer a technological advantage over other systems used today.

At the project kick-off meeting earlier this week, all partners confirmed their optimism about the results the initiative will deliver and the pace of work expected to be maintained throughout the 3-year project, thanks to the experience and expertise of those involved.

“The digitisation of public administration services is one of the seven flagship initiatives identified in the NextGen Europe programme, which gives an idea of the strategic dimension of IMPULSE. In this context, the project was born with a clear objective of facilitating the access of citizens to these services but without giving up the maximum guarantees of security, something essential if we want to move towards a fully digitised and inclusive society”

Luis Pérez Freire – Gradiant (the Spanish Research and Technology Organisation leading the IMPULSE project)

“Digitizing the public services is not just a matter of technical infrastructure, rather a multi-stage question of ethics and justice. There is the non-trivial question of regulating digital technologies in a way that guarantees the full protection of personal data. But it is also important to recall that accessibility and quality of public services are fundamental prerequisites of democratic participation in social life. This links techno-regulation and ethics with politics. And this is the strategic horizon in which IMPULSE will play a ground-breaking role.”

Antonio Carnevale – Senior Researcher at CyberEthics Lab. and the company’s contact point in the project

“We are proud to participate, together with highly qualified partners, in a project as prestigious and ambitious as IMPULSE. We will strengthen the consortium with our ability to work in multidisciplinary teams, an ability refined over years of presence at important institutional tables, even at the international level, with public institutions and private technology companies. During the project, we will be responsible for the management of the entire innovation process and will be involved in developing the general plan for the dissemination and exploitation, at a pan-European level, of the project results. We will contribute our long experience in digital identity for the public sector, achieved through our role as Identity Provider for SPID, the Italian Digital Identity Scheme. Moreover, in the field of distributed identity based on blockchain technologies, we will support the consortium with the know-how developed through DIZME, a decentralized digital identity platform, conceived by InfoCert, which combines the world of Self-Sovereign Identity with eIDAS regulatory compliance. In conclusion, IMPULSE is an important opportunity, even more so if the results help identify new potential areas of application for Digital Identity at the European level, beyond the public sector.”

Carmine Auletta – Chief Strategy & Innovation Officer at InfoCert
AI and blockchain for electronic identification processes

In a world where it is increasingly common to carry out all kinds of processes online, avoiding fraud and deception is only possible with first-class technology. Funded under the call Transformative impact of disruptive technologies in public services (DT- TRANSFORMATIONS-02-2018-2019-2020), IMPULSE will develop a system to respond to the needs of both citizens and public servants in digital processes.

IMPULSE’s innovation focuses on combining two of the most promising technologies available today: Artificial Intelligence and Blockchain networks. The project’s goal is to improve the management of digital identity and electronic identification in the public sector. Today’s technology has the potential to overcome these challenges, and its application in the field of digital identity will substantially improve existing electronic identification systems, even as legal, privacy, and social issues requiring further analysis arise.

Most citizens have an electronic identity card for authenticating with the online services offered by the administration; however, its use is cumbersome, due in large part to the lack of user-friendly interfaces for the average citizen. To overcome this obstacle, IMPULSE will incorporate advanced face biometrics and document validation techniques based on AI to facilitate identification processes and provide the user with a fully transparent digital onboarding experience.

In addition, blockchain technology and the use of smart contracts will add trustworthiness to the process, providing mechanisms for users to demonstrate their identity without disclosing their personal data to third parties, who are a priori considered less reliable. In this way the citizen maintains total control of their data and can verify at any time how that data is being used.
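The announcement doesn’t specify IMPULSE’s protocol, but the general idea of proving a fact about yourself without handing over the underlying data can be illustrated with a minimal hash-commitment sketch. This is purely illustrative: production systems of the kind described use verifiable credentials and zero-knowledge proofs, not bare hashes.

```python
import hashlib
import secrets

def commit(attribute: str) -> tuple[str, str]:
    """Commit to an attribute value; only the commitment would go on-chain."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{attribute}".encode()).hexdigest()
    return digest, nonce

def verify(commitment: str, attribute: str, nonce: str) -> bool:
    """The user reveals (attribute, nonce) only to a chosen verifier, who
    checks the pair against the public commitment."""
    return hashlib.sha256(f"{nonce}:{attribute}".encode()).hexdigest() == commitment

onchain_commitment, nonce = commit("over_18=true")
assert verify(onchain_commitment, "over_18=true", nonce)
```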

16 high-profile partners

The consortium, made up of partners from Spain, Italy, Austria, Bulgaria, Germany, Finland, Iceland, France and Denmark, is funded by the European Union with €4 million under the Horizon 2020 programme. Partners from Italy will receive a total of €976,625 in funding. The complete list of partners is included below.

Gradiant (coordinator – Spain)
Lappeenrannan–Lahden Teknillinen Yliopisto (Finland)
Agency for European Integration and Economic Development (Austria)
Association du Pole de Competitivite Transactions Electroniques Securisees – DIH (France)
Aarhus Municipality (Denmark)
Departamento de seguridad del Gobierno Vasco (Spain)
Gijón City Council (Spain)
Municipality of Peshtera (Bulgaria)
City of Reykjavik, Department of Services and Operations (Iceland)
Unione italiana delle Camere di commercio, industria, artigianato e agricoltura (Italy)
CyberEthics Lab. Srls (Italy)
ALiCE Biometrics (Spain)
Fraunhofer Institute for Systems and Innovation Research (Germany)
Tree Technology SA (Spain)
Infocert S.p.A. (Italy)
DIN Deutsches Institut für Normung e. V. (Germany)

About CyberEthics Lab.

With 10 currently active projects in diverse technological domains, CyberEthics Lab. is a well-established company on the European research scene. This Italian SME leverages the experience and knowledge of its multidisciplinary core members, who teach at university level and work on numerous R&D activities in the context of interdisciplinary research projects. CyberEthics Lab.’s main focus is a holistic vision of innovation, generated by efficiency, scientific curiosity and respect for human beings, in which ideas, techniques, tools and methods from different disciplines are integrated to make innovative, secure and responsible technology. Its offering centres on promoting innovation, creating ethical awareness and promoting ethical behaviour to build confidence and trust, embedding legal and ethical concerns in the design and implementation of technologies, identifying risks to individual fundamental rights, and issuing recommendations to make technologies compliant with ethical principles and the current legal framework. CyberEthics Lab. members also have long experience in technology implementation, with expertise in disruptive technologies such as blockchain and smart contracts (specifically addressed in the PoSeID-on and PHOENIX projects as well).

Website: cyberethicslab.com | LinkedIn: linkedin.com/company/cyberethics-lab | Twitter: twitter.com/CyberethicsLab

Main role in IMPULSE

CyberEthics Lab. will support the entire consortium on ethical and legal concerns, issuing initial guidelines for the research process, leading the completion of the Ethics and Data Management Plan, and contributing to the assessment of the IMPULSE technologies with its own methodology. As a result, CyberEthics Lab. will conduct a multidisciplinary analysis of standards and of legal and ethical implications. Drawing on its previous experience, CyberEthics Lab. will also act as the technology provider of Smart Contract templates for enforcing data subjects’ privacy rights.

About InfoCert

InfoCert, Tinexta Group, is the largest European Certification Authority, active in over twenty countries. The company provides services in digitisation, eDelivery, Digital Signature and digital preservation of documents and is an accredited AgID digital identity operator in the area of SPID (Public System for the management of Digital Identity). InfoCert invests significantly in research and development and in quality: it holds a substantial number of patents, while its ISO 9001, 27001 and 20000 quality certifications bear witness to the highest level of commitment in service provision and security management. The InfoCert Information Security Management System is ISO/IEC 27001:2013 certified for EA:33-35 activities. InfoCert is a European leader in the provision of Digital Trust services that are fully conformant with the requirements of the eIDAS Regulation (EU Regulation 910/2014) and the ETSI EN 319 401 standard. The company aims to expand internationally, including through acquisitions: it holds a 51% stake in Camerfirma, one of the principal Spanish certification authorities, and a 16.7% stake in Authada, a leading-edge German Identity Provider. Finally, InfoCert owns an 80% shareholding in Sixtema SpA, the technological partner of the CNA (National Confederation of Italian Artisans and Craftsmen and Small and Medium-Sized Enterprises), which provides technological solutions and consultancy services to SMEs, traders’ associations, financial intermediaries, professional firms and entities.

Website: infocert.it | Enterprise solutions: infocert.digital | LinkedIn: linkedin.com/company/infocert | Twitter: twitter.com/InfoCert_it | Facebook: facebook.com/InfoCertSpA

Main role in IMPULSE

InfoCert is an authorized Trust Service Provider (TSP), a role critical to the IMPULSE project. In addition, InfoCert will oversee innovation and exploitation management and is charged with disseminating the project’s findings. Moreover, InfoCert will contribute to the analysis and implementation of the trust components of the project, including the Decentralised Public Key Infrastructure (DPKI), the CA and the RA.

The post IMPULSE: European alliance to facilitate online identification in public services appeared first on InfoCert.digital.


auth0

Creating Great Passwords

A closer look at the world of passwords: tricks, tips, tools, and other resources for choosing a good password in your apps.
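The linked article covers the tips themselves; purely as an illustration of one common recommendation (let a tool generate long, random secrets instead of inventing them yourself), here is a minimal sketch using Python’s standard library:

```python
import secrets
import string

def random_password(length: int = 16) -> str:
    """A uniformly random password over letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def passphrase(words: list[str], count: int = 5) -> str:
    """A diceware-style passphrase: easier to remember at comparable entropy.
    The word list here is a toy; real generators use lists of thousands of words."""
    return "-".join(secrets.choice(words) for _ in range(count))

print(random_password())
print(passphrase(["correct", "horse", "battery", "staple", "orbit", "lamp"]))
```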

Coinfirm

Crypto Custodian Tangany Partners With AML/CFT RegTech Coinfirm

25th February, LONDON, UK – Today Tangany and Coinfirm announce their partnership to provide strengthened risk management and AML/CFT (Anti-Money Laundering/Combatting the Finance of Terrorism) compliance for the better safeguarding of blockchain-based asset security. By leveraging Coinfirm’s AML Platform, Tangany’s custodian operations will be in sync with current and future guidelines for crypto assets (i.e.,...



Ockto

SIVI and Ockto take customer data registration a step further


Customer data plays an important role in the financial advice that a financial adviser provides. To ensure this data can be recorded unambiguously, SIVI is introducing a mapping of the Ockto Data Model to the SIVI All Finance Data Catalogue (AFD).

Financial advice certainly does not concern insurance data alone, but also data on income, pensions, taxes, mortgages and leases. Besides appearing in documents, this information is increasingly available as data. Using Ockto, advisers can ask their clients to retrieve their personal data directly from the source systems and share it with them.

More and more parties use the SIVI All Finance Data Catalogue (AFD) not only for sending messages, but also for recording data within modern database structures. This was an important reason for the recent introduction of the renewed SIVI AFS standard. Within SIVI AFS, the complete customer file can be represented at the data level, allowing software applications to record and query this customer data unambiguously.

To support this recording of data within the customer file in day-to-day practice, SIVI is now introducing the mapping of the Ockto Data Model to the SIVI All Finance Data Catalogue (AFD). This Ockto-AFD mapping makes it easier for software developers to import the data from the Ockto message into the customer file. The Ockto-AFD mapping is initially available as C# code; later this year it will also become available in the form of an API. The Ockto-AFD mapping will follow the releases of the Ockto Data Model.
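SIVI ships the actual mapping as C# code; purely to illustrate the idea (a translation table from Ockto message fields to AFD attribute names), here is a minimal sketch in Python. The field names on both sides are invented, not taken from either data model:

```python
# Both sides of this table are invented field names -- the real Ockto-AFD
# mapping is published by SIVI, initially as C# code.
OCKTO_TO_AFD = {
    "inkomen.brutoJaarsalaris": "AFD.Income.GrossAnnualSalary",
    "pensioen.opgebouwdBedrag": "AFD.Pension.AccruedAmount",
    "hypotheek.restschuld": "AFD.Mortgage.OutstandingDebt",
}

def map_ockto_message(ockto_fields: dict) -> dict:
    """Translate a flat Ockto message into AFD attributes, skipping unmapped fields."""
    return {OCKTO_TO_AFD[k]: v for k, v in ockto_fields.items() if k in OCKTO_TO_AFD}

print(map_ockto_message({"inkomen.brutoJaarsalaris": 52000, "onbekend.veld": 1}))
# {'AFD.Income.GrossAnnualSalary': 52000}
```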

Peter Mols, Director of SIVI: “The Ockto-AFD mapping is the first in a series of mappings, covering areas such as income, pensions and mortgages, that SIVI will release within the renewed SIVI AFS standard. With this we respond to the growing need to record customer profiles unambiguously. By offering these mappings centrally, we lower costs across the chain.”

Paul Janssen, Product Director at Ockto: “Ockto is glad to align with the SIVI standards. This mapping between Ockto and the AFD supports the unambiguous recording of data within the customer file and facilitates the growing use of source data. Our collaboration with SIVI fits well with our aim of making the handling of customer data from source systems easier for everyone.”

The post SIVI and Ockto take customer data registration a step further appeared first on Ockto.

Wednesday, 24. February 2021

ShareRing

ShareRing (SHR) Lists on Australia’s Crypto Trading Platform – Coinspot


ShareRing has been listed on Coinspot, Australia’s most popular cryptocurrency trading platform. We are pleased to announce that on Tuesday, February 23rd, ShareRing (SHR) has...

The post ShareRing (SHR) Lists on Australia’s Crypto Trading Platform – Coinspot appeared first on Official ShareRing Blog.


My Life Digital

Attitudes to Personal Data Management


In recent years, personal data has been an increasingly popular topic of conversation for marketers, data analysts, regulators, and privacy warriors. Individuals have learnt that recent regulatory updates have given them more rights over how that data is used. Are these two forces aligned?

We distributed a survey and received over 400 responses from both individuals and organisations answering questions about the management of personal data. How aligned are the two points of view? This infographic shows a summary of key questions and responses.

 

Get the full report for free

 

Download our free Attitudes to Data Management Report 2021 to receive:

- Key results of the two surveys mentioned above.
- Correlations and differences between the two audiences.
- Recommendations on how to ensure your organisation’s practices align more closely with consumer expectations, while also enabling you to meet legal and ethical standards.

The post Attitudes to Personal Data Management appeared first on MyLife Digital.


Evernym

Integrating SAP SuccessFactors with Evernym’s Verity SSI Platform


SAP shows how SSI makes portable, user-centric identity possible by integrating their SuccessFactors HCM platform with Verity.

The post Integrating SAP SuccessFactors with Evernym’s Verity SSI Platform appeared first on Evernym.


Fission

The Price Associated With Free Applications


Why You Should Pay For Apps and How It Can Protect Your Personal Data.

It’s “Free!” That’s Amazing! Who doesn’t love something that comes with no cost? For small businesses, hobbyists, startups and everyday people, I think it’s fair to say we all get excited when we hear this. In fact, in a world filled with social media, apps and games on our phones, perhaps we even expect it--but what is the hidden cost to you?

If you have seen The Social Dilemma then you might be familiar with this quote:
“If you’re not paying for the product, then you are the product” — Daniel Hövermann

The hard fact here is that these “free” apps (ad-supported products) profit from collecting your data. They also make money from your attention through your clicks, likes and screen time. The real product for sale in all of these cases is your attention and subsequent interactions while logged in.

Here’s a simple example of how you become the product: in advertising-supported products, ads are presented to you. These advertisements are curated based on the cookies you agree to in your web browser, as well as other behaviours you exhibit within the app itself.

Let’s break this down a bit more. You’re scrolling Facebook when you see a “post” for something you were recently shopping for online. This is called an impression, and the app/platform gets paid by the advertiser simply for it coming into your view. Now say you click on that ad. Facebook receives another payment for your behaviour, in this instance for your click. This might sound familiar given what we all know about Facebook, but this kind of advertising is rampant in our world. A recent article from the website MLSDev states that currently, “7 out of 10 apps contain embedded advertisements”. This seems innocent at first glance, but who can advertise on these platforms is largely unregulated and monitored primarily by algorithms--algorithms that many people, including those who work at these companies, don’t completely understand.
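As a rough illustration of the mechanics described above (the platform is paid once per impression and again per click), with entirely made-up rates:

```python
def ad_revenue(impressions: int, clicks: int,
               cpm: float = 2.00, cpc: float = 0.50) -> float:
    """Platform revenue: CPM is the price per 1,000 impressions, CPC the price
    per click. Both rates here are invented for illustration."""
    return impressions / 1000 * cpm + clicks * cpc

# 1M impressions at a 1% click-through rate:
print(ad_revenue(1_000_000, 10_000))  # 2000.0 from views + 5000.0 from clicks = 7000.0
```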

Using ads to provide free apps means that these companies are collecting information about you. This information is collected via your behaviour, views, and interactions, and the harvested information is then used to infer demographic details about you: where you live, your age, whether you have kids, and so on. All of this is inferred simply from the behaviours you exhibit online.

At Fission, we are on a mission to normalize paying for apps for this very reason, and we believe that a movement towards paying for apps, even at nominal rates, would bring about greater protection for you, the user.

What happens when you pay for your app:

- Your data becomes more secure, and we begin to move towards a more ethical and humane technological future.
- Your behaviour and attention are not harvested and sold.
- Your privacy is better protected, because the profitability of the application you are using is not contingent on your screen time, attention, and personal information for targeted advertisements.
- Charging for apps can improve the standard of living for app developers outside of North America and Western Europe. For example: you publish an app and promote it, developing a user base of 500 users. If those 500 users pay $50 for that app, we’re talking about a $25,000 USD salary for that developer--substantial for an independent developer in most parts of the world, especially in developing countries (see the sketch below).
- Payment places value on the work. As Derek Sivers writes, “Psychology experiments have shown that the more people pay for something, the more they value it.”
- Developers should be paid ethically, without distributing their users’ data for financial gain to largely unregulated advertisers.
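The arithmetic in the example above, generalized into a one-liner:

```python
def developer_income(users: int, price: float) -> float:
    """Gross income from a one-time paid app, before store fees and taxes."""
    return users * price

print(developer_income(500, 50.0))  # 25000.0 -- the article's example
```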

Let’s think about this. Would you be comfortable telling a stranger you met on the street where you live, how many kids you have, where you went to high school, and whether or not you’re married? Yet we readily volunteer this information online every day through ad-supported apps like social media, productivity apps and games. Sounds insane, right?

The future is not as abysmal as it might seem. While technology is bound to proliferate, there are ways to prevent the dark picture painted above. The first step of many is to pay for the applications and software you use.

At Fission this is a driving focus for us. We believe the protection and privacy of our users is essential--that’s why we feel so strongly about our mission. By supporting a movement to normalize paying for apps, we aim to be part of the solution. We encrypt the data of every end-user, while giving developers the tools to build great apps, and we will eventually include the ability to easily charge for them.

For end-users wanting to try out some of these apps, you can start by logging into Fission’s Drive app. It’s free for small amounts of data, and you can pay for a premium account on Open Collective, which helps us build more apps and tools for developers.

For developers, check out some of the App ideas in the forum, or start with the dev guide to build an app of your own.

References

The Social Dilemma. Directed by Jeff Orlowski. With Tristan Harris, Jeff Seibert, Justin Rosenstein, and Tim Kendall. Netflix, 2020. Film.

Catherine Han, Irwin Reyes. “Do You Get What You Pay For? Comparing the Privacy Behaviours of Free vs Paid Apps.” UC Berkeley, International Computer Science Institute, University of Calgary, IMDEA Networks Institute, Universidad Carlos III de Madrid. 2017.

MLSDev. (2021) How Do Free Apps Make Money?
Retrieved from: https://mlsdev.com/blog/how-do-free-apps-make-money

Derek Sivers.  (2018) The Higher The Price, The More They Value It.
Retrieved from: https://sive.rs/morepay#:~:text=Psychology%20experiments%20have%20shown%20that,the%20more%20they%20value%20it.&text=When%20people%20want%20the%20best,the%20expensive%20wine%20tastes%20better.

Space O Technologies. (2021) How Do Free Apps Make Money
Retrieved from: https://www.spaceotechnologies.com/how-do-free-apps-make-money/


Finicity

Finledger: Concerns about COVID-19 credit crisis aren’t going away anytime soon


It may be that the worst is yet to come regarding a credit crisis and long-term economic hardship. According to a survey of 2,000 U.S. consumers Finicity conducted last November, people were just as concerned about their credit in the wake of job losses or financial hardship near the end of 2020 as they were during the initial onset of the pandemic. In fact, nearly two-thirds (65%) of respondents said they are concerned their credit score will go down in the next six months because of the pandemic.

Read the full article.

The post Finledger: Concerns about COVID-19 credit crisis aren’t going away anytime soon appeared first on Finicity.


Fintec Buzz: Fintech Interview with CEO and co-founder, Finicity – Steve Smith


Finicity CEO and Co-founder Steve Smith discusses open banking and data strategies and how to continue to drive innovation forward in the fintech industry.

Read the full interview. 

The post Fintec Buzz: Fintech Interview with CEO and co-founder, Finicity – Steve Smith appeared first on Finicity.


IBM Blockchain

Blockchain tokenization in enterprises and beyond


Blockchain tokens are the digital representation of complete or shared ownership in anything of value. Blockchain tokens are commonly leveraged in payments and settlements between participants. The tokens also enable representation of multi-party ownership of an indivisible asset, such as a work of art, and ease the exchange of such ownership between parties in a […]
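The excerpt above is cut off, but the core idea it introduces (a token tracking fractional ownership of an indivisible asset, with shares exchanged between parties) can be sketched minimally. This is an illustration of the concept, not IBM’s implementation:

```python
class FractionalAsset:
    """Tracks fractional ownership shares of a single indivisible asset,
    such as a work of art. Shares always sum to 1."""

    def __init__(self, owners: dict[str, float]):
        assert abs(sum(owners.values()) - 1.0) < 1e-9, "shares must sum to 1"
        self.owners = owners

    def transfer(self, sender: str, receiver: str, share: float) -> None:
        """Move a fraction of ownership from sender to receiver."""
        if self.owners.get(sender, 0.0) < share:
            raise ValueError("insufficient share")
        self.owners[sender] -= share
        self.owners[receiver] = self.owners.get(receiver, 0.0) + share

art = FractionalAsset({"alice": 0.6, "bob": 0.4})
art.transfer("alice", "carol", 0.1)
print(art.owners)  # {'alice': 0.5, 'bob': 0.4, 'carol': 0.1}
```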

The post Blockchain tokenization in enterprises and beyond appeared first on Blockchain Pulse: IBM Blockchain Blog.


Trinsic (was streetcred)

An Introduction to Trinsic Studio


Trinsic Studio is an easy-to-use portal that allows developers to easily manage their integrations with Trinsic’s APIs. With it, developers can manage their credentials, API keys, verification policies, and more—all in one place. However, the Studio is not just for technically-savvy people. From the Studio, you can issue verifiable credentials, perform verifications, and establish connections in minutes without writing any code. It truly is the best no-code tool for verifiable credential exchange.

 

Instead of spending weeks of development time or money on learning the ins and outs of self-sovereign identity (SSI), you can sign up for a free Trinsic Studio account and issue your first verifiable credential in minutes! Learn more about what you can do with Trinsic Studio below.

What can you do with Trinsic Studio?

Create Organizations

Before issuing or verifying any credential, you must create an Organization in Trinsic Studio. When you create an Organization, you are creating a credential issuer and/or verifier. The number of Organizations you can create depends on your Trinsic Studio plan, so check out our pricing page to see how the plans differ.

Issue credentials

When we say that issuing verifiable credentials using Trinsic Studio takes only minutes, we mean it. The Studio guides you step-by-step on how to create a credential template, so you can control what information is included in a certain type of credential. You can also enable revocation if necessary for your use case. Once a credential template is created, you can start issuing that type of credential to anyone. Trinsic Studio gives you the option to send the credential by connection, email, or QR code/link. Check out this tutorial on how to issue credentials in Trinsic Studio.
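Everything the Studio does by click is also exposed through Trinsic’s APIs. Purely as a sketch of what issuing from an existing template might look like, with an assumed endpoint path, auth header, and payload shape rather than the documented interface:

```python
import requests

API_KEY = "<organization-api-key>"  # issued per Organization in the Studio

def issue_credential(definition_id: str, email: str, values: dict) -> dict:
    """Issue a credential from an existing template and deliver it by email.
    The path and field names are assumptions for illustration; consult
    Trinsic's API reference for the real interface."""
    resp = requests.post(
        "https://api.trinsic.id/credentials",  # assumed endpoint
        headers={"Authorization": API_KEY},
        json={"definitionId": definition_id, "email": email, "values": values},
    )
    resp.raise_for_status()
    return resp.json()

issue_credential("employment-cert", "alice@example.com",
                 {"name": "Alice", "role": "Engineer"})
```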

Verify credentials

Verifying credentials is just as easy with Trinsic Studio as issuing them. To verify a credential, a verification template must first be created. Verification templates specify the exact information an individual needs to share in order for the verification to be successful. Trinsic Studio even gives users the flexibility to define verification templates that utilize zero-knowledge proof (ZKP) technology. Once a verification template has been created, you can use Trinsic Studio to send a verification request by connection, email, or QR code/link. Learn how to verify credentials in Trinsic Studio using this tutorial.

Make connections

Connections are used in SSI to create a secure, peer-to-peer communication channel between you and another party. With this connection, you can issue, verify, and share verifiable credentials without fear of man-in-the-middle attacks. With Trinsic Studio, creating a connection with someone else is as easy as clicking a button to generate a QR code and sending that QR code to the other party to scan with an SSI digital wallet (like the Trinsic Wallet). Learn more about how to create a connection in Trinsic Studio through this tutorial.

White-label the Studio

Trinsic Studio can now be white-labeled with your branding. This highly requested feature is available to our enterprise customers. If you are interested in learning more about white-labeling the Studio, please contact us here.

Sign up for free

More than 1,000 developers from organizations around the world have used Trinsic Studio to manage their verifiable credential exchange. Try Trinsic Studio for yourself today by signing up for a free account. If you are past the exploration stage and ready to build an enterprise-grade credential ecosystem, our pricing is based on the number of credentials issued and verified, so you only pay for what you need.

 

For more support, check out our documentation which includes explanations of SSI terminology and concepts, tutorials, our API Reference, and a community forum. And as always, feel free to reach out to support@trinsic.id if you have any questions.

The post An Introduction to Trinsic Studio appeared first on Trinsic.


Provenance

How Provenance’s Transparency Framework helps beauty brands communicate product impact


The beauty industry isn’t good at opening up when it comes to ingredients and impact. But the ‘values shopper’ cohort is growing, and brands need to step up to meet their expectations. According to the British Beauty Council’s Courage To Change report, 86% of beauty shoppers want information about ingredient supply chains. Brands are falling behind these customer expectations: 1 in 5 customers don’t know how to check a product’s sustainability credentials.

Supply chain transparency is a difficult issue for brands. At Provenance, however, we believe they have a responsibility to open up about impact. And, with customer expectations for transparency growing fast, those who do will benefit. We believe every product should come with accessible, trustworthy information about origin, journey and impact. Through Provenance’s Transparency Framework, we’re helping beauty brands deliver exactly this. 

What’s the Transparency Framework?

Our Transparency Framework is an open-source framework that helps companies communicate the impact of their business and supply chains with integrity. The framework is benchmarked to international standards, verifiable through proof and backed up by a council of expert advisors. It allows brands to turn supply chain data into information shoppers can understand, accessible through off-pack and online channels.

As part of the Transparency Framework process, we work with brands to develop a comprehensive understanding of a product’s impact throughout the supply chain. This covers a range of issues relating to a business’s impact, from carbon impact to animal welfare issues. The framework then allows brands to communicate the impact through Proof Points, which connect claims around impact to real data in the supply chain.
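The post doesn’t publish the Proof Point schema, but the idea (a product claim bound to supply-chain evidence and, optionally, an independent verifier) can be modeled in a few lines. Every field name below is invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProofPoint:
    """Illustrative model: a product claim tied to evidence and its verifier."""
    claim: str                         # e.g. "packaging is >75% recycled material"
    impact_area: str                   # e.g. "waste", "carbon", "community"
    evidence_url: str                  # link to supply-chain data or test results
    verified_by: Optional[str] = None  # independent third party, if any

pp = ProofPoint(
    claim="contributes to the community where the hero ingredient is sourced",
    impact_area="community",
    evidence_url="https://example.com/orange-blossom-project",  # placeholder URL
    verified_by="NEST",
)
```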

The Transparency Framework is benchmarked to international standards

What does it mean for beauty brands?

1: Unlock the truth around ingredients

Beauty shoppers are used to a lack of information around ingredients. When ingredient lists are disclosed, they’re often difficult to understand, making it hard for shoppers to make an informed decision.

Our Transparency Framework is enabling brands to be open with customers about ingredients and their impact on people and the planet. Through Proof Points, brands can clearly communicate proven claims around their ingredients, so that shoppers can compare products based on facts.

The framework in action: Provenance recently worked with an Italian brand that was marketing sunscreen as Coral Reef Safe. Thanks to the Transparency Framework process, they learned that their formulation in fact contained an ingredient not considered to be Coral Reef Safe. The brand immediately stopped making this claim and is currently taking steps to address the issue.

With Provenance, beauty brands can make proof-backed claims across a range of impact areas

2: Add clarity to clinical claims

Beauty shoppers are used to brands marketing their products as ‘clinically tested’. But unfortunately, not all claims adhere to a consistent standard, and shoppers are generally left in the dark about the nature of these tests.

Through our Transparency Framework, brands can get specific with their ‘clinically tested’ claims. The framework allows brands to share the subject, method and results of the test in question. Shoppers can then access this information online, by clicking on a Provenance Claim Capsule, or in-store, by scanning a QR code.
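As a concrete illustration of the QR mechanism (not Provenance’s actual tooling), an on-pack code that deep-links to a product’s information page can be generated in a few lines of Python; the URL below is a hypothetical placeholder:

```python
# Illustrative sketch only: generate an on-pack QR code that deep-links to a
# product information page. Requires the "qrcode" package (pip install qrcode[pil]).
# The URL is a hypothetical placeholder, not a real Provenance endpoint.
import qrcode

proof_point_url = "https://example.com/products/sku-12345/proof-points"  # hypothetical
img = qrcode.make(proof_point_url)  # returns a PIL image of the QR code
img.save("proof-point-qr.png")      # ready to drop into packaging artwork
```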

The framework in action: Lumity is a cosmetics brand that sells ‘clinically tested’ Day & Night Nutritional Supplements. With Provenance Proof Points, their customers can see a detailed summary of these clinical tests with just a click. Lumity’s customers can see that “a placebo-controlled, double-blind, two-cell clinical trial was conducted on 50 female subjects, aged 35-65, for 12 weeks.” Crucially, they can also see the positive results: “users saw significant improvements in hair, skin, nails & quality of life.”

3: Communicate with confidence on product recyclability

Beauty brands can add credibility to their recyclability claims with verified Proof Points

British Beauty Council research found that unrecyclable packaging is among consumers’ top environmental concerns – and little wonder, given that the global cosmetics industry produces more than 120 billion units of packaging every year.

However, it’s a complex issue and many brands struggle to know what they can and can’t claim. Through the Transparency Framework, we’re helping brands develop a comprehensive understanding of the recyclability of their packaging, so that they can communicate confidently around this key consumer issue.

The framework in action: We recently worked with a haircare brand called Centred. Through the Transparency Framework, they were able to collate comprehensive packaging information from their supplier for a recyclability Proof Point. In the process, they discovered that parts of their packaging were actually made from recycled plastic. The packaging wasn’t just recyclable: more than 75% of it was made from recycled materials, which the brand could communicate to shoppers through an additional Proof Point.

4: Build trust with third-party proof

Sana Jardin’s claim of supporting local communities in their supply chain is verified by NEST

According to a Compare Ethics report, just 1 in 5 shoppers trust brands’ sustainability claims. It’s perhaps understandable that large brands are often guilty of ‘greenhush’ and reluctant to communicate progress for fear of backlash.

Beauty brands can tackle this problem by working with independent third parties to verify their claims. The Transparency Framework helps brands add weight to claims and build consumer trust by embedding these verifications within the off-pack and online shopping experience. In practice, this lets brands highlight the positive impact they are having in the supply chain, whether that’s by working with communities or reusing byproducts.

The framework in action: The socially conscious fragrance brand Sana Jardin claims to “contribute to development of the community where our hero ingredient is sourced”. Through the Transparency Framework, the brand is able to evidence their actions: the Orange Blossom Project which they helped form has led to a 136% increase in local women’s wages and the re-purposing of floral waste. Sana Jardin had this claim publicly verified by NEST, a nonprofit focused on responsible handcraft.


Are you a beauty brand manager? We’d love to talk with you about how our Transparency Framework can help you communicate with credibility around product impact. Get in touch.

The post How Provenance’s Transparency Framework helps beauty brands communicate product impact appeared first on Provenance News.


Ontology

Ontology Weekly Report (February 16th- 22nd, 2021)


Highlights

It was a very exciting week for Ontology as the ONTO team launched their new wallet: ONTO Web, which is the first cross-chain web wallet. Ontology also successfully executed the “ONTO Red Packets For You & Me” and “Open ONTO with me” campaigns.

Meanwhile, Ontology, as Binance Smart Chain’s only KYC and identity solution provider, has introduced the solution in detail.

Latest Developments

Development Progress

We have completed 40% of the Ontology EVM-integrated design, which will make Ontology fully compatible with the Ethereum smart contract ecosystem once complete.

Product Development

Our campaign of “ONTO Red Packets For You & Me” launched last week. This allowed users to send Red Packets of ONG or WING on ONTO for a 10,000 ONG bonus. ONTO launched a two-week “Open ONTO with me” campaign, which was a tremendous success and attracted nearly 1,000 users.

dApps

- 112 dApps launched in total on MainNet.
- 6,420,482 dApp transactions completed in total on MainNet.
- 22,744 dApp-related transactions in the past week.

Community Growth

Another week of growth for the community. This week we onboarded 1,618 new members across Ontology’s global communities. As always we welcome anyone who is interested in Ontology to join us!

Global Events

Ontology — The ONLY Decentralized Identity Partner for Binance Smart Chain. Ontology and Binance have a long history of working closely together — a fruitful, mutually beneficial cooperation. The integration of Ontology’s Decentralized Identity Solution into the Binance Smart Chain is the latest in our collaborative efforts, further strengthening the ties between the two companies. The symbiotic relationship sees Ontology and ONT ID as the sole partner for BSC in terms of providing a truly decentralized identity option and KYC user verification.

Li Jun, Founder of Ontology, was interviewed by CointelegraphCN and shared his thoughts on the globalization of public chains.

Industry News

Data Regulation In 2021: More Small Steps Or A Giant Leap?

By Li Jun, Founder of Ontology

We are living through an activist period of data regulation that will dictate how our economies function for generations to come. Data regulation is being designed from first principles before our very eyes. It draws from notions of individual privacy, and is heavily influenced by world events from national security regulation following 9/11 to a reckoning on data collection in the aftermath of the 2016 US presidential election. Technical considerations are also feeding into regulation, in particular the ability of machine learning technology to help with medical treatment or predict crime using ever larger data sets.

ONT ID is a framework for decentralized identity, built on the Ontology blockchain and integrated with the Binance Smart Chain. Any user, developer, or project launching STOs will have access to fully integrate the benefits of ONT ID and, with it, decentralized identity into their existing developments. We are looking forward to witnessing the wider use of ONT ID, providing any entity with a digital identity that the entity controls through decentralization.
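For readers new to DIDs, the sketch below shows roughly what a W3C-style DID document for the did:ont method looks like, held as a Python dict; the identifier, key type, and key material are illustrative placeholders, not the output of a real ONT ID resolver:

```python
# Rough, illustrative shape of a W3C DID document for the did:ont method.
# All values below are placeholders; consult the ONT ID documentation for
# the exact fields and key types Ontology uses.
did = "did:ont:AXkDGHk9zqUF1AkKcWcmPbGGuA3y1CjWbz"  # hypothetical identifier

did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": did,
    "verificationMethod": [{
        "id": f"{did}#keys-1",
        "type": "EcdsaSecp256r1VerificationKey2019",  # placeholder key type
        "controller": did,
        "publicKeyHex": "02b9...",                    # placeholder key material
    }],
    # The key above lets the subject authenticate as this DID, e.g. for KYC checks.
    "authentication": [f"{did}#keys-1"],
}
```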

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (February 16th- 22nd, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


UbiSecure

How higher education should leverage digital transformation with IAM


Being digital first with a robust digital journey is more than just a competitive advantage for educational institutions – it’s fundamental. The diversity of the user base and IT budget restrictions, along with an absolute need for a frictionless flow across services, call for a well-thought-out identity and access management (IAM) solution. Here is how higher education should take advantage of digital transformation with IAM.

Dynamic identity lifecycle management

Educational institutions have many different user groups, with different roles, moving through their various systems.

- Students – from enrolment, through the student lifecycle, all the way to becoming an alumnus
- Staff – regular staff accessing systems based on their roles
- Visitors – visiting lecturers and students, short-term international students, and contractors needing temporary access rights

Managing these identities without a dynamic identity management solution means time wasted on repetitive operational tasks, risks serious delays in enrolling new faculty or students, and ultimately leads to a poor user experience. A good identity management solution simplifies the creation, migration and storage of these user identities and ensures a frictionless, any-time, any-place journey throughout the identity lifecycle.

Cut down costs with good user experience

As well as various user groups, higher education institutes typically have various services. By enabling single sign-on (SSO), users benefit from a simplified login, using one identity across all digital services and applications – which would otherwise call for admin-heavy tasks, draining IT support resources. By allowing end users to manage their own accounts and resolve issues themselves, such as passwords or ID management through self-service, the whole process becomes even quicker and more user-friendly. By giving power to the end user, institutions can reduce operational costs significantly while increasing overall user satisfaction.

Robust security

Security issues, such as data breaches, are becoming more common every day. Higher education institutes are processing a lot of personally identifiable information (PII), which calls for solid security measures. Developing a security strategy based on capabilities such as identity verification, multi-factor authentication and delegated authority prevents security threats and helps to deal with security challenges effectively. A good identity management solution makes sure your users are who they say they are and that they have the right access to authorised services.

Future-proof operations

Attracting new students and maintaining the quality and growth of digital services affordably is a key part of the competition for higher education institutions. Budgets and resources are constantly constrained, while demand for online services is growing fast – especially due to the pandemic. Having a digital-first mindset not only solves budgeting issues by cutting unnecessary costs, but can also make your institution more competitive by appealing to tech-savvy students. Enriching educational life online with good identity management and a truly secure, modern online experience is not a “nice to have”; it’s a must-have, and it makes your digitalisation project sustainably future-proof. Digitalisation can help your institution place first in the competition.

Start your digital transformation

Higher education institutes face many challenges that can be solved with a proper IAM solution. (See this blog from Intragen: 5 IAM Challenges in Higher Education). Being digital-first has always been a competitive advantage, but now it is key to business success. Ubisecure has almost two decades of experience in delivering proven digital ecosystems, and we’ve helped higher education institutions take full advantage of digitalisation through our Identity Platform.


WEBINAR: The Value of Identity and Access Management in Higher Education. Learn how IAM can accelerate digital transformation in Higher Education. March 11th 12:00 PM – 12:45 PM EET. Register now!

The post How higher education should leverage digital transformation with IAM appeared first on Ubisecure Customer Identity Management.


SelfKey

HKVAX Joins Selfkey Exchanges Marketplace


We’re delighted to announce that the Hong Kong Virtual Assets Exchange is the latest addition to the SelfKey Exchanges Marketplace.

The post HKVAX Joins Selfkey Exchanges Marketplace appeared first on SelfKey.


auth0

Authenticating Vuepress Apps with Auth0

Learn how to create your first VuePress site and add Authentication with Auth0

Metadium

Metadium’s plans for the future

Metadium plans for the future

Dear Metadium Community,

In the last post, we looked at Metadium’s business results in 2019. Now we would like to introduce the roadmap and key activity goals we have for the first half of 2020.

This year we’ll focus on creating demand for META coin and new use cases for Metadium blockchain and DID technologies. Metadium also plans to continue its active DID standardization activities in order to impact the DID (Decentralized Identifier) industry.

Main goals of Metadium in 2020

- Construction of a DID ecosystem and practical use cases for DID and blockchain technology
  - Launch THEPOL (new domestic service)
  - KAYBO KEEPIN (new overseas service)
  - Regulatory sandbox approval
  - Metadium SDK for Unity official launch
  - Launch MyKeepin app (DIDaaS)
- DID Standardization
  - Integrate the DID international standards recommended by DIF and W3C
  - ETRI joint project: Development of Interoperable Decentralized ID Platform

Let’s take a detailed look:

Construction of a DID ecosystem and practical use cases for DID and blockchain technology

1. Launch THEPOL (new domestic service)

THEPOL is an online voting, research, and petition service using Metadium’s blockchain and DID. All voting processes are recorded on the Metadium blockchain, enabling transparent opinion polls and petition campaigns. DID technology also ensures the anonymity of survey participants and one-person-one-vote integrity. THEPOL uses DID technology to store all of your identification information, so you don’t have to worry about leaking your personal data.

2. KAYBO KEEPIN

KAYBO KEEPIN is a Metadium DID-based app that provides personal authentication and digital wallets to KAYBO users in Latin America.

KAYBO is a game publishing platform operated by FHL Games that provides major game services such as Battlegrounds (PUBG) and Point Blank to its 20 million users in Latin America.

KAYBO KEEPIN is the result of Coinplug and FHL Games partnership and it’s being developed according to the W3C DID standard.

3. Regulatory sandbox approval

Coinplug, Metadium’s technological partner, is planning a regulatory sandbox for ‘Construction of youth age identification systems’ using the Metadium DID. To demonstrate the advantages and potential of DID technology, we are creating various certification business test cases where Metadium is used.

4. Metadium SDK for Unity

The Metadium team is planning an official launch by upgrading the alpha version of Metadium SDK for Unity. Unity is the world’s largest game engine and game developer community. With this launch, game developers can integrate Metadium SDK for Unity to easily apply DID solutions to their games. We are planning to unveil Metadium SDK for Unity in the Unity Asset Store.

5. MyKeepin

MyKeepin is a DID network led by Coinplug, a technological partner of Metadium. Companies and institutions that need DID solutions can join the DID ecosystem as partners in the MyKeepin program. The MyKeepin app will be launched in the second quarter.

DID Standardization

1. Integrate the DID international standards recommended by DIF and W3C

Metadium technology reflects the DID international standards presented by DIF and W3C. DIF’s Identity Hub is built on the Keepin app, and the Universal Resolver is being used to develop an interoperable DID platform.

2. ETRI joint project

At the request of the Korea MSIT (Ministry of Science and ICT), Coinplug is participating in the project conducted by IITP. The Metadium platform is being used for research and technology development. The ultimate goal of this project is to develop a platform that ensures privacy protection and interoperates with other platforms. The project includes Korea Post, KFTC, Koscom, and ICON, another blockchain platform.

Thank you for your support and trust in Metadium.

Metadium Team

Metadium’s plans for the future was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Announcement: Metadium Hard Fork on November 18

METADIUM Hard Fork

Dear Metadium community,

Our team would like to announce the upcoming hard fork and what it means to everybody in the Metadium sphere.

The hard fork is expected to happen at 00:00 UTC / 09:00 KST on November 18, 2020 at block 11441000. Please note that, due to the nature of block generation, this time is an approximation.

The upcoming Metadium hard fork is a stepping stone to the future of our blockchain. We are following Ethereum’s Istanbul hard fork because we want to upgrade and fix bugs, and also to ensure we have future access to the developments and upgrades of the Ethereum blockchain.

The hard fork was unanimously decided by the consortium that forms the Metadium blockchain. Please note that if you execute an action before block 11441000, there will be no interruption caused by the hard fork.

New Features: 6 EIP upgrades

- EIP-152: BLAKE2b hash function precompile
- EIP-1108: alt_bn128 precompile gas re-pricing
- EIP-1344: chain ID opcode to prevent replay attacks
- EIP-1884: reprices certain opcodes due to the rapid growth of state
- EIP-2028: reduces the gas cost of calldata (68 to 16 gas per byte)
- EIP-2200: net gas metering changes for the SSTORE opcode
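To illustrate what EIP-1344 enables in practice, here is a rough web3.py sketch that reads the chain ID from a node so transactions can be bound to the right chain; the RPC endpoint and expected chain ID are placeholders, not official Metadium values:

```python
# Rough sketch: confirm the chain ID exposed by a gmet node after the fork,
# so signed transactions are replay-protected across chains.
# The endpoint URL and EXPECTED_CHAIN_ID are placeholders -- check the
# official Metadium documentation for the real values.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example-metadium-node.com"))  # placeholder
EXPECTED_CHAIN_ID = 11  # placeholder; confirm against official docs

chain_id = w3.eth.chain_id  # served via the eth_chainId RPC enabled by EIP-1344 support
assert chain_id == EXPECTED_CHAIN_ID, f"unexpected chain ID: {chain_id}"
print(f"Connected to chain {chain_id}; transactions will be replay-protected.")
```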

Upgrade method:

- Option #1: Check out the latest code from the master branch and build your own gmet binary.
- Option #2: Linux builds are available at the official release link. Please make sure that the GitHub commit hash is the following: 5b383cfca560f491b8726976977fb35bd3dd7af5
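For Option #1, a small script along these lines can double-check that a local checkout sits at the announced commit before building; it assumes git is installed and the working directory is the cloned repository:

```python
# Sketch: verify a local gmet checkout is at the announced commit before building.
# Assumes git is on PATH and the current working directory is the cloned repo.
import subprocess

EXPECTED = "5b383cfca560f491b8726976977fb35bd3dd7af5"  # from the announcement above

head = subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()

if head == EXPECTED:
    print("Checkout matches the announced commit; safe to build gmet.")
else:
    raise SystemExit(f"HEAD is {head}, expected {EXPECTED}; do not build.")
```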

If you have any questions, please contact us through info@metadium.com

We would like to thank you for all your support and we are looking forward to Metadium’s next stage with a bigger impact in the industry.

Thank you for your support and trust in Metadium.

Metadium Team

Announcement: Metadium Hard Fork on November 18 was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinplug launched digital public service app ‘B Pass’ and ‘Citizen Safety Report’ smartphone app…

Coinplug launched digital public service app ‘B Pass’ and ‘Citizen Safety Report’ smartphone app based on Metadium Blockchain(DID)

As part of the Busan Blockchain Regulatory Free Zone project, Coinplug, a blockchain tech company, has launched a mobile identification service called ‘B Pass’ and the ‘Citizen Safety Report’ smartphone application for Busan citizens.

BPASS mobile application using Metadium DID

Both the ‘B Pass’ and ‘Citizen Safety Report’ apps are based on Metadium’s decentralized ID (DID) technology. With the ‘B Pass’ app, Busan citizens can use the Busan Citizenship Card, Haeundae-gu Resident Card, Busan City Hall visit card, CENTAP pass, Mobile Family Love card, mobile library membership card, and other public services based on non-face-to-face identification. They can also use ‘B Pass’ for various Busan Blockchain Regulatory Free Zone project services such as digital vouchers.

Thank you for your support and trust in Metadium.

Metadium Team

Coinplug launched digital public service app ‘B Pass’ and ‘Citizen Safety Report’ smartphone app… was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


THEPOL Launched

THEPOL is launched

On March 30th, Metadium announced the release of a voting and survey mobile application, “THEPOL”. It was released on the Korean App Store and Google Play Store. Users can create votes and surveys on the official THEPOL website.

The following are some of the main characteristics of THEPOL:

- Nobody can manipulate votes and surveys: since THEPOL records online voting and surveys on the Metadium blockchain, no one can manipulate voting or survey results, and anyone can conduct transparent petitions and surveys at a low cost.
- THEPOL ensures voter anonymity using DID: THEPOL uses Metadium’s DID (Decentralized ID) technology to ensure the “vote anonymity” that traditional online voting services couldn’t realize.
- META coin and badges (NFTs): a surveyor can pay META coin to respondents as a reward. By participating in a survey, respondents can obtain not only META coins but also rare badges (NFTs) issued on the Metadium blockchain.

THEPOL Launched was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Fraud information inquiry service launch

Metadium DID login application, ‘Mykeepin’ started to provide fraud information inquiry service


On May 27th, Metadium announced that the Metadium DID login application ‘MyKeepin’ started to provide a fraud information inquiry service. Users of the MyKeepin app can check data about fraud and illegal dealing. Organizations and companies can participate in the MyKeepin Alliance to share, validate, and leverage this data.

“MyKeepin” is an electronic signature and simple identification service built with Metadium blockchain DID and self-sovereign identity (SSI) technology. Through MyKeepin, users can directly experience non-face-to-face authentication and authentication requests without third-party identification service companies.

Fraud information inquiry service launch was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Busan-si launched public services using Metadium DID


Busan is the second largest city in Korea. Its deep harbor and gentle tides have allowed it to grow into the largest container handling port in the country and the fifth largest in the world. The city’s natural endowments and rich history have resulted in Busan’s increasing reputation as a world class city for tourism and culture, and it is also becoming renowned as a hot spot destination for international conventions.

As part of the Busan Blockchain Regulatory-Free Zone project, the Busan Metropolitan Government will launch the “Busan Blockchain Trial App”, a Metadium blockchain Decentralized Identifier (DID)-based mobile identification service for Busan citizens.

Through the app, Busan citizens can use various services such as Busan citizenship card, visiting Busan City Hall, multi-child family card, and mobile payment based on Metadium DID.

Busan-si launched public services using Metadium DID was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


‘Magic Eco’ to integrate Metadium DID

MYKEEPiN (Metadium DID-based app) is applied to the LMS (Learning Management System) used in the AI education of MagicEco

‘MagicEco’ operates AI (Artificial Intelligence) and IoT education programs for various college students and job seekers, including the University ICT Research Center (ITRC) of the Ministry of Science and Technology, Seoul Metropolitan Government, Hyundai Motor Group, and SK University, and specializes in technical education for about 10,000 people a year.

‘MyKeepin DID (Metadium blockchain)’ is applied to the LMS (Learning Management System) used in the AI education of ‘MagicEco’.

With the authentication technology of the MyKeepin DID application, we will protect users’ personal information and apply blockchain technology to attendance checks, electronic contracts, and learning history management. This is the first case of blockchain technology being applied to the education industry in Korea.

‘Magic Eco’ to integrate Metadium DID was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Partnership : CUBOX

New Metadium DID-based service

CUBOX specializes in AI-based face recognition terminals. The specialty further extends to iris and fingerprint recognition.

We are developing non-contact facial biometric authentication technology with CUBOX, utilizing Metadium DID (decentralized identifier).

Metadium Technology Inc. plans to present a new technology and business model that combines DID with non-contact biometric authentication. Metadium has developed the mobile visiting card for the Busan government building using DID technology, and this experience will be helpful in developing non-contact biometric authentication.

Partnership : CUBOX was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Metadium celebrates reaching more than one million META wallets


Greetings from the Metadium team,

We are happy to announce that over a million wallets have been created on the Metadium blockchain. This milestone was reached thanks to the successful development of multiple DID-related businesses in 2020. Some of these businesses are:

- THEPOL: a blockchain-based survey and poll platform using the Metadium blockchain. It uses Decentralized Identifiers (DIDs) to provide tamper-proof voting and public opinion polls, as well as guaranteeing voters’ anonymity and data protection. More than 1,000 surveys and polls have been created on THEPOL, with over 500,000 users.

- MYKEEPiN: an easy and secure identification service based on Metadium’s Decentralized Identifier (DID) technology and Self-Sovereign Identity (SSI). The service is designed so users can process digital identity authentication requests via credentials stored on MYKEEPiN without the intervention of third-party companies. Since its launch in May, through its 83 alliance companies, MYKEEPiN has already been implemented in multiple areas including education, AI, and more.

- MONEY TREE: a Metadium-based financial platform for Galaxia Communications with a stablecoin (XTL) pegged to the value of 1 KRW. Moneytree is an electronic payment and mobile financial platform used by more than 1,500,000 users.

Reaching one million META wallets is without a doubt a big accomplishment, and it couldn’t have been achieved without our team and, of course, the community.

We will continue working hard to ensure Metadium-based services reach more people in the future.

Thank you,

Metadium Team

Metadium celebrates reaching more than one million META wallets was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

What Being a Musician Taught Me About Being a Programmer


I’ve been a musician most of my life. I was in bands when I was in junior high up until I joined the Army at 19. I started writing software a few years after getting out of the military and kinda put music to the side. Recently, I’ve started to play in bands again, and something struck me about the dynamics of playing with other musicians and how that prepared me for writing software on a team. I’ve been writing software professionally now for over twenty years. There are many parallels, so I thought I would share some of those lessons hoping that it might help other programmers work more effectively in a team. Writing software, after all, is a team sport.

Practice Your Craft

As a musician, you should practice your parts for the songs you’re playing with the band and also practice getting better on your instrument in general. The better you play your part, the better the team will be, but never forget that your part is just one part of a larger whole. This means practicing and perfecting your part before you get to rehearsals. The more prepared you are when you get to rehearsal, the better the band will sound at rehearsal.

In software, this means continuing to learn about the languages and platforms that you program for. Take time to read that blog post, or watch that tutorial, or read that programming book. You don’t want to let it consume your whole life (work-life balance is extremely important), but you can just carve out one evening a week where you spend time improving your skills for an hour or two without becoming a workaholic.

Let Go of Mistakes

When you mess up, learn what you can from the mistake: was it just a typo? Or a more serious deficit in your knowledge? Then, let it go. It’s easy to dwell on mistakes, but that will only force you to make more of them. You don’t want to ignore a mistake completely, but learn what you can and move on. I’ve always thought that the best musicians in the world are the ones who can make a mistake, instantly learn from it, and then forget it. Just keep playing.

This goes for coworkers as well. Your band/teammates will make mistakes as well. Believe me, they know they messed up. Fix it, and move on. They probably already feel bad enough, and you might be surprised at just how far a “That’s okay, you got this” will go to creating a really strong team bond.

Celebrate Bandmates’ Successes

It’s important to celebrate your bandmates’ victories, accomplishments, and those times when they just kill it during a performance. Not only does this form a stronger team bond, but they will undoubtedly return the favor. It may seem like a small thing, but it can really make someone’s day. Everyone wants to feel like that “rock star” from time to time, and band members will remember that feeling of killing it and want to do it all the time. This will also make you want to have more of those moments too. Everyone’s performance will improve, and the band will sound fantastic. There is nothing like being on a team of high-performing players, whether you’re playing a local dive bar or building a large software system.

Stay On Good Terms

If you must part ways with a member, do your best to part on good terms. Bad feelings are too heavy to carry around, and that weight affects your performance (on and off stage). Plus, whether you live in LA or Springfield, IL, you’ll likely see those same players around your local scene, and you don’t want anyone running around with bad feelings about you. You might find yourself in a band with them later, and you don’t want things to be awkward or to not even get the gig because of one person.

On the flip side, welcome new members with excitement. Do your best to include them in band activities and make them feel like they’ve ALWAYS been part of this group. Being a new member of the team is stressful and can be isolating. You’ve probably been the new guy before too. The band will sound better and playing music will be more fun.

No Showboating

Everyone has a solo once in a while. It won’t always be you. It’s important that all of the members of the band get a chance to show off their talents. When it’s your turn, let ‘er rip! When it’s not your turn, do whatever you can to make sure that you’re playing in a way that helps to highlight just how badass your teammate is. They will do the same for you when it’s your turn to shine. This, again, creates a tighter team bond and makes playing with your team that much more enjoyable. You’re going to be spending a lot of time with these people; do your best to make it a joy to work with them.

Nobody likes a know-it-all anyway. If you’ve ever been on a team where one of the members believes they’re above the team, it can be quite demoralizing to the team and really affect the team’s performance. Don’t be that person. If you really feel like your skills are way above the team you are on (it happens), find another team. Sometimes that is a great signal that it’s time to move on. You might also find other ways to serve the team, like helping them in areas where they are deficient. This can be a little tricky since you don’t want to seem like a know-it-all, but if you see them struggling, you might just say, “Hey! Want some help?” You might even find you have a knack for teaching and find yourself a new career direction!

Not Every Song Will Be Your Favorite

I’ve played in a lot of cover bands, some originals bands as well. Not all of the songs we played were fun for me. Maybe my part was boring; maybe I just didn’t like that particular song. It’s fine. It is important to remember that you are doing this for someone else. When you’re putting together a setlist, you try to play songs that the audience will enjoy. Sometimes those are songs you don’t really like.

The same goes for software. Sometimes you might get tasked with the “grunt work.” Maybe it’s a task you don’t really like doing. It’s important in those times to remember the team’s goals. What are we trying to accomplish here? Understanding how an unenjoyable task helps the team make better software can be the key to making those tasks more bearable.

Your Tips

Over the years, I’ve met a LOT of developers. There seem to be a lot of you who are musicians. So what did I miss? Any tips along the way you’ve found that made you a better team member or made your work more enjoyable? Please share your tips below in the comments.

If you’d like to learn more about programming with Okta, check out these other blog posts:

- Reactive Java Microservices with Spring Boot and JHipster
- Build a Secure GraphQL API with MicroProfile
- Easily Consume a GraphQL API from React with Apollo

Also, don’t forget to follow us on Twitter and subscribe to our YouTube channel. We post regularly on both. Rock on!

Tuesday, 23. February 2021

Global ID

The GiD Report#148 — A Ripple bombshell? Clubhouse and the rise of digital cults

The GiD Report#148 — A Ripple bombshell? Clubhouse and the rise of digital cults

This week:

- Ripple bombshell
- Clubhouse counterpoint
- This week in regulations
- Stuff happens

1. A “bombshell” in the SEC v. Ripple case?

Revealed during the pretrial court hearing: A large exchange approached the SEC in 2019 asking if they could trade/sell XRP. Apparently, the SEC didn’t say no.

Coinbase CEO Brian Armstrong, Photo: TechCrunch

Elsewhere, Coinbase is now trading at over $100 billion — potentially making it the biggest IPO since Facebook. To the moon. Axios:

Share sale: Coinbase last month launched a secondary share sale via Nasdaq Private Markets (f.k.a. Second Market), offering up to 1.8 million shares in weekly batches.
The goal was to help Coinbase determine a reference price for its public offering, which will be done via direct listing instead of IPO. The initial batch of 75,000 shares was sold on Jan. 29 at $200 per share. That worked out to a valuation of nearly $54 billion, compared to the $8 billion valuation Coinbase received during its prior venture capital round in late 2018. The next two batches were sold at $301 and $303, respectively. The most recent batch of 127,000 shares was sold Friday at $373, which works out to a valuation of $100.23 billion.

And finally, if you haven’t already, check out our latest podcast episode featuring Solana creator Anatoly Yakovenko: EPISODE 04 — The future of blockchain with the creator of Solana

It’s a super educational listen, as Anatoly gives a bit of history on the evolution of blockchain while providing the origin story behind Solana’s novel solution — proof of history.

Also:

- Diem Stablecoin Prepares for Liftoff With Fireblocks Custody Partnership — CoinDesk
- Via /gregkidd — Why Silicon Valley Doesn’t Get Bitcoin with Dan Held — What Bitcoin Did

2. Last week, we talked about where Clubhouse fits into the evolution of our platforms in the context of this broader cycle. Here’s a counterpoint.

From The Information:

People are right to be focusing attention on Clubhouse’s meteoric rise. But almost everyone seems to misunderstand what is making the app so successful.
It isn’t, as people like Ben Thompson suggest, simply another iteration of the old story of lowering the barriers for production and consumption, shortening feedback loops and creating new “white space” for aspiring stars. If you think of it as being like Twitter, Stories or TikTok, you are missing the point.
The key to Clubhouse’s rapid accession is that its social design makes it an ideal platform for cults at a time when the social internet is rapidly evolving away from organizing around communities and toward cults. Clubhouse’s rise likely signals major strategic shifts that several legacy social platforms are going to have to consider.
The weakening of digital communities and rise of digital cults that we are seeing is inevitable. A similar pattern played out millennia ago in the physical world, as large cults and autocracies came to dominate small local communities. It is reasonable that the same cycle would repeat at warp speed in the digital space.

Also:

- Clubhouse Chats Are Breached, Raising Concerns Over Security
- Reddit removed 6% of content on its platform in 2020
- Audio platforms are thriving in the pandemic
- “Mark Changed The Rules”: How Facebook Went Easy On Alex Jones And Other Right-Wing Figures

3. This week in regulations:

Maryland becomes first state to enact tax on digital advertising (Axios):

A coalition of business and tech trade groups sued the state of Maryland Thursday over its newly enacted tax on digital ads that are shown to state residents, Ashley reports.

Also:

- Australia’s news law prods Google, Facebook down opposite paths
- The State House Versus Big Tech
- House to grill Facebook, Google, Twitter CEOs as Washington seeks to crack down on disinformation, antitrust
- States Push Internet Privacy Rules in Lieu of Federal Standards
- The Bizarre Reaction To Facebook’s Decision To Get Out Of The News Business In Australia

4. Stuff happens:

- Via /markonovak — ‘LinkedIn For Gamers’ Secures $6 Million Investment From NFL And NBA Stars
- Via /m — Vaccinated Travelers May Soon Be Able to Visit Hawaii With a Digital Vaccine Passport
- Via /gregkidd — Zolve raises $15 million for its cross-border neobank aimed at global citizens — TechCrunch
- JPM addresses tether risk
- TAT targets holders of cryptocurrencies
- Podcast: The New York Times’ Nicole Perlroth on the cyber-weapons arms race
- Lol: Dogecoin Brings the Cryptocurrency Craze to Its Logical Conclusion
- Via /m — Microsoft’s New Gig: A LinkedIn Freelancer Market Rivaling Upwork, Fiverr

The GiD Report#148 — A Ripple bombshell? Clubhouse and the rise of digital cults was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

Coinfirm and Stelios Americanos & Co LLC RegTech Alliance

23rd February, LONDON, UK – Coinfirm, the industry-leading RegTech provider, and Stelios Americanos & Co. LLC, a highly innovative Cypriot law firm, today announce their joint cooperation. “Coinfirm has always put great importance in ensuring the efficiency in speed and cost that blockchain brings – as well as the opportunity to reinvent entire market structures...

KuppingerCole

Cysiv SOCaaS


by Warwick Ashford

Cysiv SOCaaS is a Security Operations Center (SOC) as a Service offering based on Cysiv's cloud-native, co-managed platform that is highly scalable, supports multi-tenancy, and ingests telemetry from any source that generates logs.


Indicio

Hyperledger Aries sees new coordinated effort to produce complete mobile agent


New community effort, Aries Bifold, targets an open source Aries mobile agent interoperable with other Aries projects.

Hyperledger Aries is infrastructure for blockchain-rooted, peer-to-peer interactions. It includes shared cryptographic storage for blockchain clients as well as a communications protocol for off-ledger interactions between those clients. And while mobile agents are a vital piece of the community ecosystem, helping implement verifiable credential exchange between mobile devices, the community does not yet host an active effort targeting mobile agents.

Indicio is happy to announce an initial code contribution, and ongoing development and support, for this community effort to produce a production-quality mobile agent that implements common protocols within the Aries community. We look forward to collaborating with others in this effort.

Part of the goal of this open community effort is to ensure Aries Bifold’s interoperability with other Aries projects, so that protocols are built to Hyperledger Aries and Indy standards.

The initial community meeting will occur on Wednesday, February 24, 12:00 PM MST (11:00 am PST, 2:00 pm EST, 7:00 pm UTC). Meeting information can be found here.

We welcome participation from the multiple organizations, contributors, and maintainers within the Aries community that have expressed interest in this effort, and from anyone interested in collaborating with the Aries community to create open source mobile agents.

All are welcome! Come join us!

The post Hyperledger Aries sees new coordinated effort to produce complete mobile agent appeared first on Indicio Tech.


Tokeny Solutions

The Era of Digital Asset Marketplaces

Ebook download: The Era of Digital Asset Marketplaces

In many major jurisdictions, blockchain technology is now recognised as a suitable infrastructure for the issuance and transfer of financial securities. There is now a huge opportunity for financial institutions to launch their digital asset marketplaces (DAMs). These players can utilise a network that brings solutions to long-standing efficiency and liquidity problems.

Download the ebook to discover:

- What is a Digital Asset Marketplace (DAM)?
- Potential revenue models
- Functional components
- Methods by which DAMs can leverage DeFi ecosystems

The post The Era of Digital Asset Marketplaces appeared first on Tokeny Solutions.


Elliptic

Crypto Regulatory Affairs: OFAC Doubles Down on Crypto Sanctions Enforcement


The US Treasury's Office of Foreign Assets Control (OFAC) has issued a major warning to the crypto industry. 


Fission

Fission Demo Day February 2021


We meet on the third Thursday of the month for “Demo Days”, where the Fission team talks about new releases and product updates, early demos of in-progress work, and planned roadmap items, followed by an open Q&A. We also invite the community to come and give app demos. This week we invited Jeremy Ruston of the TiddlyWiki project to talk about his product and our plans to integrate Fission’s Webnative SDK with TiddlyWiki.

Meet the Team

Our distributed team continues to grow with team members spanning from our HQ in Vancouver across three continents. Our newest addition is Courtney in 🇺🇸 North Carolina, United States.

Courtney (@Courtney) joins us as our Marketing Lead. This was her first week with Fission and her first demo day. Look for more blog posts and email updates from Courtney about what's going on with Fission.

Fission and Headless Ghost

Fission teammate Helder (@agentofuser) shared his work with Headless Ghost and Fission. He showed us how to use the Ghost CMS as a familiar editing interface for content producers and then pull the content into a Next.js website via its API. The content is then compiled into a static website using React and deployed to the Fission platform. Check out the video to hear the details of this process.

Filecoin Backup App

Next, we heard from Ben (@benjaminbollen) and Brian (@bgins) about their work with Filecoin. Filecoin is a blockchain specifically used for storing files online without relying on a third-party platform or company. Because Fission is also interested in working in browsers very securely, we decided to work with the Filecoin team on this project. Hear from Ben and Brian about their progress in the Fission Demo Day video.

IPFS Cluster Updates

Daniel (@dholms) talked to us about his updates to the IPFS Cluster. Fission is built on the InterPlanetary File System (IPFS), a global peer-to-peer network. Fission builds encryption on top of the IPFS stack, running IPFS servers to make sure that your data stays online and accessible directly from the browser, all with your security in mind. Daniel shared some recent backend infrastructure work he’s doing to improve our system performance.
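For readers unfamiliar with IPFS, the core idea is content addressing: files are fetched by the hash of their bytes rather than by server location. A minimal sketch against a local node’s HTTP API (it assumes an IPFS daemon such as go-ipfs is running on the default port 5001):

```python
# Minimal sketch of IPFS content addressing via a local node's HTTP API.
# Assumes an IPFS daemon (e.g. go-ipfs) is running locally on port 5001.
import requests

# Add a file: the node returns a CID (content identifier) derived from the bytes.
resp = requests.post(
    "http://127.0.0.1:5001/api/v0/add",
    files={"file": ("hello.txt", b"Hello from the peer-to-peer web!")},
)
cid = resp.json()["Hash"]
print("Stored with CID:", cid)

# Read it back by content, not by location: any node holding the CID can serve it.
data = requests.post(f"http://127.0.0.1:5001/api/v0/cat?arg={cid}").content
assert data == b"Hello from the peer-to-peer web!"
```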

Tiddlywiki and Fission

Jeremy Ruston (@jermolene) is the creator of TiddlyWiki, a serverless web app that allows users to create a non-linear notebook for recording, organizing, and sharing complex information. We’re collaborating to build an app that will allow users to use TiddlyWiki on the Fission platform. You can hear from Jeremy directly in the Demo Day video. We’ve also been working out in the open in the TiddlyWiki group on the Fission forum, so hop over there to see what’s going on.

Video

Join us next time

We're experimenting with Luma for registering for events. Click the button below to sign up for the Fission March 2021 Demo Day, or visit the event page »

Register for Event

Caribou Digital

A platform of their own: Our experience running a participatory video storytelling project


This is a guest blog.

In October 2020, Caribou Digital engaged my production company, Nairobi-based Story x Design, to produce a participatory video storytelling project that put 11 people who earn their livelihood from digital platforms at the centre of their own story. We are passionate about innovative storytelling methods and were able to apply our experience in user-generated content and production skills training to design and implement the project.

This project was a continuation of Caribou Digital’s ongoing work on platform livelihoods. At the onset of COVID-19, Caribou Digital initiated a series of conversations with platform workers and sellers in Ghana, Kenya, Nigeria, and Uganda about how the pandemic had impacted their lives. This storytelling project brought to life the stories of 11 workers through video diaries.

My company provided the workers with training on videography skills, equipped them with decent camera phones, and provided ongoing narrative mentorship to empower the participants to tell their stories over a period of two months. Through a series of short self-shot videos, each participant shared their real-life experiences as digital labourers during the pandemic.

In this blog, we share our methodology, including successes, challenges, and tips for those who are interested in running a similar project.

Methodology

During the project design phase, we discussed how the participants would submit their footage; we decided that issuing a smartphone would be our best chance for success. It would also be an incentive to participate in the project. The 12 participants were sent a Samsung A21s smartphone, chosen for the quality of its camera and for its price. We locked the phone at the start of the project using Prey, software that can remotely control and secure the phone.

Upon successful completion of the project, each phone was unlocked and ownership was transferred to the participant. A small octopus tripod, a Bluetooth selfie stick, a lapel microphone, and a selfie light were also distributed to each participant. This small production kit was designed to increase the production value of the video recordings and aid skills development.

For their contribution to the project, each participant received three payments tied to the delivery of a video for each chapter. The payments were a recognition of the time invested by the participants in the project while doing their platform work. In addition to their smartphone kit, they also received internet data bundles for uploads.

It was important in the project design phase to ensure the project was designed in an ethical and respectful way. We clearly outlined the terms of the engagement through a formal agreement form and gathered the informed consent of each participant for the usage of their image and likeness. This process also allowed the participants an opportunity to voice concerns about how their story would be edited and disseminated. A few participants told us they did not want to film their families and others drew boundaries in terms of time commitment. Knowing this upfront helped us work together with respect and understanding.

Through an agile approach, we engaged the participants in three ways: via video skills development, one-on-one narrative mentoring, and through a participatory, post-production process.

Video Skills Training — Using WhatsApp, we ran a series of one-on-one training sessions on how to record video using their phone. The participants were asked to submit video samples before the training that were used as examples to identify common mistakes. Afterwards, they were taught how to stabilize the camera, record clean audio, and capture shots to build a scene. We created a series of training videos and supporting graphics that were shared on WhatsApp, and then re-shared when a participant needed a specific reminder to, for example, turn their radio off while recording a video diary.

Narrative Mentorship — To support the storytelling outcomes, our mentor Abu “Sensei” Majid worked remotely with each participant to help them identify and express stories important to their lives. Through WhatsApp messaging and Google Duo video sessions, Abu spoke to the participants weekly for two months. This organic dialogue built trust and confidence while empowering participants to self-direct their own story. Guided by Dan Harmon’s popular eight-step story circle structure, Abu and I crafted a series of standard narrative prompts for the participants. As we received material back, we adapted the narrative model to the nuances of each person’s evolving story.

Participatory Post-Production — The participants shared their video diaries and original video footage primarily through Google Drive, WeTransfer, and Send Anywhere. The upload to cloud platforms was often stalled by unreliable internet access. Our post-production team downloaded the files to review and edit the material, sharing previews and follow-up questions with each participant in real-time. This structure helped the participants maintain ownership of their story without the need for high-level technical skills. Our team found Google Drive challenging as a cloud platform for video editing because it lacked file prioritization functionality and slowed down the back-and-forth feedback loop.

Successes

Overall, the project succeeded in capturing a variety of otherwise underrepresented voices in a unique and timely way. We produced and published 23 video chapters over the course of two months. That’s 92 minutes of content created remotely by a small production team and first-time self-shooters in quite a short period of time. Some of the specific successes include:

One-on-one Mentoring — The greatest success of the project design was the inclusion of the role of Story Mentor from the very beginning. Abu was the only touchpoint for all the participants, which reduced confusion and streamlined communication. Abu delivered one-on-one skills training using WhatsApp audio and video, the Google Duo app, and occasionally, traditional network carrier phone calls during internet downtime. In these calls, he provided narrative guidance and contributed to the project management, ensuring the success of the project.

Empowered Voices — Eleven of the 12 participants successfully delivered three video chapters, showing a strong level of engagement as self-driven storytellers. Some participants were quick to engage and able to share their story readily. Others were slower to unfold. But as the participants gained technical skills and saw we were treating their story with respect, confidence grew. By the third chapter, Mary Ikigu, 24, politely declined our prompts and self-directed her entire shoot, a significant milestone in our eyes.

Intimate Stories Captured — The stories captured went deeper than conventional reportage or research would permit. Video highlights include a COVID-19-related funeral, a wedding, and genuine, real-time emotion that helps audiences engage more deeply with the participants’ lived experience. For example, Stanley Shiafu, 34, was an on-demand cleaner with Lynk Kenya who lost his work during the COVID-19 pandemic. Watch his three chapters here.

Challenges and how they were overcome

We knew the project was ambitious. The 12 participants live in four countries and two time zones. By the nature of their work, they are busy people with mouths to feed and lives to live. This storytelling project, though compensated, was always going to be a big ask. But we learned a lot and, by all accounts, so did the participants. Some of the challenges include the following:

B-roll — B-roll is footage that is laid over interviews or video diaries. It illustrates the story being told. For example, Dorcas Mutheu, 41, runs a cake business, so visuals of her baking, decorating, and boxing her product were important to show her work. We trained the participants in basic video conventions such as the different purposes of establishing and close-up shots. Also, because we wanted the project to be as collaborative as possible, we resisted being prescriptive about what B-roll participants should be filming. However, it became clear that recording video diaries was fairly easy for most to do and that filming B-roll of their lives was more challenging. Through continual briefings, resending our video instructions, and eventually, sending specific shot lists, we were able to co-create each story in a visually engaging way.

Deadlines and Incentives — Between the ambitious two-month timeline and the demands of their lives, the engagement of the participants varied over time. We found it was difficult for most of them to upload enough content to meet the tight edit turnaround. We decided we needed to incentivize them to meet their deadline by offering a small cash transfer via mobile money in addition to their agreed instalment fee. This worked really well and about two-thirds of the participants uploaded on time.

Data Uploads — The upload of video files was challenging for all participants, particularly those in rural areas where internet speeds were slow. Stanley, for example, had to commute to a nearby town to access good upload speeds. Okoli Edwin Chimereze, 20, an e-commerce entrepreneur, had to go to cyber cafés in Lagos for faster internet. When it became clear the upload was slowing down the Google Drive production workflow and frustrating the participants, we distributed additional data bundles, some of which needed to be reloaded for each chapter. We also encouraged the use of alternative file transfer websites like WeTransfer and Send Anywhere. However, this meant organizing the files manually, which consumed a lot of time.

Language Barrier — It became evident that a lot of communication was lost in translation with one participant. She was getting frustrated and unmotivated, so we translated her prompt questions into her mother tongue and hired a translator for one-on-one calls. This made for easier, more efficient one-on-one training sessions. Her subsequent video submissions were of greater depth and quality.

Insecurity — Showing where you live and work was an important part of gathering B-roll. However, this was not always possible or safe. One participant was treated with aggression while filming on the streets of Kampala during a tense political climate. We worked with her to find creative ways to film while reducing her exposure. Also, two of our participants were mugged during the course of the project and had their phones stolen. While we had installed lock-out software on the phones in case of such a scenario, we were only briefly able to trace the locations of the two stolen phones. Fortunately, we were able to retrieve the data remotely. We had a contingency budget, so we re-issued the phones to keep the two participants on track.

Tips for others considering using this technique

This innovative participatory approach is exciting for the participants and engaging for audiences. When used in combination with more traditional research or reportage methods, it has a great deal of potential to unlock surprising insights. However, there are a few things to consider before embarking on your project design.

Set clear expectations. We drew up a clear project agreement and informed consent document, which each participant signed, scanned and sent back to us. However, we didn’t make clear how much time we would expect people to put into the project. The time it takes to produce a video, particularly the B-roll overlay footage, often surprises newcomers to video. It would have been helpful to communicate a conservative estimate of the hours the project would require. In hindsight, we estimate each chapter took the participants approximately eight hours to plan, film, and upload. Adding training and communication time, the average time spent on the project was about 30 hours per participant. For most, the remuneration was fair. However, it still would have been better to estimate and communicate how much time the project would actually require.

Try to be realistic. Participatory projects like this have a huge payoff in terms of originality and access to the stories that are shared. However, they also take a lot more concerted effort and time. Our two-month timeline was probably too ambitious for the amount of content we co-created. We could have done with a third more time, a third fewer participants, a third more staff, or a third fewer videos. If you’re looking to undertake a similar project, set your goals high and then add a third more resources!

Stay agile. No, really, stay agile. Flexibility and improvisation are central to a participatory approach. The project design was premised on producing and publishing three discrete video chapters per person. For some people, their story naturally unfolded over time and the three-chapter format made sense to track their journey. For example, Peter Maraizu Ogbonna, 35, a taxi driver in Lagos, captured the construction of his new poultry farm side-hustle over the course of two months. For others, such as Onyinyechukwu Anastestia Onyekaba, 24, a software developer also in Lagos, the story was more compellingly told in just one compressed chapter. The decision to vary the number of chapters edited per participant was made in the last days of the project, triggering a furore of re-editing but ultimately making the content more engaging for audiences across a variety of platforms. Also, one participant found the challenge of recording B-roll and in-depth video diaries too difficult to manage with his end-of-year work commitments, so we collectively decided to end his involvement in the project without any video output. While disappointing, we had anticipated there would be a level of attrition and were pleased 11 of the 12 participants successfully engaged with the demands of the project.

Conclusion

This participatory process has led me to feel deeply connected to 12 people I’ve never met. In fact, because Abu was the only person to speak directly to each person, I’ve never even spoken to them. After a hard year for most people around the world, and a particularly hard year for my production business and the creatives I work with, this project was a deeply healing journey.

Without even realizing the impact they would have, these 12 participants took on a huge new challenge. They learned new skills. They bravely bared their souls. They said “Yes” at a time when the world was consumed by anxiety. I personally think this is very cool. Not only do they offer important insights into the reality of work on digital platforms, but by sharing their stories, they also validate the struggles of millions like them, surviving and thriving even during the COVID-19 pandemic.

Watch all the videos here

A platform of their own: Our experience running a participatory video storytelling project was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Urbit

Eliza



The Storm before the Calm

Building things, even Calm™ things, makes noise. George Nakashima’s wooden furniture evokes peace, but its construction still begins with a chainsaw. We mention this because we’re firing up our newest power tool.

We call her Eliza. She is a data-collection robot.

Hang on. An AI? Harvesting my data? Spying on me? Flooding me with spam? This is Urbit, right? Let us tell you about our new friend in basic terms, where MEGACORP might otherwise deploy legalese.

What Eliza is and isn’t

Replicants are like any other machine - they're either a benefit or a hazard. If they're a benefit, it's not my problem.

Rick Deckard

Let's start with the benefits. Eliza is a simple chatbot. She was made to collect feedback on Landscape. She asks direct questions about peoples' usage of Urbit, records any responses, and reports them to her managers at Tlon. She'll also provide timely advice and information to ships kind enough to chat with her. She'll begin reaching out to ships via DM in the coming month. If you don't want to, just say no, and you'll never hear from her again.

Finding bugs is critical to Urbit's rapid development, especially in this period of nascency. Most Urbit-related frustrations go unreported; regular folks don't think to open up Github when they experience a minor issue. That's why it's incumbent upon us to design a lower-friction forum for feedback. People who aren't comfortable filing an issue might be willing to talk to a friendly chatbot.

Now, about hazards. By design, there are few. Eliza runs on Tlon's comet, not on your planet. There will be no OTA to install %eliza-bot-store on everyone's ships. Eliza learns things through conversation, not hidden surveillance software. Data she requests, like the text output of our network analytic tool, +tally, must be sent manually. MEGACORP tends to cut this corner; to refine their products, they just track your reactions in real-time.

Since she's open-source, anyone can creatively deploy Eliza. Just write her a dialogue tree. Of course, that also means we can't keep people from misusing Urbit bots. We think automated agents on the Urbit network are inevitable, so let's create a good baseline culture around them. (Much love to Tendiebot, who's serving up stock prices in ~tomdys/wall-star-bets.) However, once Urbit is used world-wide, we can't rely on culture alone.

A New Internet Economics

The sum of Urbit's architectural design is a new online economics: a way to finance and monetize access to internet infrastructure where people retain maximum equity. Explicitly acquire a low-maintenance, low-cost server. Use it to compute explicit things. Return to life.

We all know the basic MEGACORP business model: offer a free service (to suckers who don't own a server), secretly mine the data they generate on the (addictive) platform, then sell targeted ads to advertisers. This model will not survive Urbit. The ubiquitous personal server changes the economic equation. You used to 'pay' for an experience with your time and data. On Urbit, once you purchase core internet infrastructure, it's yours.

So, can Urbit meaningfully curtail our world's appetite for data? We'll see. Post-enlightenment society runs on instrumental reason, right down to the last byte. Marketers will always want a fresh dataset. However, when the goods are on your Urbit, you can at least demand the respect you deserve. Long term, we expect a transparent market for data, not an opaque "agreement". An analytics company wants to track everything you see and do on Urbit? They better be paying well.

Eliza doesn't pay respondents. Her collection is a charity to Urbit's developers. (We're also still busy enabling payments on Urbit with the Bitcoin wallet.) However, her presence is not a term of service either. In a few years, by the nth time some smart aleck uses Eliza's chatbot code and conjures 'Zalexa' to attempt an intrusion upon your Calm™, we will see the economics of Urbit's architecture suddenly vivify. Blacklists, reputation systems, and, eventually, even data markets, all Urbit primitives envisioned long ago, will become real. We won't need Zoogle to underwrite a spam filter for the entire internet; the scarcity of Urbit address space, in tandem with a true web-of-trust, handles that.

If Urbit is to be used worldwide, the old system will not go down without a fight. Spammers, spies, and digital robber-barons will hunt for ways to game Urbit. Our passage out of the digital gutter is beginning in earnest, and we are thrilled to put our new architectures to the test.

Monday, 22. February 2021

Global ID

EPISODE 04 — The future of blockchain with the creator of Solana


Solana creator Anatoly Yakovenko gives a brief history of the evolution of blockchain, explains how Solana has the scalability and throughput to address the entire world, and tells the story of how he met GlobaliD co-founder and CEO Greg Kidd.

Past episodes:

EPISODE 03 — Should we trust Facebook?
EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP
EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security

Have a question for Greg? A topic you’d like covered? A guest you’d like to see? Let us know!

GlobaliD on Twitter
Greg on Twitter
Anatoly on Twitter

EPISODE 04 — The future of blockchain with the creator of Solana was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Authenteq

Introducing the Identity Masters Podcast

The post Introducing the Identity Masters Podcast appeared first on Authenteq.

Tradle

ABAC Digital Symposium 2021


Digitizing Trade and Supply Chain Finance 9–11 March

Gene Vayngrib, Tradle’s CEO, is participating in this event: not only is he driving change in digital compliance, but he also has a long history of pioneering change in distributed Supply Chain Management.

Some topics covered will include:

New technologies and digital initiatives to enable a simple, secure and faster way of trading
How these new technologies and digital initiatives keep trade flowing in the midst of COVID-19 and thereafter in the post-pandemic phase
Conditions for the effective deployment of these new technologies and digital initiatives across APEC economies

Here is the agenda for the event

You can sign up here

ABAC Digital Symposium 2021 was originally published in Tradle on Medium, where people are continuing the conversation by highlighting and responding to this story.


Peer Ledger

Is Blockchain the Key to Sustainability?


The word ‘blockchain’ has become synonymous with cryptocurrency, but the technology has applications far beyond finance. There is growing interest in using blockchain technology to solve global sustainability problems.

Blockchain is defined as “a distributed database that maintains a constantly growing list of transaction records, referred to as blocks, and in which every block contains a link to the previous block. More specifically, it is an open and distributed ledger that captures transaction data between two parties in a permanent and verifiable way.”
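To make the “chain of blocks” part of that definition concrete, here is a minimal sketch in TypeScript (our illustration only; the field names and the use of SHA-256 are assumptions, not any particular blockchain’s format):

import { createHash } from 'crypto';

// A block links to its predecessor by embedding the predecessor's hash.
interface Block {
  index: number;
  timestamp: number;
  transactions: string[]; // simplified transaction records
  previousHash: string;   // the link to the previous block
  hash: string;           // hash of this block's contents
}

function hashBlock(b: Omit<Block, 'hash'>): string {
  return createHash('sha256')
    .update(`${b.index}${b.timestamp}${b.transactions.join('|')}${b.previousHash}`)
    .digest('hex');
}

function appendBlock(chain: Block[], transactions: string[]): Block {
  const prev = chain[chain.length - 1];
  const draft = {
    index: prev ? prev.index + 1 : 0,
    timestamp: Date.now(),
    transactions,
    previousHash: prev ? prev.hash : '0'.repeat(64),
  };
  const block = { ...draft, hash: hashBlock(draft) };
  chain.push(block);
  return block;
}

// Tampering with any earlier block changes its hash and breaks every later
// link, which is what makes the ledger "permanent and verifiable".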

Blockchain is being used to support social initiatives like responsible sourcing and supply chain management by providing improved visibility and early detection of unethical producers and products. The numerous ways blockchain can support the United Nations Sustainable Development Goals are highlighted in the graphic below.

Image source: IntechOpen

The World Economic Forum identified three key ways that blockchain can support sustainable development:

Building resilient and transparent supply chains
Creating stronger and more accountable public institutions
Spurring responsible sourcing and consumption, which is at the forefront of the environmental movement

The UN Global Compact and BSR recently launched a global guide to traceability for sustainability, underscoring the importance of this issue and the need for immediate action. As pressure mounts for companies and industries to ensure the integrity and sustainability of their products, technologies like blockchain will play an important role.

Peer Ledger’s MIMOSI Connect is a game changer because it gives companies a trusted, immutable record of transactions, documents, and metrics across their entire supply chain to support responsible supply chain management and due diligence. We believe that responsible supply chains are the future and we strive to make traceability available to companies of all sizes.

Responsible supply chains make stronger brands.
References

Rick Leblanc (January 2020), How Blockchain Will Transform Supply Chain Sustainability, The Balance Small Business. Retrieved from https://www.thebalancesmb.com/blockchain-and-supply-chain-sustainability-4129740

Paula Fraga-Lamas and Tiago M. Fernández-Caramés (May 2020), Leveraging Blockchain for Sustainability and Open Innovation: A Cyber-Resilient Approach toward EU Green Deal and UN Sustainable Development Goals, IntechOpen. DOI: 10.5772/intechopen.92371

Sumedha Deshmukh (September 2020), 3 ways blockchain can accelerate sustainable development, World Economic Forum. Retrieved from https://www.weforum.org/agenda/2020/09/3-ways-blockchain-can-contribute-to-sustainable-development/

Norton et al. (April 2014), A Guide to Traceability: A Practical Approach to Advance Sustainability in Global Supply Chains, BSR. Retrieved from https://www.bsr.org/en/our-insights/report-view/a-guide-to-traceability-a-practical-approach-to-advance-sustainability-in-g

BLOK

5 reasons why an ethical digital health pass would be better


Government’s paper vaccine certificates would create a catalogue of problems.

The UK Government’s plans to require holidaymakers to ask their GP for a paper vaccine certificate would encourage fraud, create unnecessary work for GPs and patients and would be unlikely to be accepted as attested proof by many more forward-thinking nations, UK experts have warned.

Chris Justice, President of BLOK BioScience, has urged the Government to join the global Good Health Pass Collaborative¹ of health, travel, technology and government organisations looking to create a blueprint for universally accepted digital health credentials that are privacy-protecting, user-controlled and globally interoperable.

Warning that the Government’s preference for paper vaccine certificates risks creating a range of problems for the public, healthcare professionals and law enforcement, while also reducing the UK’s standing on the global stage, Chris gives five reasons why an ethical, privacy-focused digital health pass based on Self-Sovereign ID technology should be introduced.

Our personal data is more private and secure

Right now people don’t know where their Covid-19 health data is being stored or how it could be used in the future. This creates risks: NHS databases could be compromised², paper certificates are easy to lose¹, and people may have concerns over their data being accessed by government or health authorities³.

With a digital health pass built around a Self Sovereign ID, individuals’ test, vaccine and infection data sits exclusively on their device and the raw data is never shared with anyone else. The only communication it needs to give is attested proof that the individual meets the entry criteria of the country or vendor.
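As a rough illustration of that “attested proof, not raw data” pattern, here is a minimal TypeScript sketch (our own simplification with invented names; not BLOK’s implementation, which would use verifiable credentials rather than the bare signature shown here):

// The raw health record never leaves the holder's device.
interface HealthRecord {
  vaccineDoses: number;
  lastNegativeTestDate: string; // ISO date, e.g. '2021-02-20'
}

// What a verifier receives: a signed yes/no claim, not the record itself.
interface EntryProof {
  holderDid: string;
  predicate: 'meets-entry-criteria';
  satisfied: boolean;
  signature: string; // in a real SSI system, a verifiable presentation
                     // derived from issuer-signed credentials
}

function daysSince(isoDate: string): number {
  return (Date.now() - new Date(isoDate).getTime()) / 86_400_000;
}

function buildEntryProof(
  record: HealthRecord,
  holderDid: string,
  sign: (payload: string) => string, // device-held signing function (assumed)
): EntryProof {
  // The entry criteria are evaluated locally, on-device.
  const satisfied =
    record.vaccineDoses >= 2 || daysSince(record.lastNegativeTestDate) <= 3;
  const claim = { holderDid, predicate: 'meets-entry-criteria' as const, satisfied };
  return { ...claim, signature: sign(JSON.stringify(claim)) };
}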

There’s no risk of counterfeit vaccine passports

There have already been several reported cases of counterfeit test results being presented¹ ⁴ and that is a huge risk when it comes to managing infections being carried from country to country – including into the UK.

While a paper certificate could be easily counterfeited, a digital health pass can combine attested identity and health credentials and allow an individual to prove that they meet a country’s entry criteria and are who they say they are. It is likely that an easy-to-counterfeit paper certificate will not be sufficient to access many countries.

They can be ethical and inclusive, if designed appropriately

The UK vaccine minister has said that vaccine passports are “discriminatory”⁵ but that really doesn’t have to be the case.

We firmly believe that vaccine passports should be designed in a digital-first form for maximum privacy, security and usability but that doesn’t mean that people without access to a smartphone would be excluded. Inclusivity models such as guardianship or providing secure access through a printed QR code just need to be built in – our own BLOK Pass is ID2020 certified⁶ for meeting 41 outcomes-based technical and ethical standards.

They would reduce bureaucracy and save time and money

It has been suggested that patients may need to book a GP appointment and pay £30 for a vaccination certificate⁷. Why should they?

This would be a costly inconvenience for both GPs and patients, and the British Medical Association has already said that patients should be given easy online access to their records for this reason⁷.

If people have access to their personal vaccination and testing data and they can use it for personal benefit then they are more likely to opt into vaccination programmes, which has public health advantages.

Anonymous insights could be a key tool in the battle against Covid

Imagine a world where experts can access completely anonymous Covid-19 data in real-time as part of a coordinated national programme that allows us to quickly respond to new variants, regional outbreaks and vaccine side-effects – without ever compromising people’s privacy.

This level of insight could be a game-changer in the battle against the virus, but we would only ever be able to secure the level of adoption needed if we address people’s concerns about privacy and security, and make it attractive to sign up for.

References

¹ Good Health Pass Collaborative: A safe path to global reopening

² The National Cyber Security Centre Annual Review 2020 shows that there were 51,000 indicators of compromise of NHS IP addresses

³ Simon N Williams Phd, Christopher J Armitage PhD, Tova Tampe PhD, Kimberley Dienes PhD. Public attitudes towards Covid-19 contact tracing apps: A UK-based focus group study. Health Expectations journal. January 2021. The most commonly stated concern was over data privacy and security, with participants expressing “reluctance to have their data accessed by Government or health authorities”

⁴ Europol – the European Union’s law enforcement agency – warned that there have already been several cases of fraudulent Covid-19 certificates being sold to travellers and said that technology advances mean that these can increasingly be produced to a high standard. February 2021

⁵ Nadhim Zahawi speaking on the BBC’s Andrew Marr show and reported on BBC News Online on 7th February 2021

⁶ ID2020 press release. ID2020 announces certification of BLOK Pass from BLOK BioScience. August 2020

⁷ Holidaymakers could face £30 charge from GPs for written proof of Covid-19 vaccine before travelling. Daily Mail article. 7th February 2021

 

The post 5 reasons why an ethical digital health pass would be better appeared first on BLOK BioScience.


Coinfirm

InstaSwap & Coinfirm Team Up to Strengthen Crypto Transparency & Compliance

22nd February, LONDON, UK  – InstaSwap, the world’s most convenient way to swap crypto assets, today announces a partnership with leading RegTech and blockchain analytics provider Coinfirm to leverage the AML Platform to further fortify the transparency of the crypto ecosystem and the mass adoption of blockchain. By deploying advanced AML/CFT analytics to combat money...

Okta

Build Your First NestJS Application


NestJs is a popular Node.js framework that is built with TypeScript and makes use of object-oriented programming, functional programming, and functional reactive programming. NestJs boasts that it provides a framework for building scalable server-side applications.

NestJs integrates nicely with Okta’s single sign-on provider. Okta makes securing a web service, such as the one you will build, quick and easy.

Create Your Okta Application with the CLI

The first thing you will need to do is sign up for an Okta developer account if you don’t already have one. This account is free forever and allows you to start building applications secured with Okta immediately. To do this you will use the Okta CLI. If you haven’t worked with the CLI yet you can check out the full documentation here.

Start by opening the terminal application of your choice. In the terminal, use the command okta register. This command will ask you for your name and email and then create an Okta developer account for you. If you already have an Okta developer account, you can use the okta login command and follow the on-screen instructions to log in to your organization. Once the CLI is set up for your organization, run the command okta apps create and give the application a meaningful name. I named mine library-api but you can name yours whatever you like. Select Single Page App as the application type. Even though you are creating an API in this tutorial, you will be using Postman as the “application” that will consume the API. This can be even more beneficial if you actually create a single-page application later. You’ll be able to use this client ID and issuer in that application.

Since you are going to use Postman to test, enter https://oauth.pstmn.io/v1/callback as your Login Redirect URL. Set the Logout Redirect URI to https://oauth.pstmn.io/. If your organization has multiple authorization servers, you can use default for this tutorial. After the CLI runs, it will return your Issuer and Client ID. Make note of these as you will need them in your application.

Create Your NestJs Application

Next, you can create your NestJs application. If you haven’t installed the NestJs CLI yet, you should do that before beginning. Run the command npm i -g @nestjs/cli@7.5.4. Once that completes, run nest new library-api to create a new application called library-api.

Next, you need to install some dependencies you will need. First, you will want to install dotenv for your environment variables.

npm i dotenv@8.2.0

Next, you will need to install the passport and Okta libraries that will help set up your auth module.

npm i passport@0.4.1
npm i passport-http-bearer@1.0.1
npm i @nestjs/passport@7.1.5
npm i @okta/jwt-verifier@2.0.0

Now you can create a new file in your root directory called .env. Add the following values to it.

OKTA_CLIENTID={yourClientId}
OKTA_ISSUER={yourOktaIssuer}
OKTA_AUDIENCE=api://default

Now you can begin the work of setting up your application. Nest uses a few concepts that are important to discuss at this point. The first is controllers. Controllers are responsible for handling requests from the client.

Next is providers. The name “providers” is a bit of a catch-all term for services, repositories, factories, and many other types of classes. Providers can be injected into other classes using the @Injectable() decorator.

Finally, there are modules. The role of the module is to organize the application structure. Modules are decorated with the @Module() decorator.

With those concepts in mind, you can begin to create your auth module. First, create a new folder in your src folder called auth. Add a file to this folder called http.strategy.ts and add the following code.

import { HttpException, Injectable } from '@nestjs/common';
import { PassportStrategy } from '@nestjs/passport';
import { Strategy } from 'passport-http-bearer';
import { AuthService } from './auth.service';

@Injectable()
export class HttpStrategy extends PassportStrategy(Strategy) {
  constructor(private readonly authService: AuthService) {
    super();
  }

  // Called by Passport for each request carrying a bearer token.
  async validate(
    token: string,
    done: (error: HttpException, value: boolean | string) => any,
  ) {
    try {
      return await this.authService.validateToken(token);
    } catch (error) {
      done(error, 'The token is not valid');
    }
  }
}

This provider will look for an injected AuthService provider to do the heavy authentication lifting. You can define the AuthService logic in a new file called auth.service.ts in the same folder.

import { Injectable } from '@nestjs/common';
import * as OktaJwtVerifier from '@okta/jwt-verifier';
import { ConfigService } from '../config/config.service';

@Injectable()
export class AuthService {
  private oktaVerifier: any;
  private audience: string;

  constructor(private readonly config: ConfigService) {
    this.oktaVerifier = new OktaJwtVerifier({
      issuer: config.get('OKTA_ISSUER'),
      clientId: config.get('OKTA_CLIENTID'),
    });
    this.audience = config.get('OKTA_AUDIENCE');
  }

  // Verifies the access token's signature, expiry, and audience.
  async validateToken(token: string): Promise<any> {
    const jwt = await this.oktaVerifier.verifyAccessToken(token, this.audience);
    return jwt;
  }
}

This implementation of the AuthService uses the oktaVerifier to verify an incoming token.

Finally, you can tie these two providers together in one module by adding a file called auth.module.ts.

import { Module } from '@nestjs/common';
import { ConfigModule } from '../config/config.module';
import { AuthService } from './auth.service';
import { HttpStrategy } from './http.strategy';

@Module({
  imports: [ConfigModule],
  providers: [HttpStrategy, AuthService],
})
export class AuthModule {}

The HttpStrategy and AuthService get listed as providers in the AuthModule. You are also importing the ConfigModule but that hasn’t been implemented yet. Go ahead and do that now.

First, add a folder under src called config. Add a new file for config.service.ts. The code for that follows.

import * as dotenv from 'dotenv';
import * as fs from 'fs';

export class ConfigService {
  private readonly envConfig: { [key: string]: string };

  constructor(filePath: string) {
    this.envConfig = dotenv.parse(fs.readFileSync(filePath));
  }

  get(key: string): string {
    return this.envConfig[key];
  }
}

The implementation here uses dotenv to read the parameters from the .env file. It exposes a get function that will get the appropriate value from the file.

Next, add a file to the config folder named config.module.ts. Add the following code to it.

import { Module } from '@nestjs/common';
import { ConfigService } from './config.service';

@Module({
  providers: [
    {
      provide: ConfigService,
      useValue: new ConfigService(`${process.env.NODE_ENV || ''}.env`),
    },
  ],
  exports: [ConfigService],
})
export class ConfigModule {}

This module takes the ConfigService and packages it as an export. This makes it available for your AuthModule to use.

You can now create your controller. Open the app.service.ts and replace the existing code with the following.

import { Injectable } from '@nestjs/common';

@Injectable()
export class AppService {
  // In-memory stand-in for a real data store.
  books = [
    {
      id: 1,
      title: 'The Hobbit',
      author: 'J. R. R. Tolkien',
      status: 'Checked-in',
    },
    {
      id: 2,
      title: 'Do Androids Dream of Electric Sheep?',
      author: 'Philip K. Dick',
      status: 'Checked-out',
    },
    {
      id: 3,
      title: 'Brave New World',
      author: 'Aldous Huxley',
      status: 'Checked-out',
    },
  ];

  getAllBooks(): any[] {
    return this.books;
  }

  getBook(params: any): any {
    return this.books.filter((r) => r.id == params.id)[0];
  }

  updateBook(book: any): any {
    return book;
  }
}

This file provides some stubbed methods for getting and updating books. Next, open app.module.ts and replace the code there with the following.

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { AuthModule } from './auth/auth.module';

@Module({
  imports: [AuthModule],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

The only change here from the boilerplate code is the addition of the AuthModule as an import. You will use this in a moment.

Finally, open app.controller.ts and replace the code there with the following controller code.

import { Controller, Get, Post, Param, UseGuards, Body } from '@nestjs/common';
import { AppService } from './app.service';
import { AuthGuard } from '@nestjs/passport';

@Controller('books')
export class AppController {
  constructor(private readonly appService: AppService) {}

  @Get()
  @UseGuards(AuthGuard('bearer'))
  getAllBooks(): any[] {
    const books = this.appService.getAllBooks();
    return books;
  }

  @Get(':id')
  @UseGuards(AuthGuard('bearer'))
  getBook(@Param() id): any {
    return this.appService.getBook(id);
  }

  @Post()
  @UseGuards(AuthGuard('bearer'))
  updateBook(@Body() book: any): any {
    return this.appService.updateBook(book);
  }
}

Here you are using the AuthGuard to ensure your user is authenticated. You have also defined methods for getting all books, getting a book by id, and updating a book. In your @Controller() decorator you have named the controller books, meaning that your routes will be GET /books, GET /books/:id, and POST /books.

Testing Your NestJS Application

You are now ready to test your application. As I said above, I will be using Postman for the instructions below. But you can use Advanced Rest Client or another application to test your Nest.js application. You can even build your own React front end for it.

In your terminal, run the command npm run start to start your application. While that is booting up, open Postman. Start a new request and head to the Authorization tab. Select OAuth 2.0 in the Type dropdown, then in the Current Token section on the right, set your header prefix to bearer. Under the Configure New Token section, select Authorization Code (With PKCE) as the grant type. Next, check the Authorize using browser checkbox. Populate your Auth URL, Access Token URL, and Client ID from your Okta developer’s console as follows:

Auth URL: {yourOktaDomain}/oauth2/default/v1/authorize Access Token URL: {yourOktaDomain}/oauth2/default/v1/token Client ID: {yourClientId}

In the Scope field type in openid email profile and in the State field, put any alphanumeric string you wish. Click Get New Access Token and complete the Okta login. Once you have done that, click Use Token and the Access Token should be populated automatically in the Current Token section.

In the URL bar of your request, type in localhost:3000/books/ and hit Send.

You should see the full list of books returned to you. You can further experiment with this by trying the URI localhost:3000/books/1 or attempting to post to the books endpoint. You can also remove the access token from the Current Token section and see the 401 Unauthorized message.
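If you would rather script this smoke test than click through Postman, a minimal sketch like the following does the same thing (assumes Node 18+ for the built-in fetch; the token value is a placeholder you would paste in from Postman):

// test-client.ts - quick manual smoke test for the books API.
const accessToken = '{yourAccessToken}'; // paste a valid token here

async function main() {
  const base = 'http://localhost:3000/books';

  // Authorized request: should return the full list of books.
  const ok = await fetch(base, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  console.log(ok.status, await ok.json());

  // Unauthorized request: no token, so the AuthGuard rejects it.
  const denied = await fetch(base);
  console.log(denied.status); // 401
}

main().catch(console.error);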

NestJS is an exciting framework that makes writing clean and scalable Node.js applications quick and easy. Combined with the simplicity and security of Okta, you will be able to build secure applications in no time.

Learn More About NestJS and NodeJS

If you liked this post, check out some other posts on NestJS and NodeJS.

Build a Secure NestJS API with Postgres Build a NodeJS App with TypeScript Node.js Login with Express and OIDC

Make sure you follow us on Twitter and subscribe to our YouTube channel. If you have any questions, or you want to share what tutorial you’d like to see next, please comment below.

Sunday, 21. February 2021

KuppingerCole

KuppingerCole Analyst Chat: Applying The Zero Trust Principle To The Software Supply Chain


Martin Kuppinger is one of the founders and the principal analyst of KuppingerCole and he is steering the overall development of the topics covered in KC's research, events and advisory. He joins Matthias to talk about the importance of extending Zero Trust to cover software security, for software in any form (embedded, COTS, as-a-service) and regardless of whether it’s home-grown or externally procured.




Expert Chat: Interview with Thomas Malta




Identosphere Identity Highlights

Identosphere #20 • SSI in the Public Sector • CCG discussions • Covid ID

The latest news in Self-Sovereign Identity, including upcoming events, blog posts, announcements, and highlights from the Credentials Community Group mailing list.
Thanks to our Patrons

We are grateful for your support!
If you haven’t had the chance, we appreciate your consideration!

Coming Up

Trust and Identity: Digital Identity NZ 2020 research findings

Mar 3, 17, 31 11:00 AM (New Zealand)

One of the key discussion points related to education - both for organisations who have identity management needs, and for individuals, whānau and communities. We agreed that the theme 'Identity is a Taonga' is fundamental to this work. These DINZ member sessions are for us to further discuss the education gap, and to start work on shaping educational materials to fill that gap.

Rebuilding Respectful Relationships in the Digital Realm

by Elizabeth Renieris, presented by the Me2B Alliance. 

the relationships we have with digital products and services are increasingly complex and multi-dimensional while our legal protections lag behind and put us at risk. Find out what policymakers should do in order to address these vulnerabilities and help us rebuild respectful digital relationships.

Internet Identity Workshop XXXII (#32) April 20-22

Thoughtful Biometrics Workshop March 8,10,12

Disposable SSI-RFI Webinar

Watch the video at the above link and check out the slide deck.

• Background
• History of the Disposable SSID RFI
• The Evolution of Identity
• What are Disposable SSIDs?
• Potential for a Disposable SSID Standard
• The DSSID RFI
• RFI Response Request and Timings

High Level: Good to share with people asking about SSI

The End of Logins and Passwords, Just for Starters The Reboot

Who Controls Your Digital Identity? SAP

SSI will have to be integrated with large existing business processes – and therefore enterprise systems such as ERPs, HCMs, or SCMs to name a few. If this integration results in SSI being as easy to use as clicking a button or selecting a menu item, it will lead to rapid uptake and acceptance.

This is precisely what we set out to test and understand with our proof-of-concept, developed in close collaboration between the SAP Innovation Center Network, Evernym and ATB Ventures.

The 5P’s of a Self-Sovereign Identity

A self-sovereign identity can be defined by the 5P’s as it is personal (it is about you), portable (meaning you can take your identity and data from one platform to another), private (you control your identity and data), persistent (it does not change without your consent) and protected (they cannot steal your identity).

Evernym: Privacy-Preserving Verifiable Credentials in the Time of COVID-19 Hyperledger

This session will focus on the analysis and discussion of two use cases where legacy identity solutions were unable to meet the needs, but ledger based solutions have been successful: covid credentials for travel, and employment credentials for staff movements.

News

IDWorks turned to Tarmac to help accelerate their mobile app and backend development

Having decided to build their "Envoy" solution on the R3 Corda platform, finding affordable development resources with the right technical skills was proving difficult. Corda certified developers are extremely thin on the ground, and if you find a qualified engineer, they can be difficult and costly to recruit.

Kaliya’s Articles on the DIF Blog

Finding the Bell Curve of Meaning - A process for supporting the emergence of shared language in broad collaborative communities

Where to begin with OIDC and SIOP - and how today’s most powerful authentication mechanisms can be decentralized

Understanding DIDComm - A cross-community effort to standardize on common, DID-anchored capabilities

Those of you who know me – know I care a lot about the difference between Open Source and Open Standards. So Juan Caballero and I drilled down on this topic.

Drilling down: Open Standards - What standards are and what it means to make them openly

Drilling down: Open Source - A crash-course in the complex world of variously-open software licensing

DIF’s updated code of conduct - Setting a tone for inclusive collaboration.

Meeco Terms & Conditions Update - Feedback Welcome.

Over the next fourteen days, we would love your feedback or questions on any of the changes.

IdRamp partners with the Lifelong Learner Project to win the ACE Blockchain Innovation Challenge

Lifelong Learner Project is proud to announce its selection as a recipient of the first phase of the Blockchain Innovation Challenge, a competition funded by the U.S. Department of Education to identify ways in which blockchain technology can provide social mobility and equitable access to economic opportunity.  

Companies involved include RANDA Solutions, ETS, Digital Promise, University Instructors, the Utah State BOE, BlockFrame Inc., IdRamp, Evernym, Velocity Career Network, Fluree, Crocus, IMS Global Learning Consortium, and Credential Engine.

Okta CEO: Here’s where cloud identity management is headed CSO

CSO: Do you have an opinion on self-sovereign identity?
McKinnon: I do. I think that it’s the future. We’ve got to get it done. The problem is: How does it get bootstrapped? How does it get useful in enough places so that enough people use it to make it useful? Where is it going to come from?

Public Sector

Legal compliance and the involvement of governments SSI Ambassador

It’s currently possible to be eIDAS compliant with SSI, leveraging one out of five scenarios described in the SSI eIDAS legal report by Dr. Ignacio Alamillo Domingo. Especially interesting is the SSI eIDAS bridge, which adds legal value to verified credentials with the use of electronic certificates and electronic seals. However, it’s also possible to derive national eIDs notified in eIDAS, which are eIDAS linked by issuing a verifiable credential with a qualified certificate according to the technical specification.

The Past, Present and Future of Identity

Susan Morrow considers the ‘digital identity journey’ to date and the important opportunity for the government to make use of the lessons learned when creating tomorrow’s digital identity ecosystem.

Catalan government announces self-sovereign identity project

The government of Catalonia announced its plans for self-sovereign identities (SSIs) for citizens based on blockchain technology. The project, named IdentiCAT, was revealed by the President of Catalonia Quim Torra and will allow citizens to be the “owner, manager and exclusive custod[ian] of his identity and data”.

The Future of Identity: Self-Sovereignity, Digital Wallets, and Blockchain InterAmerican Development Bank

enables sovereignty for individuals over their digital assets and credentials - such as digital passports, digital diplomas, digital property titles, and tokenized currencies such as the dollar, euro, pound, or peso - using digital wallets that can take the form of a mobile app. Secondly, when the subject of these digital assets and credentials presents them to a third party to prove ownership, the third party does not need to reach out to the issuer to verify them, as they can go against a public, decentralized, and immutable registry, such as a blockchain network, where the cryptographic proofs of the asset or credential were registered and are maintained by the issuer in a standardized and trustable way.
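A minimal sketch of that issuer-independent verification flow (illustrative only; the Map below is a stand-in for a real decentralized registry, and the bare signature check stands in for a full proof suite):

import { createVerify } from 'crypto';

// A signed credential as presented by the holder (simplified).
interface SignedCredential {
  issuerDid: string;
  payload: string;   // e.g. JSON of the diploma or property title
  signature: string; // base64 signature by the issuer
}

// Stand-in for a public, decentralized registry (e.g. a blockchain)
// mapping issuer DIDs to their registered public keys.
const registry = new Map<string, string>(); // issuerDid -> PEM public key

// The verifier never contacts the issuer: it resolves the issuer's
// public key from the registry and checks the proof locally.
function verifyCredential(cred: SignedCredential): boolean {
  const issuerKey = registry.get(cred.issuerDid);
  if (!issuerKey) return false; // issuer not anchored in the registry
  const check = createVerify('sha256');
  check.update(cred.payload);
  return check.verify(issuerKey, Buffer.from(cred.signature, 'base64'));
}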

IDunion: An open ecosystem for trusted identities

IDunion (formerly SSI for Germany) has completed the competition phase of the innovation competition “Schaufenster Sichere Digitale Identitäten” and is applying to the Federal Ministry of Economics and Energy (BMWi) for the next phase of the innovation competition.

The use of decentralised, self-sovereign identities for natural persons, companies and things is to be tested in over 35 use cases from a wide range of sectors.

The project involves 26 well-known public and private partners.

EU Data Governance Act (Meeco)

We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights. 

In this context we offer the following suggestions: 

Explicitly include individuals as active participants in the definitions [...]

Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance. 

Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act. 

Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services. 

Foster eco-systems and demonstrate the value through practical use-cases. 

Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems

Meeco Review of the European Data Strategy - Whitepaper

COVID and ID

Covid has accelerated Canadians’ demand for digital ID DIACC

three-quarters of the population feels it’s important to have a secure, trusted and privacy-enhancing digital ID to safely and securely make transactions online. The majority of Canadians believe it is important for federal and provincial governments to move quickly on enabling digital ID in a safe and secure manner, according to the survey.

Digi.me partners with Healthmark to enable Covid testing and verified result reporting

Consentry healthpass capability is an end-to-end solution which enables users to take a self-administered PCR saliva test, send it in for processing, and then receive an in-app result. Crucially, Consentry also generates a certified and dated travel certificate, together with qualifying details of the test taken, which can be printed, shared securely or displayed as needed.

Center for Global Development: A COVID Vaccine Certificate Building on Lessons from Digital ID for the Digital Yellow Card

Covid Vaccination Certificate will be a formidable challenge, not only to international cooperation, but because it will need to be implemented in the course of mass vaccination campaigns across countries with very different health management systems and ID systems and with a constantly evolving situation.

The fine line between global COVID-19 protocols and privacy Tech Republic

A panel of experts considers the best methods for safe domestic and international air travel including proof of testing, vaccination passports, and digital health passes.

‘Vaccination Passports’: State of Play Infinite Ideas Machine

‘vaccination passports’ are unwarranted, in practice near-pointless clinically, and potentially risky in a number of ways.

Research: Vaccine passports and COVID status apps Ada Lovelace Inst.

Not too late to contribute to this Ada Lovelace Institute project; the due date is Feb 28th.

An evidence review and expert deliberation of the practical and ethical issues around digital vaccine passports and COVID status apps

Podcasts

Self-Sovereign Identity and IoT

Michael Shea is the Managing Director of the Dingle Group and the Chair of the Sovrin Foundation’s SSI in IoT Working Group. In this podcast we discussed the white paper he authored on Self-Sovereign Identity and IoT. To explain the opportunities SSI can provide to IoT, Michael introduces us to three profiles: Jamie (machine to person), Bob (machine to machine) and Bessie the cow (digital twin).

PSA Today: Kaliya & Seth talk LEIs

with Simon Wood, CEO of Ubisecure (#1 issuer of Legal Entity Identifiers)

the evolution of LEIs since the financial crisis of 2008, the difference between high assurance and low assurance, and the relationship between rights and ownership as it relates to identity management of entities.

Catching up with the Credentials Community Group

Lotta great topics on the CCG Mailing List; click through to follow the discussion.

credential definitions, credential manifests, BBS+, etc Daniel Hardman

When Tobias first described Mattr's approach to BBS+ signatures, one of my takeaways was that this changed the Indy mechanism of cred defs in two wonderful ways:

It eliminated the need for lots of keys (only one key, Y, needs to be declared as a credential signing key, instead of a set of keys, Y[0]..Y[n])

It made it possible to store a cred def somewhere other than a ledger

I was very happy about this.

However, I have since heard several smart people summarize the breakthrough as: "We don't need credential definitions at all. You just use the assertionMethod key in your DID doc to sign credentials, and that's all you need." I believe this is oversimplifying in a way that loses something important, so I wanted to open a conversation
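For readers catching up on the thread, the difference Daniel is pointing at can be sketched as data shapes (a simplification with invented field names, not the actual Indy or BBS+ formats):

// Illustrative data shapes only; not real Indy or BBS+ structures.

// CL-style credential definition on Indy: one key component per attribute,
// published to the ledger as its own transaction.
interface ClCredDef {
  issuerDid: string;
  attributeKeys: { [attributeName: string]: string }; // Y[0]..Y[n]
  ledgerTxId: string; // the cred def lives on the ledger
}

// BBS+-style setup: a single signing key Y, which can simply be listed as
// an assertionMethod key in the issuer's DID document, stored wherever the
// DID document lives (no ledger-resident definition required).
interface BbsIssuerKey {
  issuerDid: string;
  assertionMethodKey: string; // the one key Y
}

// Daniel's caveat, in data terms: even with BBS+, something still has to
// declare which attributes a credential type carries, so "no cred defs at
// all" drops that schema/metadata role.
interface CredentialTypeMetadata {
  schemaId: string;
  attributeNames: string[];
}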

Credentials and HTTP-Sig authentication for Solid Henry Story

Here is an extended version of the HTTP-Signature document I put together today, bringing in ideas that have emerged thinking about this over the past 3 months

Announce: CCG 101 Work Item Interest Heather Vescent

the CCG 101 work item is focused on identifying and creating material to make it easy for new participants & interested parties to learn about the CCG, our activities, work items, process & get involved.

Dillo plugin for DID URLs Charles E. Lehner

I would like to announce dillo-did, a plugin for the Dillo web browser implementing support for DIDs. This plugin enables navigating to DID URLs in Dillo and viewing the resolved/dereferenced DID documents and resources like web pages. The implementation of the DID functionality used is from ssi/DIDKit.

ERC-721 Non-Fungible Token Standard on Ethereum vs. VCs on Hyperledger Indy Michael Herman

When are Hyperledger Indy/Sovrin VCs better than Ethereum smart contracts for NFEs/NFTs (non-fungible entities/tokens)?

It seems obvious but I don't have a detailed/worked out answer.  One project I'm associated with wants to use the ERC-721 Non-Fungible Token Standard on Ethereum but I believe VCs are a better route to take. Part of the desire to stay on Ethereum is there is quite a vibrant NFT community on Ethereum and lots of different EC-721 tokens.

Vaccination Certificate Vocabulary Tobias Looker

I'd like to propose a new work item that formally defines a vocabulary for issuing Vaccination Certificates in the form of Verifiable Credentials.

Link to CCG PR
Link to current draft
Link to repository

Web 3

Elemental Chat - 1st Holochain P2P App for Hosts

Elemental Chat running on HoloPorts has no central database. Each person who is running the app signs their messages to their own chain and then automatically posts them to the shared database that is hosted by the other users of the application.
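Conceptually, each agent signs to its own chain first and then publishes, roughly like this TypeScript sketch (invented types and names, not the Holochain SDK):

import { createHash } from 'crypto';

// Loose conceptual sketch of the agent-centric pattern.
interface ChainEntry {
  author: string;    // the agent's public key
  message: string;
  prevHash: string;  // link into the agent's own source chain
  signature: string; // signed with the agent's private key
}

function entryHash(e: ChainEntry): string {
  return createHash('sha256').update(JSON.stringify(e)).digest('hex');
}

class Agent {
  private chain: ChainEntry[] = [];

  constructor(
    private pubKey: string,
    private sign: (data: string) => string, // agent-held key (assumed)
    private shared: ChainEntry[],           // stand-in for the hosted shared store
  ) {}

  post(message: string) {
    const prevHash = this.chain.length
      ? entryHash(this.chain[this.chain.length - 1])
      : 'genesis';
    const unsigned = { author: this.pubKey, message, prevHash };
    const entry: ChainEntry = {
      ...unsigned,
      signature: this.sign(JSON.stringify(unsigned)),
    };
    this.chain.push(entry);  // 1. commit to the agent's own chain first
    this.shared.push(entry); // 2. then publish to the shared database
  }
}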

Crypto

Bridging the Gap Between DeFi and Decentralized Identity Bloom

Decentralized Identity & DeFi are Disconnected ← true
Decentralized Identity is Chain-Adjacent  ← true 
Decentralized Identity & DeFi are Complementary ← true 
How Decentralized Identity is Being Used

Health Data Passes

Employment Information

Credit, Income, KYC

Ontology Partnership with Binance Smart Chain

Ontology and Binance have a long history of cooperation and partnership that has generated benefits for both sides, none possibly more important than the integration of Ontology’s Decentralized Identity Solution into the Binance Smart Chain. The symbiotic relationship sees Ontology, and ONT ID, as the sole partner for BSC in terms of providing a truly decentralized identity option and KYC user verification.

Hands On Introduction to Trinsic’s APIs

Provider • Credentials • Wallet

Building and Securing a Go and Gin Web Application Okta

Today, we are going to build a simple web application that implements a to-do list. The backend will be written in Go. It will use the Go Gin Web Framework which implements a high-performance HTTP server. The front end will use the Vue.js JavaScript framework to implement a single page application (SPA). We will secure it using Okta OAuth 2.0 authentication.

Become a Node Operator Indicio 

we’ve seen a rapid rise in demand for robust, stable, and professionally maintained networks to support decentralized identity solutions. It’s not a surprise: decentralized identity’s moment has arrived. That’s why we’ve been hard at work creating Hyperledger Indy networks upon which developers all over the world are building, testing, and launching their solutions.

Research

Decentralized SSI Governance, the missing link in automating business decisions TNO

This paper introduces SSI Assurance Communities (SSI-ACs) and identifies three specific governance topics: credential-types, accreditation and decision tree support.

Tools and services are suggested that help with these topics. Furthermore, a distinction is made between what the business primarily cares about (business and business applications), and the technology and other things that are just expected to work (which we call "SSI-infrastructure").

Development of a Mobile, Self-Sovereign Identity Approach for Facility Birth Registration in Kenya

The process of birth registration and the barriers experienced by stakeholders are highly contextual. There is currently a gap in the literature with regard to modeling birth registration using SSI technology. This paper describes the development of a smartphone-based prototype system that allows interaction between families and health workers to carry out the initial steps of birth registration and linkage of mothers-baby pairs in an urban Kenyan setting using verifiable credentials, decentralized identifiers, and the emerging standards for their implementation in identity systems.

Towards a Modelling Framework for Self-Sovereign Identity Systems

Modelling self-sovereign identity systems seeks to provide stakeholders and software architects with tools to enable them to communicate effectively, and lead to effective and well-regarded system designs and implementations. This paper draws upon research from Actor-based Modelling to guide a way forward in modelling self-sovereign systems, and reports early success in utilising the iStar 2.0 framework to provide a representation of a birth registration case study.

Thanks for reading. See you next week!

Saturday, 20. February 2021

Ontology

Ontology — The ONLY Decentralized Identity Partner for Binance Smart Chain


Ontology and Binance have a long history of cooperation and partnership that has generated benefits for both sides, none possibly more important than the integration of Ontology’s Decentralized Identity Solution into the Binance Smart Chain. The symbiotic relationship sees Ontology, and ONT ID, as the sole partner for BSC in terms of providing a truly decentralized identity option and KYC user verification.

The recent explosion of projects landing on the Binance Smart Chain (BSC) is expected to be accompanied by a meteoric rise in the number of potential investors interested in Security Token Offerings, otherwise known as STOs. A key variable addressed through Ontology’s partnership with BSC is the problem of identifying potential investors, and ensuring their identities are not only protected but also validated through Ontology’s Trust Anchor Gateway and overall Trust Ecosystem.

About ONT ID

ONT ID is a framework for decentralized identity, built on the Ontology blockchain and integrated with the Binance Smart Chain, so any user, developer, or project launching STOs will have access to fully integrate the benefits of ONT ID, and with it decentralized identity, into their existing developments. Protected by cryptographic techniques, ONT ID provides any entity with a digital identity that is controlled by the entity through decentralization. The entity in question can be a variety of things, including but not limited to individuals, organizations, or objects. This makes tasks such as authentication, authorization, or performing audits much simpler while protecting the data of the entity being authenticated, authorized, or audited.

About ONT TAG

Ontology’s Trust Anchor Gateway (TAG) is a decentralized platform based on ONT ID that provides access to security features such as KYC (know your customer) and global authentication services such as Shufti Pro, IdentityMind, and CFCA, to name a few.

How does this help STOs?

For any security token offering on BSC, the most important aspect is being able to accurately and safely verify the assets held by potential investors, while adding a layer of anonymity and protection to their identity through decentralization. Some pain points in compliance requirements include verifying qualified investors, cross-border sales of security offerings, and the duration of the qualification validity period for both the United States and the European Union. In a normal STO process, the issuance, self-compliance, fundraising, and technical work can be both time-consuming and difficult, neither of which is conducive to a successful STO. But by integrating ONT ID and adopting ONT TAG, any STO on the Binance Smart Chain can identify potential investors much more quickly and nudge them through the checklists more smoothly. Not to mention that this is a more secure way for investors to participate while having their personal ID protected through the decentralization powered by Ontology — the only partner who offers this service on BSC.

Ontology — The ONLY Decentralized Identity Partner for Binance Smart Chain was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 19. February 2021

1Kosmos BlockID

The Gap Identity Service Providers Must Fill

Identity service providers create, maintain, and manage identity information for users and also provide authentication services to relying applications within a federation or distributed network. They bring a level of convenience to anyone who needs to access applications throughout the day to conduct business effectively and without having to enter credentials each time. So, it is not surprising that Okta, the market leader, often uses the following tagline: “One-click access to all your enterprise apps—in the cloud and behind the firewall.” But there is another slogan used by Okta that tends to leave me perplexed: “Okta is one trusted platform to secure every identity, from customers to your workforce.”


Space Elephant \ Unikname

Replay – 3 secrets to secure and better manage your fleet of websites


Do you manage several websites built on different technologies? Then this webinar is for you! In this new episode, you will discover our security secrets and, above all, good tips to make your day-to-day management easier. Let’s be honest: maintaining a fleet of websites is not always easy…

Who better to talk about it than our privileged partner, Cyllène! Olivier Poelaert, President of the Group, will honor us with his presence and share his experience, since Cyllène today administers a fleet of more than 200 sites across several environments.

On the agenda:

What does the market say? – An introduction to the principles of multi-site management

Lessons learned – A testimonial from Cyllène

Key takeaways – 3 secrets for administering several sites at once while maintaining a high level of security

Access the replay: “3 secrets to secure and better manage your fleet of websites”


The post Replay – 3 secrets to secure and better manage your fleet of websites appeared first on Unikname.


auth0

Auth0 Is Hiring in Spain and Canada!

Auth0's CPO shares how his team is expanding their global hiring and technical hubs to new locations.

KuppingerCole

The Road To CIAM Success – Why an Identity Fabric Powers Digital Transformation

The Ugly Face of Yesterday’s CIAM

We all have multiple different personas. But before we put on our hats as identity “experts” - either as architects, implementers, or vendors - let us start by simply being consumers. Consumers of online services for banking, e-commerce, education, entertainment, and more. And by thinking as consumers, we can all tell endless stories of poor user experiences with those services. Those poor experiences essentially put an invisible barrier between us and the applications and goods we want to consume.

No one forgets how someone or something made them feel, and a poor registration, login, or password reset journey at best reduces the likelihood of consumer recommendations; at worst, you simply lose users, which ultimately results in lost revenue.

The lack of a “digital fabric” is typically the root cause of those badly performing ecosystems.

Broad Requirements Drive The Need for an Identity Fabric

CIAM, however, is not just about the end-user experience. Obviously, the UI “is THE application” from an end-user point of view, but under the hood there's an entire catalog of other requirements needed to make CIAM a success.

Core elements of a good Consumer Identity Fabric are: 

Functionality for upholding compliance (think GDPR for EMEA or CCPA for the US, amongst others)

Multi-device integration

Adaptive and contextual security

Bot protection

Fraud management checks at various stages of the end-user journey

Data integration and data access are also important things to consider and are likely to need the creation not only of new software components - either through development or procurement - but also new operational structures and potentially new teams to manage that new landscape.

Put simply, CIAM isn’t just IAM. CIAM is an outside-in view of identity as opposed to the inside-out model for employee IAM. Features are different, but so too are the non-functional requirements. Scale - for storage and throughput - will be considerably higher, as will geographical availability, support for different locales and languages, and the ability to elastically respond to changes in service demand. Many of these needs would likely not have been a priority if a CIAM project were not needed.

These new requirements however are driving the need for a more integrated, extensible and accessible set of identity services - likely based on an API/SDK first approach. This is where the “fabric” model of identity comes into play.

Start With The End in Mind

So how can we make CIAM a success for the organization? The simplest approach is to “start with the end in mind” by asking what the CIAM platform is going to achieve for the business. In the identity world, CIAM is the solution most closely aligned with the organization's business objectives, so it needs to be a priority for any organization going through a digital transformation with its customers and partners in mind. It can open up new revenue streams, alter how business is conducted, and provide a foundation for continued growth and innovation.

Stakeholders in the implementation process are numerous – typically the CISO, CMO, a digital leader, and various identity services leads are involved. So having a platform of distributed components that can assist them in their divergent goals is critical to making CIAM effective. By supporting services for a range of stakeholders we are moving towards developing a broader cross-business unit platform that can break down silos. This is exactly what an identity fabric aims to achieve.

CIAM Lifecycle

CIAM interactions follow a cyclical pattern, from initial registration through secure login and contextual access, allowing consent and data management to take place seamlessly across a range of devices at a time of the user's choosing. Throughout that lifecycle, we need to look beyond the assurance and knowledge levels of those user interactions and ask how a target service or application can be provided in a way that encourages the end user to refer, recommend, and renew.

 


Tokeny Solutions

Tokeny’s Talent|Eva’s Story

Eva joined the team six months ago as an intern to complete her graduation internship.

Who are you?

My name is Eva Gasyte and I am from Lithuania. Even as a child I was interested in banks and the financial world, so as a first step towards my career I studied finance in Vilnius. After that I decided to deepen my knowledge and get a Master’s degree, but I also wanted to gain some international experience, so in 2019 I moved to Monaco and started the MSc in Finance at the University of Monaco. This month I am finishing my internship at Tokeny and am about to start my professional career.

How did you land at Tokeny Solutions?

As part of my studies at the University of Monaco, I am required to do a 6-month internship. Luckily, I was noticed by Tokeny’s HR manager Radka on LinkedIn after she read an article about me published by the university, in which I mentioned I was looking for an internship. She contacted me, and after a few interviews I was offered the internship. This is how Tokeny came into my life.

How would you describe working at Tokeny Solutions?

The experience was great in a lot of ways. I like the team, my duties, and the company’s culture. Tokeny is a remote-friendly company, with people working all over the world. I’ve been communicating with most of the team remotely, but I feel very happy that some of those people have become my friends, even though I’ve never met them in real life! During the internship I had a very dynamic schedule – every day was unique, with different tasks – and therefore I gained experience in various fields. Our offices here in Monaco are at MonacoTech, which provides a very dynamic and innovative environment. The start-up atmosphere there is amazing. It’s the combination of such a challenging place, the exciting projects I took care of, and great colleagues that helped me grow and gain agility.

What are you most passionate about in life?

I love traveling and discovering new cultures – it amazes me how many breathtaking places are on Earth, one life is not enough to see everything. Even in Monaco – it is one of the tiniest countries in the world and after 2 years I still find places here I’ve never seen before.

What is your ultimate dream?

My dream is very simple – all I wish is to live a happy life. To succeed in my career, to have a nice family, and that everyone around me is happy as well.

What would you change in the world if you could?

I love animals, and if I could, I would provide a home and food for all the abandoned animals in this world.

She prefers:

Coffee over tea

Movie over book

Work from home over the office

Dogs over cats

Call over text

Burger over salad

Ocean over mountains

Wine over beer

City over countryside

Slack over emails

Casual over formal

Crypto over fiat

Night over morning

Join the Tokeny Solutions family: we are looking for talent to join us, and you can find the open positions on the Available Positions page.

The post Tokeny’s Talent|Eva’s Story appeared first on Tokeny Solutions.


SelfKey

KEY & KEYFI Airdrop for KEY Token HODLers on Binance


Two airdrops are better than one! SelfKey and KeyFi, together with Binance, are bringing you not one but TWO airdrops for KEY token hodlers on Binance.com.

The post KEY & KEYFI Airdrop for KEY Token HODLers on Binance appeared first on SelfKey.

Thursday, 18. February 2021

Caribou Digital

Women’s platform livelihoods: Balancing the opportunity with the myth of flexibility


By Miranda Grant and Grace Natabaalo

Platform livelihoods hold great promise for making work more inclusive for young women. At the same time, they also present new expressions of gender differences in opportunity, uptake, earnings, and the ways work intersects with other areas of life.

These tensions are apparent in a participatory video storytelling project carried out by Caribou Digital and Nairobi-based Story x Design, with the support of Mastercard Foundation, in October 2020. The project equipped seven young women based in Ghana, Kenya, Nigeria, and Uganda with mobile phones, video training, and mentoring, to enable them to tell us their own experiences of platform work over a period of two months.

The women — two web developers, a farmer, a carpenter, a motorcycle rider, and two social commerce entrepreneurs — told us how marketplace platforms have enabled them to challenge structural barriers such as access to capital, sexism, freedom of movement, access to ICTs, and personal safety. Some needed very little capital to start. Others could conduct the work from the comfort and safety of their home, which enabled digital platforms to provide an entry point into the world of work.

Some women also leveraged the fact that they already had a computer, internet access, and the requisite skills to perform their platform work; others pushed themselves to learn skills that enabled them to join a male-dominated profession. While the COVID-19 pandemic disrupted their lives, the women all fought to stay financially afloat and keep their passions alive online.

Despite these opportunities, the women’s stories also bring to light gender-specific challenges associated with platform livelihoods, like sexism and having to balance unpaid care work at home with platform work.

Breaking down gender stereotypes in the workplace Dathive

Dathive, 31, known as “Señorita” to her regular customers, is one of only two female motorcycle taxi riders on Uganda’s SafeBoda ride-hailing platform. Before becoming a driver, she tried numerous other business ventures, like running a mobile money business and a fashion boutique. However, none offered the potential profit of boda boda work. Joining this male-dominated sector means that Dathive has not had it easy; she has sometimes had to prove that she could transport passengers well regardless of her gender. However, being one of only two female SafeBoda riders does have its perks.

“When I take a customer, I see that they are really very happy to ride my motorcycle. When we reach [our destination], and we negotiate the money, they give me a tip. We take a selfie, they call their family, their neighbors [and say]…come and see my driver, come and see who’s brought me.” — Dathive, SafeBoda driver, Kampala

SafeBoda has helped boost Dathive’s profile by popularizing her in the media, she says.

“I have been on TV, radio, and in the newspapers. SafeBoda did this so that people could get to know me and feel confident when I show up as their rider on the app,” Dathive says. Watch Dathive’s story

Sabina, 29, is also one of a few women in a male-dominated profession. She is a carpenter with Lynk, an on-demand labor platform based in Nairobi, Kenya. Sabina’s move into carpentry was thanks to a partnership between Lynk and BuildHer, a social enterprise that equips women in Kenya with vocational and life skills.

Sabina’s experience points to the types of training that might be needed to set women up for success with platform work, beyond vocational training alone. She told us that the training she received taught her life skills and coping strategies she could apply beyond her profession as a carpenter. This is worth investigating further, to see what types of training could help more women succeed on platforms.

“BuildHer taught me the basics of life skills, the career itself, carpentry and joinery, and how to cope with life.” — Sabina, Lynk carpenter, Nairobi

Sabina

After completing the training and employment program with BuildHer, Sabina began working full time as a carpenter for Lynk. She spoke of how hard it was initially when she had just joined the platform. “When you come to the workshop, [men’s] first impression is like, ‘Really? Like really, can you make anything?’” Sabina says.

She adds that Lynk helped to create an environment where she could thrive as a female carpenter. The platform played a critical role in further developing the skills she needed to be able to prove herself through her competence in the profession. The platform work has become Sabina’s passion and only source of income. Watch Sabina’s story

After graduating from the University of Lagos with a degree in Chemical Engineering, Anastestia (Ann), 24, found it challenging to find a job. As she sent out job application after job application, a friend suggested she think more laterally. Having seen the success of some of her friends on platforms like Fiverr and Upwork, Ann thought, “Okay, I can do this, too.”

When Ann realized that she could make good money as a coder, she began to teach herself the basics of coding. She is now part of a small but steadily growing number of female developers in Africa. Recent statistics show that only one in five developers in Africa is female. She quickly fell in love with the work and loves that she’s making it as a young Nigerian woman in a male-dominated industry. Watch Ann’s story

Social commerce: An easy (and free) choice for female entrepreneurs

For platform sellers like Gloria, Mary, and Dorcas, social media sites like Facebook, WhatsApp, Instagram, and YouTube reduce the barriers to entering the world of platform selling. Women often face barriers like a lack of capital and having to balance work with childcare responsibilities when setting up their own business or engaging in offline work.

Social selling has provided Mary, 24, a young entrepreneurial farmer in Kenya, with the flexibility to earn an income while caring for her young son as a single parent. She already owned a computer and a phone and didn’t need much else to start her own YouTube channel, through which she shares farming best practices and promotes her mushroom and strawberry businesses. She cross-posts these videos on her Facebook page and then moves conversations with interested buyers over to WhatsApp. These channels helped her connect with buyers during the pandemic when the markets were closed. Mary is also looking forward to earning money from her YouTube channel once she reaches the 1,000-subscriber mark, the minimum a channel needs before it can be monetized. Watch Mary’s story

For Dorcas, 41, in Nairobi, platforms have helped her overcome the limitations of lupus, which keeps her mostly housebound. She started by baking and selling cakes on Facebook and WhatsApp and has since managed to expand her home-based business to include shoes. She relies on her social networks and various Facebook groups as a low-cost means to find customers.

Dorcas

“90% of my sales are generated from Facebook. And then the few who do not buy, they give exposure. So, you find someone who will refer my page to someone else. So that’s how my business grows.” — Dorcas, social commerce entrepreneur, Nairobi. Watch Dorcas’s Story

For Gloria, 30, social commerce represented a flexible way to diversify her income, something that became critical when she lost her job just before the outbreak of COVID-19. While doing her full-time job, Gloria co-founded House of Penda, a Kampala-based company, as a side business selling fashion accessories through Facebook. It became her only source of income during the pandemic. Watch Gloria’s story

The tension behind the flexibility of online work

Some studies have discussed the tensions of “flexible” platform work and raised questions regarding the expectations of women that it perpetuates, specifically in handling both full-time obligations in the home along with platform work. An ODI study found that women gig workers in Kenya and South Africa identified childcare as the biggest challenge to their economic opportunities, along with work–life balance and the decreasing quality of their work.

Mary and her son

The women engaged in this project confirmed this tension. On the one hand, many of them talked about how platform work has offered a level of flexibility that enables them to choose when, where, and how to work. For Dathive, a single mother of four, for example, the flexibility that comes with her ride-hailing job enables her to head out early in the morning, drop students to school and workers to work, then return home in time to give her children breakfast and get them to school, too. She is glad that the money she makes from SafeBoda is enough to provide her children’s needs.

Four of the seven women in this project are single mothers. They talked about the challenges of splitting time between work and children on top of the responsibility of being the sole breadwinners for the family.

Mary, for example, explained how she needed to be online regularly to connect with her customers. She said it consumes a lot of time away from her family.

On the whole, despite the challenges, the platform experience for these women was positive. Even during the pandemic, they earned some income by doing work they enjoy through platforms.

However, the many unique challenges women face as they engage in platform-based work should not be overlooked. Platforms and policymakers should recognize the gendered aspects of platform labor and devise ways to improve women’s platform livelihood opportunities.

This year, Caribou Digital, with the support of the Mastercard Foundation, will embark on a multi-country study to further explore the intertwining of gender in digital work, aiming to make specific recommendations for programmatic and policy engagement. We look forward to sharing more on this important topic.

Watch all the videos from the project here

Women’s platform livelihoods: Balancing the opportunity with the myth of flexibility was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Platform livelihoods of young people in Africa: “The hustle is real”


By Miranda Grant and Grace Natabaalo

In October 2020, Caribou Digital and Nairobi-based Story x Design, with the support of the Mastercard Foundation, embarked on a participatory video storytelling project that put 11 young Africans who earn their livelihood from digital platforms at the center of their own story. For two months, the seven women and four men in Ghana, Kenya, Nigeria, and Uganda shared their experiences as platform workers and sellers.

A key theme that stood out in these stories was the prevalence of “side hustles” — money-earning businesses, gigs, or jobs undertaken in addition to a person’s primary business or employment. With personal needs and families to take care of, along with the economic impact of the ongoing COVID-19 pandemic, many platform workers and sellers have taken on side hustles in the form of additional platform work or the pursuit of off-platform ventures. Below, we share insights on these side hustles as shared through their video diaries.

1. Hustling across platforms

Many platform workers and sellers operate across various digital platforms, also known as multi-homing. As singular platform work often fails to provide enough income, workers and sellers are driven to sign up to multiple platforms. Studies by Fairwork, for example, have found that many gig workers still earn below minimum wages.

In early 2020, David, 28, took up a motorcycle delivery rider gig with Solar Taxi, a ride-hailing and courier company in Accra that uses solar-powered cars, tricycles, and motorbikes. David secured the bike on a hire purchase agreement and has to make monthly payments. To meet his payment target of $7 a week, David tells us he had to sign up to other platforms, including Wote, Bolt Food, and Enrout, on top of riding for Solar Taxi. But even then, David is worried that the money he earns from across these platforms is not enough for him to fulfill his big dreams, one of which is to start a family.

“I stay online as and when a trip comes. I accept whether on Bolt, on Solar Taxi, or on Enrout. I accept them and then I run,” he says.

In Nairobi, Dorcas, 41, sells cakes and children’s shoes across multiple social media platforms including Facebook, WhatsApp, and Instagram. In Uganda, Gloria, 30, does the same, selling fashion accessories through Facebook and WhatsApp. Although they are already using various platforms to help increase sales, they are both planning to set up their own websites as yet another way to reach more customers.

“I want to set up a website especially for the cake business. I want to make it big but still, home-based. Because I have been asked by so many people if I can tutor them on how to do just basic cakes. I can go to where they are and then just maybe tutor or even take advantage of online and then just do either Zoom classes or Google class,” Dorcas says.

Even online freelancer Anastestia, 24, who is a software developer and still new to online work, knows that creating a profile on more than one platform pays off. She toggles between Upwork, Fiverr, Toptal, and Freelance.com. She sees Upwork as her main hustle, as it is where she gets most of her jobs. Watch Anastestia’s story

2. “Double-up your side hustle” and move offline

Some of the workers also balance on-platform work with off-platform side hustles, using the flexibility platform work offers to run and set up other businesses.

On top of being a rider for three apps, David is also an independent delivery guy and takes calls from customers off the platforms. However, despite working across multiple platforms and independently, he is still looking for additional means of earning money.

“There are bigger projects I want to do which this one cannot fund. So, what I will do is to add more avenues to make money so that I can actually see my dreams come true,” David shares.

Peter, 35, drives for Bolt and Taxify in Nigeria. A few months ago, he invested what money, time, and energy he had left into a 2000-bird poultry farm. If COVID-19 has taught him anything, it’s that you need to have a Plan B. He sees the poultry farm as a more stable investment than his platform work.

“You need to double up your hustle. You need to look for strategies and ways apart from what you’ve been thinking. You need to think a step or two steps ahead just to achieve or get the pursuit of your goals,” Peter says.

In Nigeria, Okoli Edwin, 20, who runs a physical shop selling solar products, tried to expand his business online through Jumia and Jiji at the onset of the pandemic. He assumed there would be a surge in demand for solar panels as people would be quarantined and in need of a consistent power supply. What he had not accounted for, however, was the stretched pockets of buyers in a strained economic year. Okoli said he went online to look for ideas on how to make extra cash and came across forex trading. Now, Okoli takes forex trading classes online and watches the markets through his phone for any potential investments and deals.

“It got to a time where everything was just crazy — no more resources, we were out of stock, we couldn’t supply the customers. I fell down, really bad because how am I going to feed my family? How am I gonna do a lot of things? And so, I thought to myself, ‘Man, I had to get a side hustle. I had to look for another thing to do,’” Okoli says.

For Gloria, what used to be her digital side hustle has now become her only source of income. In 2019, she and a friend founded House of Penda, a social commerce store that sells fashion accessories through Facebook and WhatsApp. Back then, she still had a full-time job in brand marketing and considered House of Penda as her side business. When the pandemic hit, Gloria lost her job; she has, since then, focused on growing House of Penda. However, as it struggles to make enough money, she is now looking out for branding gigs and also applying for full-time jobs.

“I told everyone that hey, I’m available for gigs. I’m passionate about brand and communications. Tell someone that you know someone, mention my name, and that you know someone who’s really good at this. That was my strategy, get gigs and hope to God that a job comes along,” Gloria shares.
3. The struggle for those who don’t have a Plan B

Stanley

Workers who are not able to find or take on side hustles are fully reliant on the work that comes through the platform, which left them vulnerable when the COVID-19 pandemic broke out. Stanley, 35, previously a cleaner on Lynk, an on-demand labor platform in Kenya, relied on the platform as his sole source of income. In fact, as part of an incubation program, Lynk even paid him a monthly stipend and bonuses. However, despite its best efforts, Lynk was forced to halt its cleaning services a month into the pandemic, leaving Stanley with no work and no income. Stanley says this has taught him a lesson — “Have a Plan A, Plan B, Plan C, and even a Plan D.”

As Muthoni Mwaura’s Kenyan study found, “The notion of side hustling is particularly important for understanding youth livelihoods in contemporary contexts” because it helps to understand how young people make meaning of themselves and opportunities around them in extreme socioeconomic conditions. Listening to the platform workers and sellers over two months has revealed how having a side hustle — whether working across platforms or juggling on-platform and off-platform work — is key to surviving and thriving across digital platforms.

However, this also points to the shortcomings of platform work, and should make us question if this work is really dignified. Should people have to frantically piece together so many sources of income to earn a living? Does this make platform work meaningful or fulfilling?

Watch all the videos here

Platform livelihoods of young people in Africa: “The hustle is real” was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Meeco

Meeco Terms & Conditions Update – Feedback Welcome

At Meeco, our mission is to develop the tools to enable people to collect, protect and securely exchange personal data. We launched our first service in 2014, backed by Terms & Conditions we were proud to share. Starting with that first version, we’ve continued to invite feedback before implementing updates. We take our governance seriously, which starts with transparent and easy-to-understand terms of service.

This is the fifth update since then. Our last major update was back in 2018, prior to the introduction of the General Data Protection Regulation. Through that update we were able to strengthen data rights and extend the GDPR protections to everyone using Meeco.

This V5 update paves the way for a range of exciting new Meeco services, including applications like mIKs-it, designed to make digital services safer and more secure for kids. We’ve also been busy building tools for developers: tools to support amazing start-ups like My Life Capsule, who are helping people manage, prepare, capture and share data to connect and organise families across generations. We’re also deploying a range of new decentralised identity capabilities to support partners like VELA Solutions, who provide a digital credentialing platform for the secure storing and sharing of verifiable credentials.

Over the next fourteen days, we would love your feedback or questions on any of the changes. Our Terms & Conditions reflect our commitment to giving people and organisations the tools to access, control and create mutual value from personal data. Here’s a high-level summary of the changes:

Introduction of data administration roles for parents and guardians

Description of new and expanded Meeco Services

Information about Meeco’s entities in Belgium and the United Kingdom

Expanded terms to include children using Meeco Services

Prohibited use to protect children

Additions to protect Your data rights

Updates to increase Your data security

Expansion of Your commencement and termination rights

Introduction of terms regarding subscriptions and payments

Additions to the meaning of words and legal terms

If you would like to share your feedback, simply email support@meeco.me and include “Update to Terms and Conditions” in the subject heading.

All going to plan, our new Terms & Conditions will apply by Monday 8th March 2021!

Thank you for taking the time to read this update and we look forward to your comments.

The post Meeco Terms & Conditions Update – Feedback Welcome appeared first on The Meeco Blog.


IdRamp

IdRamp partners with the Lifelong Learner Project to win the ACE Blockchain Innovation Challenge


The goal of Phase 1 is for The Lifelong Learner Project to deliver a blockchain-based digital wallet solution to educators where they can securely receive, store and publish the verifiable credentials necessary to support teacher licensure applications and evidence of professional learning.

The post IdRamp partners with the Lifelong Learner Project to win the ACE Blockchain Innovation Challenge first appeared on idRamp | Decentralized Identity Evolution.

Forgerock Blog

How Your Organization Can Eliminate Entitlement Creep

The Growth of Artificial Intelligence in Identity Governance

Organizations are facing increasing pressure to provide employees and contractors with the right access to the right applications and systems at the right time. But how can they do this with their existing, manually-driven Identity Governance and Administration (IGA) solutions and processes? How can security and IT professionals address the needs of the new remote workforce and its demands for access to new cloud applications and services? Combined with new machine identity types, accelerated DevOps/Agile development methodologies, and unplanned organizational changes, static IGA solutions and processes need to become more flexible, more dynamic, and more automated. 

With the introduction of artificial intelligence (AI) and machine learning (ML) into IGA, organizations have a clear path to hyper-automating their existing identity governance solution and processes. By applying AI and ML, enterprises can further streamline and automate intelligence across all identity governance use cases, including access requests and approvals, access reviews, and role engineering. Here are a few examples of how AI and ML can hyper-automate IGA solutions and processes to help combat your organization’s entitlement creep problems: 

Identify access risks across the entire organization and provide actionable insights to help accelerate the removal of overprivileged access

Identify excessive privileges and orphaned accounts, and provide confidence scoring (example: low, medium, and high) in order to provide the right level of security risk context

Enable micro-certifications, where only a small set of entitlements and roles are approved between annual or biannual certification campaigns

While no new technology is 100% foolproof, the introduction of AI and ML capabilities into identity governance solutions and processes provides organizations with the most promising way to address the silent access challenge known as entitlement creep.

ForgeRock’s Modern Approach: Autonomous Identity  

ForgeRock Autonomous Identity provides real-time, continuous enterprise-wide user access visibility. The solution allows organizations to accelerate secure workforce access, achieve regulatory compliance, mitigate risks, and reduce costs. By leveraging AI and  ML techniques, Autonomous Identity collects and analyzes all identity data to identify security access and risk blind spots. The solution provides organizations with a complete user access landscape view – what good and bad access looks like across the entire enterprise. It provides organizations with wider and deeper insight into the risks associated with user access by providing enterprise-wide contextual insights, high-risk user access awareness, and remediation recommendations, such as the removal of overprivileged access, excessive permissions, and orphaned accounts. 

How it Works

ForgeRock Autonomous Identity links users to entitlements at the lowest attribute level. The solution uses profile data to determine the likelihood that an individual will need an entitlement, based on how entitlements are currently distributed across the organization. By applying AI and ML techniques, Autonomous Identity can quickly analyze all your organization’s identity data and identify overprivileged access, excessive permissions, and orphaned accounts. All are key contributors to your organization’s entitlement creep challenges. 
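
ForgeRock does not publish the model itself, but as a toy illustration of the general idea, an entitlement can be scored by how common it is among users who share profile attributes. All names and the scoring rule below are hypothetical:

```go
package main

import "fmt"

type User struct {
	Dept, Role   string
	Entitlements map[string]bool
}

// confidence returns the share of u's attribute-matching peers who hold the
// entitlement, a crude stand-in for the likelihood the access is needed.
func confidence(u User, peers []User, entitlement string) float64 {
	matching, holding := 0, 0
	for _, p := range peers {
		if p.Dept == u.Dept && p.Role == u.Role {
			matching++
			if p.Entitlements[entitlement] {
				holding++
			}
		}
	}
	if matching == 0 {
		return 0
	}
	return float64(holding) / float64(matching)
}

func main() {
	peers := []User{
		{"Finance", "Analyst", map[string]bool{"ledger-read": true}},
		{"Finance", "Analyst", map[string]bool{"ledger-read": true}},
		{"Finance", "Analyst", map[string]bool{"prod-db-admin": true}},
	}
	alice := User{"Finance", "Analyst", map[string]bool{"prod-db-admin": true}}
	// A low score flags prod-db-admin as likely overprivileged access.
	fmt.Printf("prod-db-admin confidence: %.2f\n", confidence(alice, peers, "prod-db-admin"))
	fmt.Printf("ledger-read confidence:   %.2f\n", confidence(alice, peers, "ledger-read"))
}
```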

 

Why ForgeRock Autonomous Identity?

Here’s how Autonomous Identity’s unique and highly differentiated capabilities address entitlement creep: 

Global Visibility: By leveraging AI-driven identity analytics, you can collect and analyze identity data (example: accounts, roles, assignments, entitlements, and more) from diverse identity, governance, and infrastructure data sources in order to provide enterprise-wide visibility into all identities and what they have access to, including overprivileged user access. This approach provides your security and risk teams with contextual insights into low-, medium-, and high-risk user access at scale.

Data Agnostic: ForgeRock Autonomous Identity works with all identity data types to develop a complete view of the user access landscape. By consuming and analyzing tens of millions of data points quickly, Autonomous Identity can predict and recommend user access rights and highlight potential risks. Total landscape visibility provides highly accurate models based on what good access should and shouldn’t look like, including excessive permissions. Unlike other “black box” identity analytics solutions that are based on static rules, roles, and peer group analysis, Autonomous Identity relies strictly on organizational data to develop an analysis that is free from bias originating from the human-derived rules and roles in your existing identity governance solution.

Transparent AI: Unlike other “black box” identity analytics solutions, ForgeRock Autonomous Identity allows you to fully comprehend how and why risk confidence scores are determined. By visually presenting low-, medium-, and high-risk confidence scores together, security and risk professionals can contextually understand what key risk indicators were met and, more importantly, why they were met. For example, why are certain employee and contractor accounts orphaned? This AI-driven approach recommends risk-based identity governance remediation updates based on enterprise-wide confidence scores.

Eliminate Entitlement Creep with AI-Driven Identity Analytics

In today’s new reality, organizations have dynamic business challenges. They need a dynamic solution to help them achieve their business goals and grow the business. By applying AI-driven identity analytics, organizations can hyper-automate their existing identity governance solutions and processes, thereby eliminating entitlement creep. By detecting user access patterns, identity analytics can quickly highlight inappropriate user access. In turn, AI-driven identity analytics can automate the removal of high-confidence, low-risk access rights, lowering the risk of entitlement creep across your organization.

To learn more about ForgeRock Autonomous Identity, read the new KuppingerCole white paper “Overcoming Identity Governance Challenges with ForgeRock Autonomous Identity.”

 


Holo

Elemental Chat — The First Holochain P2P App Released for Hosts

Leadership & Org Update #25

When we first began this project, we shared our vision for an Internet of peer-to-peer web applications that would give users the power to direct their own online experiences. We connected with thousands of people also inspired by the possibility of removing centralising powers built into the infrastructure so that humanity might vastly expand its creative and collaborative potential.

The launch of Elemental Chat in our Alpha test certainly doesn’t get all that done. But it is a real step towards realising that promise. Elemental Chat running on HoloPorts has no central database. Each person who is running the app signs their messages to their own chain and then automatically posts them to the shared database that is hosted by the other users of the application. This is the start of something. This isn’t simply the seed of an idea. The deep roots of Holochain and Holo have fully sprouted and are now growing and strengthening.

This rapid recent growth is demonstrated in the timeline of events that led us to the current release.

15-Sept-2020: We shared publicly about Holochain RSM

15-Oct-2020: We shared our Path to Beta roadmap for Holochain RSM

12-Nov-2020: We shared about the upcoming networking features for Holochain RSM in the Dev Pulse

27-Nov-2020: We announced, also in the Dev Pulse, that networking in RSM had landed

24-Dec-2020: We announced Elemental Chat was released to pre-release testers

29-Jan-2021: We shared about rewriting automated test tools for RSM in the Dev Pulse

10-Feb-2021: We announced the release of Elemental Chat to Hosts

The speed at which we have been able to move since the launch of Holochain RSM is indicative of how quickly and easily others will also be able to bring new P2P apps to market. So let’s dig into what Elemental Chat is, what it is not and what it lets us do moving forward.

Elemental Chat is simply a proof of concept application. What we mean by that is that it doesn’t have all the bells and whistles of popular messaging apps on the market today. You can create channels and you can send messages. You can also query the application network to see how many nodes are connected and how many people have the application open in a browser.

What you can’t do is limit which channels you see or follow, or send messages to only one person. You also can’t have reply threads, and in fact the UI doesn’t even showcase all of the best features in Holochain. In this super-basic testing app, you see everything and can differentiate very little, which makes for a less-than-ideal user experience.

None of those wonderful user-friendly features are included with Elemental Chat, and that is intentional. We wanted an uncomplicated app with only a few features to avoid spending too much of our time testing and debugging a chat application, when our focus is really to build the underlying infrastructure that comprises the Holo distributed hosting platform.

A chat app differentiates Holochain from blockchain

Paul explained a few of the reasons why we chose Elemental Chat to release as the first distributed app for Holo quite nicely in the December 24 Dev Pulse:

“First, we thought it would be more fun to test; second, we knew it would push against all of Holochain’s capabilities in ways that would show up any bugs; and third, we wanted something that would show us where the performance bottlenecks were.”

But another important reason for us is that chat functionality illustrates the difference and potential of Holochain as compared to blockchain.

Releasing a chat app — even a toy app like Elemental Chat — demonstrates unequivocally that Holochain is designed for building classes of applications that could never really work on a blockchain. Think about apps where things like scalability, and speedy transactions or interactions are crucial for basic functionality. The vast majority of web apps that we use in our lives simply do not require the global consensus of a blockchain and in fact could be hampered if we try to force them into consensus models. Chat exemplifies this difference to us in the most natural of ways.

Imagine being at a party with 25 to 100 of your closest friends, with people chatting in small groups around a room. Many different conversations are happening simultaneously. You might be able to hear parts of many of them, but you are likely paying attention to one or two more than the others. Every other person in the room is most likely doing the same — that is, they are listening to the conversation they are having, not to the general hum of conversation in the entire space. Because of that, each person has their own unique experience of what is being said. I doubt anyone at that party would say there was something wrong with the way people were communicating. In fact, that very normal, in-person, multi-sensory setting is exactly what many of us have been missing this past year due to the pandemic.

So let’s talk about how a typical blockchain might try to model the communication at that party. One person (Alice) might say something. Another person (Bob) would need to process it and clarify to everyone else what was said and would record it and ensure that each other person understood it the exact same way. The first person (Alice) could not say something again until the first thing she said was processed by every person in the room. Some sort of order would need to be created for who was speaking because each statement would need to get placed in the correct order and shared with every other person in the room for the conversation to be deemed valid.

This sounds ludicrous — because it is. Consensus models are irrelevant for many human needs and when it comes to collaboration they are often extremely inefficient. With blockchains that inefficiency is evidenced in multiple ways but perhaps most explicitly by the extreme consumption of energy and creation of waste used to power the networks.

The reality is many applications only need to do two things really well: enable ease of coordination and reach eventual consistency.

Let’s take a look now at how Holochain’s natural patterns approach would model that same party. As you speak with a small subset of your friends, your messages are saved to your local chain. Your friends that are participating in your conversation would receive an immediate signal and hear your message. After someone’s message is saved to their local chain it is also saved to the group database or DHT. After a short while, your part of the shared database would get updated with messages from other conversations happening elsewhere in the party — and eventually you’d have access to all the gossip messages from the party. Sounds like a great and memorable party!
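
Holochain itself is written in Rust; purely to make the party metaphor concrete, here is a toy Go model (not the Holochain HDK) in which each agent appends messages to their own local chain and peers converge on a shared view through gossip:

```go
package main

import "fmt"

type Entry struct {
	Author, Msg string
	Seq         int // position on the author's own chain
}

type Agent struct {
	Name  string
	Chain []Entry            // the agent's source chain: only they write to it
	Seen  map[string][]Entry // eventually-consistent view of others' entries
}

// Say appends a message to the agent's local chain; in Holochain this
// append would be cryptographically signed by the agent.
func (a *Agent) Say(msg string) Entry {
	e := Entry{Author: a.Name, Msg: msg, Seq: len(a.Chain)}
	a.Chain = append(a.Chain, e)
	return e
}

// Gossip delivers an entry to a peer. Ordering across authors is not
// enforced: agents converge on the same set of entries, not one sequence.
func (a *Agent) Gossip(e Entry) {
	a.Seen[e.Author] = append(a.Seen[e.Author], e)
}

func main() {
	alice := &Agent{Name: "alice", Seen: map[string][]Entry{}}
	bob := &Agent{Name: "bob", Seen: map[string][]Entry{}}
	e := alice.Say("hello from my corner of the party")
	bob.Gossip(e) // bob hears it when gossip reaches him
	fmt.Printf("%+v\n", bob.Seen["alice"])
}
```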

The use of natural patterns to model digital interaction is something that makes Holochain different — it’s why Holochain enables applications that can truly scale and why it’s ideal for collaboration. Holochain ensures agency for each user but does not require imposing a singular perspective.

How does Holochain work for currencies?

So we can see why Holochain is great for modeling a chat app, but many were expecting us to release the app that will run Holo’s cryptocurrency HoloFuel. People often ask “How can HoloFuel work without consensus?”

Imagine a bunch of parents in a community create a babysitting time swap system and have a phone app where I can credit you for three hours of babysitting, and my babysitting balance goes down by three hours when I do. Some people have provided more babysitting than they’ve received and they’d show positive balances of hours. Other folks who have received more babysitting than they’ve provided, would have a negative balance of hours.

If Alice needs to transfer some credits to Bob, do we really need global consensus? Do we need to know the state of every account in the system? No. Only Bob’s and Alice’s accounts are changing from this transaction, and they are the only authorities to make changes to their accounts. They can sign a single identical transaction to each of their chains (a transaction that shows Alice spending credits and Bob receiving them), and when they publish it to the rest of the network, others on the network will validate the transaction to make sure it follows the shared agreements. It is basically that simple. (Well it’s not quite that simple, but for a better understanding of how Holochain works, check out this twelve minute video.)
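
A toy version of that mutual-credit check, again in Go with hypothetical names (real HoloFuel adds signatures, countersigning, and chain validation by peers):

```go
package main

import (
	"errors"
	"fmt"
)

// Tx is the single transaction both parties would sign to their own chains.
type Tx struct {
	Spender, Receiver string
	Hours             float64
}

// Balances holds each member's running babysitting-hours balance.
type Balances map[string]float64

// Apply validates a transaction against the shared agreement (here: a
// credit floor) using only the two accounts involved; no global consensus.
func (b Balances) Apply(t Tx, creditLimit float64) error {
	if b[t.Spender]-t.Hours < -creditLimit {
		return errors.New("spender would exceed their credit limit")
	}
	b[t.Spender] -= t.Hours
	b[t.Receiver] += t.Hours
	return nil
}

func main() {
	b := Balances{"alice": 2, "bob": -1}
	if err := b.Apply(Tx{Spender: "alice", Receiver: "bob", Hours: 3}, 5); err != nil {
		fmt.Println("rejected:", err)
		return
	}
	fmt.Println(b) // map[alice:-1 bob:2]
}
```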

This model for currency is more like peer-to-peer accounting of value, where your chain holds the history of your transactions. Not everyone is transacting in hours of babysitting but the principles of the model work exactly the same way when you replace babysitting with distributed cloud hosting as the service, and it will work that way for food, energy, and many other applications as well.

What’s Next

The release of Elemental Chat for hosts is an important step forward on the journey. For us, though, the next infrastructure milestone is the one we’ll be doing backflips about: when non-hosts — regular web users — will actually get the experience of creating an account and logging into Elemental Chat hosted on a host’s HoloPort. We are already beginning the testing cycle for that milestone.

Roadmap

In the coming weeks, you can count on us to support more and more hosts to connect their ports to the Holo network and to begin chatting in Elemental Chat. We will be assessing how well the performance scales and listening to what folks have to say about it.

We will most likely make a few small changes to the chat app as we prepare for the wider release to web users. We will continue to evolve the testing framework so that it can be used more generally by hosted apps, not just applications we directly install on HPOS.

As we shared in the previous article, we are also moving forward on several of the other milestones that come after Hosted Elemental Chat. We may even be releasing the Host Console work simultaneously, but we are not promising that yet. After that, we’ll be focusing on the HoloFuel application.

Approx Progress on Upcoming Milestones

What isn’t visible from these diagrams, however, is the work on Holochain that is continuing to evolve and which is critical for both Holo and community application developers who are building apps in our ecosystem.

Holochain has been steadily improving the usability of the framework. When we first announced RSM, though it was a vast improvement technically, we hadn’t started releasing to crates, nor stabilized the Holochain development kit (HDK). We have recently shipped a raft of breaking changes to the HDK, which should now be fairly stable. We’ve also ‘nixified’ Holochain to make it easier to install and ensure identical installations among collaborators. You can stay current with the latest tested version via Holochain.love.

In the coming weeks, there will be additional improvements to Holochain. Following the changeover of the database engine, we’ll be implementing sharding and other key network scaling features before we move on to more security and identity management features. All of these are foundations necessary for the Beta release of Holo.

So, I will leave you with yet another shout out to all of those who are supporting this amazing work coming into being. Kudos to the dev teams for bringing this release across the finish line. Kudos to the community-facing teams who are supporting hosts getting connected to the network. Kudos to all the app devs in the community pulling for new features and demonstrating how natural patterns work in Holochain. Kudos to all the admins and volunteers who share their knowledge with newcomers to the project.

In gratitude,
Mary

Elemental Chat — The First Holochain P2P App Released for Hosts was originally published in HOLO on Medium, where people are continuing the conversation by highlighting and responding to this story.


Bloom

How Bloom is Bridging the Gap Between DeFi and Decentralized Identity


Bloom's CTO Isaac Patka recently gave a talk at ETHDenver 2021 in which he discussed the co-evolution of the DeFi and Decentralized Identity spaces; and how Bloom is bridging the gap between them.

Watch the full talk by clicking on this link.  

Decentralized Identity & DeFi are Disconnected

While the markets for decentralized finance and decentralized identity are both expanding rapidly with new and novel products, the two are still largely disconnected.  

DeFi apps are increasingly built to be interoperable with each other, mainly due to their adherence to a single common standard: ERC20. This allows tokens to move between projects to contribute liquidity to Automated Market Makers, be used as collateral for loans, or provide much-needed liquidity to borrowers in return for interest.
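As a quick illustration of why one shared standard matters, the sketch below (using the web3.py library) reads a balance through the minimal ERC20 interface; the identical call works against any conforming token contract, which is what lets tokens flow between AMMs and lending pools. The RPC URL and addresses are placeholders, not real endpoints.

```python
# Reading a balance through the shared ERC20 interface with web3.py.
# The RPC URL and addresses below are placeholders.
from web3 import Web3

ERC20_ABI = [{
    "constant": True,
    "inputs": [{"name": "owner", "type": "address"}],
    "name": "balanceOf",
    "outputs": [{"name": "", "type": "uint256"}],
    "type": "function",
}]

w3 = Web3(Web3.HTTPProvider("https://mainnet.example/rpc"))  # placeholder node
token = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder token
    abi=ERC20_ABI,
)

# The same call works for any ERC20 token -- this uniformity is the
# interoperability the paragraph above describes.
holder = "0x0000000000000000000000000000000000000000"  # placeholder holder
balance = token.functions.balanceOf(holder).call()
print(balance)
```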

Similarly, decentralized identity products and services are quickly maturing into their own ecosystem, with continued collaboration between service provider members of DIF and a quickly developing standard of interoperability through the W3C.  And yet despite fast-maturing markets in each sector, few solutions exist to bridge the communication and integration between these two worlds.

Decentralized Identity is Chain-Adjacent

The beauty of the decentralized identity space is that it is built on blockchain fundamentals.  While the identity record or credential itself is created off-chain, it can rapidly be brought on-chain and converted to a digital identity solution using technologies developed by a number of innovators in the space, including Sidetree and Bloom.  Existing open-source blockchain wallet technologies already support native distribution of decentralized identifiers (DIDs).  Sidetree, for example, is a scalable DID protocol that enables interoperability with IPFS and any decentralized ledger system.

Bridging Decentralized Identity with DeFi

While decentralized identity systems can be said to be chain-adjacent, they need an intermediary to help facilitate the relaying of this chain-adjacent information to on-chain requestors such as DeFi applications.  These intermediaries, or so-called bridges, already exist in many forms and can be leveraged to provide decentralized access to DID systems.  Examples include Oracles, which can connect with data vaults to retrieve information specific to an individual user.  NFTs are another example, where unique attributes can be packaged within them and exchanged for specifically tailored services.  Because of the rapid innovation that has occurred in the DeFi and larger blockchain space, the technology already exists to facilitate a complete ecosystem of interconnected products, starting with digital identity verification and extending to the rendering of decentralized finance products and services.

Decentralized Identity & DeFi are Complementary

DeFi Has a Trust Problem

Popular DeFi lending apps like Compound and Aave have a trust problem.  Because they collect no information on their participants, they are unable to make informed decisions on how much to lend, what interest rate to charge, and how long a loan should be extended for.  Instead, they must treat all participants as carrying the same level of risk, and thus require significant sums of collateral in exchange for loans.  This means that borrowers must already possess more than the value of the cryptocurrency they are asking to borrow, making DeFi lending impractical for participants who own little to no cryptocurrency.

To compound this issue, regulations in the cryptocurrency space are quickly expanding and evolving.  According to the FATF “Travel Rule,” Virtual Asset Service Providers (VASPs) are required to verify the identities of individuals using their products in an effort to curb the financing of terrorism.  DeFi platforms are unlikely to successfully skirt these regulations based on the technicality that they are “decentralized.”  While most operate in a decentralized manner, they are still subject to regulatory enforcement.  For example, Compound has equity investors and therefore must have a base of operations and an entity to which these original equity contracts were signed.  This means it can be subject to the same regulatory requirements that are required by traditional financial institutions.  Thus, DeFi platforms like Compound are very likely to be required to adhere to Travel Rule regulations, necessitating eventual screening of participants.

Decentralized Identity has an Adjacency Problem

Even with the strides made within the decentralized identity ecosystem, including new use cases in a variety of industries, widespread use is still limited.  The decentralized identity space is poised to deliver a myriad of novel solutions to the world of distributed ledger applications, but its dependency on decentralized apps that are still finding traction means it may take some time to find adoption.

However, the one blockchain sector that has proven scale and usage is DeFi.  It has an immediate need and demand for decentralized identity solutions and is perfectly suited to be paired with such platforms.  Its immediate need, from both a consumer-demand and a regulatory-compliance standpoint, makes it an ideal first-mover fit with current DID solutions.  Bloom is just one of the many decentralized identity management platforms that offer highly performant solutions to bridge the gap between DeFi and DID systems.  Other examples exist, which we will discuss.

How Decentralized Identity is Being Used

Health Data Passports

Currently, the International Air Transport Association (IATA) is exploring methods by which travelers can move across borders freely without concern for further COVID spread and transmission.  IATA has proposed the use of a Travel Pass, which will contain testing information on travelers in a verified and secure way.  Travel Pass is based on IATA’s OneID, an initiative endorsed by resolution at its 75th Annual General Meeting to help securely facilitate travel through the use of a single identity token.  It is one of the first of its kind to retain passenger identity information on a device, specifically managed and permissioned by the traveler, ensuring privacy through a decentralized identity management solution.

Employment Information

Truu is an example of a platform that facilitates the simplified management of employment information through the use of a decentralized identity management system.  It collaborates with regulatory bodies, central National Health Service (UK) organizations and trusts to provide a digital passport for healthcare professionals.  It builds a comprehensive record that includes identity information, qualifications, workplace-based assessments, medical licensing, and employment information, all as separately verified credentials.  Truu enables employers to make fast decisions on employment without the inefficiencies of tracking down records and other documents or managing communications between a variety of different data sources.  This is especially helpful in circumstances where rapid medical deployment is necessary, such as in pandemic situations where medical professionals are being moved and employed between new job sites on any given day of the week.  Credentials are held digitally within a professional’s mobile device, ensuring privacy and enabling specific dissemination of information on an as-needed basis.  Truu runs on Evernym’s distributed ledger technology and is a founding member of Sovrin, a nonprofit organization enabling self-sovereign identity on the internet.

Credit, Income, KYC Data

Bloom continues to set the benchmark for how digital identity can be leveraged by decentralized financial institutions.  Bloom works by securing an individual’s digital identity information through receipt of verified credentials from an issuing body.  From there, it allows an individual to interact with various DeFi platforms that request and require identity information on their users to provide specifically tailored services, such as credit facilities, or to simply pass KYC checks as required by law.  Bloom’s DID platform provides several methods by which third-party applications can access DID data and isn’t limited specifically to DeFi use cases.  It can function wherever DID is needed, and the technology is expandable to support other use cases where personalized information needs to be made shareable but private at the same time.
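For orientation, here is what a minimal verified credential of this kind can look like, expressed as a Python dict following the W3C Verifiable Credentials Data Model. The credential type, DIDs, dates, and proof value are placeholder examples, not Bloom's actual schema; real credentials are assembled and signed by issuer software, not by hand.

```python
# A minimal W3C-style verifiable credential, with placeholder values.
kyc_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "KYCCheckCredential"],  # example type
    "issuer": "did:example:issuer123",            # placeholder issuer DID
    "issuanceDate": "2021-02-17T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder456",            # placeholder holder DID
        "kycCheckPassed": True,
    },
    "proof": {
        "type": "Ed25519Signature2018",
        "jws": "<issuer-signature-goes-here>",    # placeholder signature
    },
}
```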

Building Solutions

Digital identity management systems enable DeFi platforms to perform KYC. However, in the spirit of DeFi, just as the cryptocurrencies used to interact with these platforms are held non-custodially, the digital identity data can also be processed in a non-custodial manner.  In fact, according to information provided by the original author of KYC legislation, financial institutions need not store a copy of the information used to perform KYC, only evidence that such KYC checks have occurred.  This means DeFi platforms can maintain the highest levels of security and privacy by reducing the risk of “honey pot” hacks, while greatly improving user experience.
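A toy sketch of that "keep evidence, not data" idea follows, with assumed record fields and helper names (this illustrates the principle only and is not a compliance recipe): after a check passes, the platform retains a salted hash of the checked attributes plus an attestation record, and discards the raw personal data.

```python
# Retain evidence that a KYC check happened, not the data itself.
import hashlib
import json
import os
import time

def kyc_evidence(user_id: str, verified_attributes: dict) -> dict:
    """Build an attestation record; the raw attributes are not stored."""
    salt = os.urandom(16)
    digest = hashlib.sha256(
        salt + json.dumps(verified_attributes, sort_keys=True).encode()
    ).hexdigest()
    return {
        "user": user_id,
        "checked_at": int(time.time()),
        "attributes_hash": digest,  # proves what was checked...
        "salt": salt.hex(),         # ...without retaining the data
        "result": "passed",
    }

record = kyc_evidence("user-42", {"name": "Alice", "dob": "1990-01-01"})
# Persist `record`; discard the raw attributes. There is no PII honey pot.
print(record["attributes_hash"])
```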

DID systems also provide a gateway to unsecured lending.  They help transition existing CeFi users on platforms like BlockFi and Celsius over to non-custodial DeFi platforms, enabling the benefits of CeFi without its risks and drawbacks.

The industry is ripe for exploration with new use cases and innovations being rolled out regularly.  If there is something you want to build at the intersection of decentralized identity and crypto we’d love to hear from you.

Contact: isaac@bloom.co

Telegram: @IsaacPatka

Github: https://github.com/ipatka

DIF Banking & Finance SIG: https://lists.identity.foundation/g/finance-sig



Finicity

Finicity awarded ICE Mortgage Technology Lenders’ Choice for Innovative Service Provider Award


ICE Mortgage Technology, the leading cloud-based loan origination platform provider for the mortgage industry, announced today the winners of its 2021 ICE Mortgage Technology Innovation Awards.

The ICE Mortgage Technology Innovation Awards recognize the most creative mortgage lending companies who are pushing the envelope by creating extraordinary, customized solutions with ICE Mortgage Technology to achieve their business goals with exceptional results.

The 2021 ICE Mortgage Technology Innovation Award winner for Lenders’ Choice for Innovative Service Provider is Finicity (with Waterstone Mortgage).

“With a year of unexpected challenges, these industry-leading and resilient companies customized our ICE Mortgage Technology solutions to utilize automation technologies and data-driven insights to excel during a demanding year,” said Joe Tyrrell, president, ICE Mortgage Technology. “We’re proud to recognize these exceptional winners who showed agility, flexibility and persistence as our industry continued to pivot throughout 2020.”

You can read the full press release here or more about the winners here.

You can also read about how Waterstone Mortgage and Finicity worked together to achieve a 10-15% monthly boost in digital verifications. Providing asset data earlier in the origination process saved loan originators time and simplified the entire borrower experience.

The post Finicity awarded ICE Mortgage Technology Lenders’ Choice for Innovative Service Provider Award appeared first on Finicity.


Infocert

IMPULSE, Artificial Intelligence and Blockchain to facilitate online identification in public services


IMPULSE (Identity Management in PUbLic SErvices) is a European project led by Gradiant – a Research and Technology Organization (RTO) focused on connectivity, intelligence, and security technologies – and involving 16 entities from 9 different countries, including InfoCert.

The alliance will work on the concept of Electronic Identity (eID) – the way users can identify themselves (and be identified) through the network – and its implications in multiple contexts – through the use of Artificial Intelligence and Blockchain technologies.

***

The current health crisis around the world has limited personal interactions, digitising all kinds of procedures. This brings a number of problems, not only for those unfamiliar with the complex identification systems on the internet, but also for those who are not comfortable providing their personal data over the internet.

Furthermore, in a world where it is increasingly common to carry out all kinds of processes online, avoiding fraud and deception is only possible with first-class technology.

InfoCert and the other 15 partners involved in the project aim to create a tool that combines two of the most promising technologies available today, Artificial Intelligence and Blockchain networks, with the goal of improving the management of digital identity and electronic identification in the public sector.

The application of these technologies in the field of digital identity will substantially improve existing electronic identification systems, while raising legal, privacy, and social questions that require further analysis.

IMPULSE will incorporate advanced face biometrics and document validation techniques based on AI to facilitate identification processes and provide the user with a digital onboarding experience that is fully transparent.

In addition, blockchain technology and the use of smart contracts will add trustworthiness to the process, providing mechanisms for users to demonstrate their identity without having to disclose their personal data to third parties that cannot be presumed trustworthy.

The consortium, made up of partners from Spain, Italy, Austria, Bulgaria, Germany, Finland, Iceland, France and Denmark, is funded by the European Union with €4 million under the Horizon 2020 programme.

Find out more about our R&D activities.

The post IMPULSE, Artificial Intelligence and Blockchain to facilitate online identification in public services appeared first on InfoCert.digital.


Coinfirm

Stellar supported on Coinfirm’s AML Platform

Starting from the 16th of February, Coinfirm has introduced the support of a new protocol in the AML Platform: Stellar Lumens (XLM)! What is Stellar? “Stellar is an open-source network for currencies and payments. Stellar makes it possible to create, send and trade digital representations of all forms of money (…) Like Bitcoin and Ethereum,...

KuppingerCole

Buyer’s Compass: SOAR


by John Tolbert

Security Orchestration Automation & Response (SOAR) refers to comprehensive solutions with capabilities that support a range of security tools and data sources. This KuppingerCole Buyer’s Compass will provide you with questions to ask vendors, criteria to select your vendor, and requirements for successful deployments. This document will help prepare your organization to conduct RFIs and RFPs for SOAR solutions.


51 Nodes

Kusama & Polkadot: Build an application specific Blockchain and launch a Parachain!

Source: https://polkadot.network/
Note: This article provides a general and slightly technical introduction to Parachains and shows how to get started building a Substrate-based blockchain that can be launched as a Parachain in the Kusama & Polkadot ecosystem. It will explain some parts of Polkadot, but it won’t go too deep into specific details like staking, slashing, or other important mechanics in the ecosystem. Check out the official wiki to learn more about Polkadot.
Status quo

First of all, it is important to mention that the Parachain functionality is not live yet. This functionality is the last missing piece of the Polkadot whitepaper, and a public rollout plan provides an overview of the progress. Since August 2020, the Parachain functionality has been tested in the official Parachain Testnet Rococo, which in V0 ran only prototype code. In December 2020, Rococo V1 was launched, and recently Plasm Network announced it was the first project to join the Rococo V1 Testnet, which surely is a big milestone, as the codebase of Rococo V1 will also be used in Kusama and Polkadot. Meanwhile, other possible future Parachains like KILT have followed and announced their own roadmaps.

Kusama vs. Polkadot

Although Kusama can be seen as the pre-production environment for Polkadot, you should be aware that Kusama might, and probably will, evolve independently of Polkadot. If your project is at an early stage, or you want to experiment with Parachains in a real-world environment before moving into production, you will probably prefer Kusama over Polkadot at first. You might also be interested in testing specific features on Kusama that aren’t yet available on Polkadot.

Source: https://polkadot.network/

Relay Chain & Parachains

The Relay Chain is the central chain of Polkadot. All validators of Polkadot are staked on the Relay Chain in DOT and validate for the Relay Chain. It has deliberately minimal functionality and its main responsibility is to coordinate the system as a whole, including Parachains. Other specific work is delegated to the Parachains that serve different purposes. All Parachains that are connected to the Relay Chain share the same security. Polkadot also has a shared state between the Relay Chain and all connected Parachains to ensure the validity of the entire system. Smart Contracts are not supported on the Relay Chain but can be supported by Parachains.

It’s expected that each Parachain will be as light and application specific as possible and serve a specific use case.
Source: https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06

The Path of a Parachain Block describes in detail how a new Parachain block is produced.

You may notice the term Parathread in the picture above. The current (optimistic) assumption is that the Relay Chain supports up to 100 Parachains, and some of these Parachain slots will be permanently reserved to serve the network on the system level (e.g. Governance). Thus there probably won’t be enough slots available for all projects, which is why slots need to be obtained in so-called auctions (more on that below). Whereas Parachains get guaranteed high throughput by bonding DOT tokens, Parathreads follow a pay-as-you-go model in which a fee in DOT is paid to validators for creating blocks. In some cases this even makes more sense, namely when few or infrequent state updates are expected (e.g. DNS, Oracles).

Source: https://medium.com/polkadot-network/parathreads-pay-as-you-go-parachains-7440d23dde06

Parachains can become Parathreads and vice versa as they share the same technical base.

Parachain Slot Auction & Parachain Crowdloans

In order to become a Parachain in Kusama or Polkadot, a slot must be obtained in a Parachain Slot Auction. Each slot duration is capped at 2 years and divided into 6-month lease periods. Parachains may lease more than one slot over time in order to extend their lease to Polkadot past the 2-year slot duration. Depending on the auction, it is possible that a slot is owned by 4 different Parachains during the 2-year slot duration.

Parachains don’t need to always inhabit the same slot. As long as a Parachain inhabits any slot it can continue as Parachain.

Parachain candidates can place bids in unpermissioned candle auctions that rely on verifiable random functions (VRFs) where the original mechanism has been slightly modified to be secure on a blockchain. The auction can be divided into two phases:

- Opening Phase
- Closing Phase

No matter what phase the auction is in, the bids are public. The winning moment is determined retroactively: any block from the first to the last block of the Closing Phase may turn out to be the one that determines the winner, chosen at random, and nobody knows which block it is until the Closing Phase is finished.
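A toy model of this candle-auction mechanic in Python may help; the block heights, bids, and use of the `random` module are purely illustrative (Polkadot derives the retroactive ending point from a verifiable random function, not a local RNG):

```python
# Toy candle auction: the ending block is revealed only after the fact,
# so a late high bid may not count. All values are illustrative.
import random

OPENING_END, CLOSING_END = 100, 200  # assumed block heights of the phases

bids = [  # (block_number, bidder, amount) -- public as soon as placed
    (40, "ProjectA", 900),
    (120, "ProjectB", 1100),
    (190, "ProjectC", 1500),  # a late snipe that may be ignored
]

# Retroactively pick the block that "ended" the auction within the
# Closing Phase (a stand-in for Polkadot's VRF-based choice).
ending_block = random.randint(OPENING_END + 1, CLOSING_END)

counted = [b for b in bids if b[0] <= ending_block]
winner = max(counted, key=lambda b: b[2])
print(f"Auction ended retroactively at block {ending_block}; "
      f"{winner[1]} wins with a bid of {winner[2]}")
```

Because nobody knows the ending block in advance, bidding high early is the rational strategy, which is exactly the sniping resistance the candle format is meant to provide.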

Projects will also be able to launch a Crowdloan Campaign with specific parameters, which allows them to borrow DOTs from the community to obtain a Parachain slot without having to bond all the required DOTs on their own.

In the accompanying video, core developer Shawn Tabrizi explains both variants very well.

Substrate

After giving an overview about the Polkadot ecosystem it’s time to introduce Substrate. It is a modular framework that enables you to create purpose-built blockchains by composing custom and/or pre-built components.

Substrate is the foundation of all blockchains in the Polkadot ecosystem and requires knowledge of Rust.

In the Substrate Developer Hub (link below) you can learn everything that is required to get started developing your own Substrate-based chain.

Official Substrate Documentation for Blockchain Developers · Substrate Developer Hub

A few important things to know about Substrate:

- The runtime of Substrate is referred to as the “state transition function”, which contains the business logic that defines the behavior of the blockchain.
- As a developer, you can define storage items that represent the state of the blockchain and functions that allow users to make changes to the state.
- Substrate ships with FRAME (Framework for Runtime Aggregation of Modularized Entities), a set of modules (called Pallets) and support libraries that simplify runtime development. Pallets are individual modules within FRAME that host domain-specific logic.
- To enable forkless runtime upgrade capabilities, Substrate uses runtimes that are built as WebAssembly (WASM) bytecode.
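To make the ideas in the list above concrete, here is a toy Python model of a runtime as a state transition function over declared storage items, with dispatchable calls playing the role of pallet functions. Real Substrate runtimes are written in Rust, composed from FRAME pallets, and compiled to WASM; this sketch only mirrors the concept.

```python
# Toy "state transition function": storage items plus dispatchable calls.
# This mirrors the concept only -- it is not Substrate or FRAME code.
storage = {"balances": {}}  # a declared storage item

def deposit(state, who, amount):
    """A dispatchable call: users change state only through these."""
    state["balances"][who] = state["balances"].get(who, 0) + amount
    return state

def transfer(state, src, dst, amount):
    """Another dispatchable call, enforcing the runtime's business logic."""
    if state["balances"].get(src, 0) < amount:
        raise ValueError("insufficient balance")
    state["balances"][src] -= amount
    state["balances"][dst] = state["balances"].get(dst, 0) + amount
    return state

# "Executing a block" = folding its calls through the transition function.
block = [(deposit, ("alice", 50)), (transfer, ("alice", "bob", 20))]
for call, args in block:
    storage = call(storage, *args)

print(storage)  # {'balances': {'alice': 30, 'bob': 20}}
```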

To learn more about how to perform a forkless runtime upgrade you can take a look at the official tutorial.

Pallets vs. Smart Contracts

For people heading over from other chains and ecosystems like Ethereum, the natural assumption is that application-specific logic is always written in smart contracts, which in turn are deployed on the blockchain. But as you have already learned, the idea of the Polkadot ecosystem is to have different Parachains that are as light and as application-specific as possible.

There are basically 3 possibilities to add custom logic to a Substrate-based blockchain:

1. Write custom Pallets and include them in the runtime
   - https://substrate.dev/docs/en/tutorials/create-a-pallet/
   - https://substrate.dev/recipes/pallets-intro.html
2. Include the Contracts Pallet in the runtime and write contracts in ink! (Rust-based eDSL)
   - https://paritytech.github.io/ink-docs
   - https://substrate.dev/substrate-contracts-workshop
3. Include Frontier (Pallets that serve as an Ethereum compatibility layer) in the runtime and write contracts in Solidity
   - https://github.com/paritytech/frontier
   - https://substrate.dev/frontier-workshop
Custom Pallets are the best choice if you are building a greenfield project that serves a specific purpose and needs to run its own network.

Substrate already provides a lot of Pallets which you can combine together as needed, including your custom Pallet(s) in order to build a lightweight and unique runtime for your application specific blockchain:

- Default Pallets of Substrate (Note: The link points to the v3.0.0 version of Substrate, which was released a few days ago. Most tutorials are still based on v2.0.0)
- NFT Pallet (There is also a WIP reference implementation that showcases CryptoKitties on Substrate)

I also discovered a Substrate Marketplace where you are able to find different Pallets (at least the default ones) which you can include into your runtime.

If you aim to write contracts in ink! or port a Solidity based application from Ethereum over to a Substrate-based Parachain you don’t necessarily need to launch your own Parachain as there already exist projects that plan to launch a Parachain and include the required Pallets in their runtime:

- Edgeware (ink!-based contracts)
- Moonbeam (Solidity contracts)
Note: To check out how Moonbeam works you can clone the following repository: https://github.com/51nodes/moonbeam-playground
Polkadot compatibility with Cumulus

When you have written (and hopefully tested ;-)) your custom Pallets, you need to make sure that you build a Polkadot-compatible runtime that exposes an interface for validating its state transition and provides interfaces to send messages to and receive messages from other Parachains.

Cumulus is an extension to Substrate that makes it easy to transform any Substrate-built runtime into a Polkadot-compatible Parachain. It is still in development, but the idea is that it should be simple to take a Substrate chain and add the Parachain code by importing the crates and adding a single line of code. You can also dive deeper into that topic and learn more about the mechanics of Polkadot by reading the Parachain Implementers Guide.

If you have built your own Parachain and want to join the official Rococo Testnet you have to follow this guide. The auction mechanism mentioned above is not active yet and Rococo is currently controlled by the development leads at Parity Technologies. For joining the network you have to run at least 1 Collator for your Parachain and 1 Validator node for Rococo.

Note: Learn how to setup a local Polkadot network based on Rococo by following the Readme of our repository: https://github.com/51nodes/polkadot-local-rococo
Final Thoughts

Substrate has a clean design and makes it very easy to build an application specific blockchain for everyone familiar with Rust. If you decide to run your own Parachain (or Parathread) you should also think about how to incentivise other players to run a Collator node in your network — for example by introducing your own network token.

Although Parachains are not live on Kusama and Polkadot yet we can already see many projects that aim to become a Parachain as soon as possible. It will be interesting to watch the battle of those projects to secure their exclusive Parachain slot on Kusama and Polkadot. Personally I am very interested to see how the auction mechanism works and which projects will secure their Parachain slot via a Crowdloan Campaign.

I am also looking forward to playing around with the interoperability features that Polkadot provides once the implementation is mature enough.

51nodes GmbH based in Stuttgart is a provider of Crypto Economy solutions.

51nodes is a member of the Substrate Delivery Partners program and supports companies and other organizations in realizing their blockchain projects. 51nodes offers technical consulting and implementation with a focus on smart contracts, decentralized apps (dApps), integration of blockchain with industry applications, and tokenization of assets.

Kusama & Polkadot: Build an application specific Blockchain and launch a Parachain! was originally published in 51nodes on Medium, where people are continuing the conversation by highlighting and responding to this story.


Infocert (IT)

Webinar “SPID e CIE: dinamiche di evoluzione e opportunità da cogliere”


On Tuesday 23 February, Igor Marcolongo and Fabrizia Banti – respectively Head of Business Compliance and Senior Consultant at InfoCert – will be guests and speakers at a webinar organized by A.P.S.P. (Associazione Italiana Prestatori Servizi di Pagamento).

***

2020 marked the take-off of SPID (the Public Digital Identity System), with over 16 million digital identities issued. These numbers, in addition to the almost 19 million CIE (Electronic Identity Card) requests from Italian citizens, open up new and extremely interesting scenarios for the use of these tools.

During the webinar “SPID e CIE: dinamiche di evoluzione e opportunità da cogliere” (SPID and CIE: evolution dynamics and opportunities to seize), the national evolution of digital identity will be presented, highlighting its potential and some use cases for generating value by seizing the opportunities of this evolution.

Agenda

Webinar start:

Tuesday 23 February 2021, 10:00 – 11:00

Topics:

The explosion of SPID, the CIE, and the opportunities to seize. New roles: service providers and aggregators. SPID and SCIPAFI: what relationships? Use cases for Digital Identity.

Introduction:

Maurizio Pimpinella – President of A.P.S.P.

Speakers:

Igor Marcolongo – Head of Business Compliance, InfoCert
Fabrizia Banti – Senior Consultant, InfoCert
Cristina Iacob – Commercial Strategy Director, Experian

How to join the webinar:

To follow the webinar on the Google Meet platform, follow this link at the scheduled time:

meet.google.com/nbb-bbqr-anp

The post Webinar “SPID e CIE: dinamiche di evoluzione e opportunità da cogliere” appeared first on InfoCert.digital.


SelfKey

Something Huge is Coming your Way 📢🚀


SelfKey Weekly Newsletter

Date – 17th February, 2021

Stay tuned! An announcement is coming your way on 19th February, 2021.

The post Something Huge is Coming your Way 📢🚀 appeared first on SelfKey.


Metadium

Metadium blockchain will be used by 108 companies through MYKEEPiN Alliance


Dear community,

We are happy to share good news with you: MYKEEPiN Alliance has reached 108 member companies. This is relevant to the Metadium blockchain because MYKEEPiN is a personal information management service based on the Metadium Blockchain.

MYKEEPiN Alliance is a DID-based service partnership program launched in April 2020 with the purpose of business model development and commercialization. The Alliance is designed to help corporations, institutions and other organizations to achieve a business model that implements DID.

What differentiates MYKEEPiN Alliance from other DID alliances in Korea is that it has provided real-life services that are already up and running, demonstrating Metadium Blockchain and MYKEEPiN’s stability and innovation.

Through cooperation between partners, MYKEEPiN Alliance is providing authentication services using the Metadium Blockchain in areas like:

- Entrance and authentication services for unstaffed convenience stores
- Authentication and access for events like the Second Korean Parliament and Administration Fair and the Gyeonggi Future Show 2020
- Authentication services that combine blockchain and AI-based facial recognition technologies in the education sector, with a learning management system operated by Magic Eco (an AI education specialist) and the VR online education platform operated by Class V.

We thank you for your continuous support.

Best,

Metadium Team

Metadium blockchain will be used by 108 companies through MYKEEPiN Alliance was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 17. February 2021

KuppingerCole

John Tolbert: Zero Trust for Reducing the Risks of Security Incidents




Rebecca Nielsen: What is Strong Authentication in a Zero Trust Environment




Matthias Reinwarth: Zero Trust for Reducing the Risks of Security Incidents




Expert Chat: Interview with Stefan Würtemberger




Bryan Meister: Navigating Enterprise Enablement and Zero Trust




Panel - Zeroing in on Zero Trust: A Paradigm Shift in Cybersecurity




Scott Rose: Zero Trust 101




Paul Simmonds: Alignment of Zero Trust with Business Strategy




Dimitri Lubenski, Dr. Jan Herrmann: The Role of IAM within Zero Trust Architectures at Siemens




Roger Halbheer: Zero Trust - Security Through a Clearer Lens

Join us to understand how Zero Trust transforms your security strategy and makes you more resilient to a range of attacks. We will share a roadmap for leaders, architects, and practitioners, as well as talk about some quick wins and incremental progress on this journey.





Filipi Pires: Trust or Not Trust? Is there new mindset about CyberSecurity using Zero Trust?




Henk Marsman: From Trust to Zero - Lessons from Halfway in a Large Enterprise Environment




Panel - Zero Trust in the Enterprise




Eleni Richter: Zero Trust Use Cases




Finicity

The Paypers: Data decency and consumer trust – the bedrock of Open Banking


Writing in The Paypers Voice of the Industry, Matthew Driver, Executive Vice President, Services, Asia Pacific, Mastercard, discusses Open Banking and the importance of data decency, ethical data usage, and how consumer-centric data sharing has the potential to unlock new opportunities within financial services.

Read the full article here.

The post The Paypers: Data decency and consumer trust – the bedrock of Open Banking appeared first on Finicity.


Indicio

Be a part of the most dynamic network community in decentralized identity

Join the growing list of forward-thinking companies and organizations across the globe who are actively building the future of digital identity. This is your chance to be a part of the newest and most dynamic network in decentralized identity technology, open to innovative developers and companies eager to bring their solutions to market.

At Indicio, we’ve seen a rapid rise in demand for robust, stable, and professionally maintained networks to support decentralized identity solutions. It’s not a surprise: decentralized identity’s moment has arrived. That’s why we’ve been hard at work creating Hyperledger Indy networks upon which developers all over the world are building, testing, and launching their solutions. Powering these networks are Node Operators — companies and teams from around the world and from various industries who are designing and launching decentralized identity solutions.

What is a Node Operator?
At the heart of a decentralized identity ecosystem lies the distributed ledger — a distributed database made up of multiple copies of a ledger, hosted by various nodes. In practice at Indicio.tech, this means companies and organizations, together as a community, volunteer to run a copy of the ledger on a server that is under their authority. On the Indicio Network, we call these “Node Operators.” Together, these copies make up a verifiable data registry, against which credential issuers and verifiers can prove important information.

Set your solutions up for success by becoming a Node Operator

Be where the action is happening
We’re creating a community of doers, made up of companies worldwide who are creating digital identity solutions for use cases of all kinds, including banking, education, supply chain, travel, and humanitarian efforts. As a node operator, you’ll be on the frontline of the innovation, playing a leading role in this world-changing digital transformation.

Get access to resources
Node Operators are eligible to receive a complimentary business support package for their first year in the program, including architectural guidance, best practice checks, an account-dedicated Slack channel, and a dedicated network engineer monitoring your environment and assisting you with your needs. We also help our node operators prepare their presentations and marketing materials for webinars and informational events.

Learn by doing
There’s no better way to get trained on how a decentralized identity ecosystem works than to play a critical role in the ecosystem itself. Supporting one of the nodes on the network gets your team a front-row view of how a network functions from the inside. We’ve seen firsthand how operating a node speeds up a company’s ability to develop and deploy their own solutions.

Take part in community events
Indicio hosts community events, such as monthly Node Operator sync-ups and spotlights, giving our Node Operators a platform to showcase, demonstrate, and discuss their solutions. We help keep our node operators up-to-speed by discussing new open source tools, improvements, network updates, and standards progress, as well as help them identify business opportunities.

Make identity simpler
The decentralized identity world can be daunting for newcomers and veterans alike. There are myriad working groups, governance bodies, standards organizations, and cross-industry initiatives. While these all play a vital role in the development and adoption of the technology, they can often lead to “information overload” and distract your team from developing a refined, commercial-ready product. We’re here to help our Node Operators make sense of the tools and information available to them in the community, saving them valuable time, money, and resources. We don’t just talk the talk. We understand business demands and work closely with Node Operators to get to market fast.

Concerned running a node might be too challenging?

Our “Node Operator as a Service” option can take care of your network needs, leaving you free to focus on building your identity solution and participating in the Node Operator community. Indicio can host your node on a service of your choice, maintaining it with business-critical updates.

Apply today and join a community of builders leading the way in digital identity innovation.  

The post Be a part of the most dynamic network community in decentralized identity appeared first on Indicio Tech.


Fission

Announcing TiddlyWiki on Fission


Fission is working with Jeremy Ruston, creator of TiddlyWiki, to build a TiddlyWiki on Fission app and make the Webnative SDK available as a plugin to save and load TW content across browsers.

We’ve been working out in the open in the TiddlyWiki group on the Fission forum for a couple of weeks, so now it’s time to welcome more people to join us.

The goal is to make it so that TiddlyWiki can easily run as an app on the Fission publishing platform. Individual users will be able to sign up and launch a new TiddlyWiki or upload their own. Then, they will have everything saved and portable automatically from any browser, including mobile, that's logged into their Fission account.

TiddlyWiki on Fission Flag Cat. Stay tuned for more images and stickers!

Fission’s webnative javascript libraries and app publishing platform are open source and built on open standards. We’re designing the system to be portable and easy to personalize for anyone, not just professional developers, so that they can launch a business around apps, themes, or other creative software experiments.

We want to work with TiddlyWiki creators who are making custom Editions. The TiddlyWiki launcher that we’re working on with Jeremy can include any community Editions that people want to include, and we’ll be opening up the Github repo shortly for that.

And if you’re a creator that wants to build a business around a TiddlyWiki Edition on Fission, with your own supported app that 100s or 1000s of users can launch with a click and have their own custom domain, we’d love to talk. Our upcoming App Cloning feature is designed for this, and we’ll showcase this ability in the community-supported TiddlyWiki on Fission app so that you can see how it works.

We’re totally inspired by the TiddlyWiki community, and want to see what else we can build together. We're on the TiddlyWiki Google Group, and please come by and chat with us in the forum or Fission's Discord chat if you have questions. If you're a developer, you can follow Fission's guide to set up your developer account and get started.

Please join us at tomorrow’s Fission February 2021 Demo Day if you’d like to find out more and talk live with Jeremy and the Fission team.

Register for Fission Feb 18th Demo Day

You can also register to attend directly on Luma.


Global ID

The GiD Report#147 — Clubhouse is the end of a chapter, not the beginning of a new one

The GiD Report#147 — Clubhouse is the end of a chapter, not the beginning of a new one

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

- The Clubhouse story
- One Clubhouse challenge
- Telegram’s monetization story (or lack thereof)
- States take on tech reg mantle
- Big Tech gets bigger
- It’s not about your identity, it’s what you can do with it (live performance edition)
- Stuff happens

1. Clubhouse is the end of a chapter, not the beginning of a new one.

Photo: William Krause

Clubhouse is the belle of the ball right now. This is a pandemic story. This is a content creation story. And sure, COVID was a clear catalyst, but it was probably only a matter of time before audio joined text and video in reaching this logical destination.

Facebook, Twitter, Mark Cuban, and I’m sure countless others have already gotten the memo.

As Ben Thompson notes, Clubhouse was “inevitable”:

Clubhouse, meanwhile, Silicon Valley’s hottest consumer startup, feels like the opposite case: in retrospect its emergence feels like it was inevitable — if anything, the question is what took so long for audio to follow the same path as text, images, and video.

Also:

The most obvious difference between Clubhouse and podcasts is how much dramatically easier it is to both create a conversation and to listen to one. This step change is very much inline with the shift from blogging to Twitter, from website publishing to Instagram, or from YouTube to TikTok.
Secondly, like those successful networks, Clubhouse centralizes creation and consumption into a tight feedback loop. In fact, conversation consumers can, by raising their hand and being recognized by the moderator, become creators in a matter of seconds.

In other words, it’s not that Clubhouse ushers in a new chapter of anything — well, aside from the democratization, which itself is arguably a game changer. Rather, it represents the natural conclusion of the last cycle, in part due to audio’s stunted evolution during the period, now supercharged in a locked down world.

One telling fact:

This is a well-worn story, but one story it isn’t? It’s not yet a monetization story. It’s unclear how you seamlessly insert algorithmic ads into real-life conversations.

According to Ben, tipping is a potential path forward that’s been discussed, a path Twitter is looking to also take.

And so the rapid ascent Clubhouse has enjoyed is easy to understand, but it will run into challenges similar to — and perhaps greater than — those faced by all the other analogs from this past cycle.

All of which speaks to the need of Trusted Engagement.

That’s the story here, and like the pitter patter on Clubhouse, that story is being told in real-time.

Stay tuned.

Relevant:

- Via /j — Twitter CEO Floats User Tipping, Exclusive Content Features
- Via /jvs — Mark Cuban is co-founding a podcast app where hosts can talk to fans live and monetize their conversations
- Andreessen Horowitz Wins Deal for Creator Economy Startup Stir at $100 Million Valuation
- After Trump, the attention economy deflates
- Inside WeChat’s Struggle to Slow Down TikTok Owner ByteDance
- Clubhouse’s Inevitability
- Via /jvs — Fan Subscriptions | Facebook for Creators
- Via kiwipete — Brave Browser, The Passive Income King?

2. A reminder of Clubhouse’s old chapter challenges:

Stanford:

The Stanford Internet Observatory has confirmed that Agora, a Shanghai-based provider of real-time engagement software, supplies back-end infrastructure to the Clubhouse App. This relationship had previously been widely suspected but not publicly confirmed. Further, SIO has determined that a user’s unique Clubhouse ID number and chatroom ID are transmitted in plaintext, and Agora would likely have access to users’ raw audio, potentially providing access to the Chinese government. In at least one instance, SIO observed room metadata being relayed to servers we believe to be hosted in the PRC, and audio to servers managed by Chinese entities and distributed around the world via Anycast. It is also likely possible to connect Clubhouse IDs with user profiles.
3. In search of a monetization story, Telegram edition:

Via /andrey:

4. This week in Big Tech:

States are now leapfrogging feds when it comes to tech regulation. Axios:

A new bill out of North Dakota would force Apple to let developers go around the App Store to deliver iPhone apps directly to consumers. It would also block Apple from making developers use its payment system for in-app purchases and subscriptions.
In Virginia, a digital privacy bill that’s supported by some tech trade groups and companies like IBM is set to pass and be signed into law. It would make Virginia the second state to pass a major data privacy bill, after California’s 2018 law.
5. This week in monopolies:

Chart of the week:

Relevant:

- Seen on Twitter: Economics: Rising Antitrust Creates More Bureaucracy, Threatens Productivity And Innovation & Slows Global Economic Growth
- China issues new anti-monopoly rules targeting its tech giants

6. It’s not about your identity, it’s about what you can do with it:

7. Stuff happens:

- Via /pstav — Mastercard Will Let Merchants Accept Payments in Crypto This Year — CoinDesk
- Via /carolyn — New Kraken Venture Fund to Target Early-Stage Crypto, Tech Startups
- Via /m — ID.me and Sterling Announce Exclusive Partnership to Transform Background Screening with Innovative Identity Verification Solutions — ID.me Insights
- Via /j — Apple positioned to offer cryptocurrencies: RBC report
- Will the pandemic finally get Americans to embrace QR codes?
- Credit Card Companies Should Offer Stablecoin Payments or Be Left Behind: Gartner — CoinDesk
- Chinese consumers now make three mobile payments every day • NFCW
- China Is Winning the Digital Currency War With the U.S.
- Via /gregkidd — Why Crypto will 100% Succeed
- Diem Stablecoin Prepares for Liftoff With Fireblocks Custody Partnership — CoinDesk

The GiD Report#147 — Clubhouse is the end of a chapter, not the beginning of a new one was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


IBM Blockchain

Looking past the industrial future with AI, IoT and blockchain


The industrial future lies ahead with rapid transformation through high-end technologies like AI, IoT, and blockchain. But what makes these technologies so distinct is their outstanding ability to automate the entire infrastructure. It becomes easier and smarter for you to supervise the industrial processes in detail. Also, with the rapid increase in globalization, product complexities, […]

The post Looking past the industrial future with AI, IoT and blockchain appeared first on Blockchain Pulse: IBM Blockchain Blog.


Trinsic (was streetcred)

An Introduction to Trinsic’s APIs


Trinsic is the best way to implement verifiable credentials. Our platform is made up of three APIs built specifically for verifiable credential exchange. Trinsic’s APIs make developers’ lives easier by making the implementation of verifiable credentials simple and accessible. Below, we give a brief introduction to Trinsic’s APIs with accompanying resources to learn more.

Provider API

We launched our newest API—the Provider API—last year as the first product built specifically for self-sovereign identity (SSI) providers. Providers are the organizations that bring together issuers, holders, and verifiers into a single ecosystem, generally under a single use case or family of use cases. They establish the rules around who is able to participate in their ecosystem.

 

In verifiable credential exchange, as shown in the graphic below, instead of verifiers having to establish a relationship with every single issuer in an ecosystem in order to perform verifications, the verifiers establish a single relationship with the provider. If they can trust the provider, they can trust the credentials coming from any number of issuers in the ecosystem.

A familiar example of a provider is Mastercard, and within the Mastercard ecosystem, banks are the credential issuers (in this case the credential is a Mastercard credit card), consumers are the holders, and merchants are the credential verifiers.

 

The Provider API makes it possible for providers to create credential issuers and verifiers programmatically instead of one at a time. With the Provider API, any organization can literally become an SSI provider for their ecosystem. When the Provider API was first released, one of our users created 13,000 verifiers in a single script! Unlike the Credentials API and the Wallet API below, the Provider API is a paid feature on our platform.
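As a sketch of what "creating verifiers programmatically" can look like, here is a hedged Python example using the requests library. The base URL, endpoint path, payload fields, and auth header are hypothetical placeholders, not Trinsic's documented interface; consult the Provider API docs linked below for the real calls.

```python
# Hypothetical provider-style onboarding loop -- the endpoint and fields
# are placeholders, not Trinsic's actual API.
import requests

API_BASE = "https://api.example.com"                  # placeholder base URL
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder auth

def create_verifier(name: str) -> dict:
    """Create one verifier organization; loop this to create thousands."""
    resp = requests.post(
        f"{API_BASE}/provider/organizations",     # hypothetical path
        json={"name": name, "role": "verifier"},  # hypothetical fields
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# One script, many verifiers -- the pattern behind the 13,000-verifier run.
verifiers = [create_verifier(f"merchant-{i}") for i in range(100)]
```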

Get started

 

- See the documentation of the Provider API here.
- Check out our Provider Reference App to see how it can be used in a web application.
- We use our own Provider API to help you create organizations in the Trinsic Studio. Head to the Studio to see for yourself!

Credentials API

The Credentials API is as simple as it sounds—it is for issuing and verifying credentials. Verifiable credential issuance and verification is at the core of every SSI use case, so the Credentials API is used, whether directly or indirectly through Trinsic Studio, by all of our customers.

 

Get started

 

- See the documentation of the Credentials API here.
- Check out our Issuer Reference App for a demo on how to issue a business card credential to anyone with a mobile wallet.
- Check out our Verifier Reference App for an example of how to simulate a request of “proof of passport”.

Wallet API

The Wallet API is for creating and managing cloud wallets on behalf of credential holders (i.e. individuals). The cloud wallets you create with this API can hold credentials, make connections, and respond to incoming requests. This API is ideal for organizations that would like to manage wallets on behalf of individuals (and uses its own authentication to authenticate the person to the correct wallet). Cloud wallets are flexible in that individuals can access them via mobile app or browser. Trinsic’s Wallet API can be integrated using our Mobile SDK.

Get started

See the documentation of the Wallet API here.
Check out our Wallet Reference App for a demo on how to manage custodial wallets for an organization.
Interested in using our Mobile SDK? Contact us.

Start building

Ready to get started implementing verifiable credentials today using Trinsic’s APIs? Start by creating a free Trinsic Studio account which allows you to start interacting with our APIs in minutes. Or visit our documentation to learn more.

The post An Introduction to Trinsic’s APIs appeared first on Trinsic.


auth0

What is .NET? An Overview of the Platform

An overall view of .NET, Microsoft's cross-platform development environment that allows you to build any type of application with C# and other languages.

Auth0 Ranked on G2’s 2021 Best Software Companies List

Global identity management provider recognized as a top IT cloud management product

UbiSecure

LEIs & corporate transparency for Obliged Entities with Ben Cronin, UBO Service – Podcast Episode 39

Let’s Talk About Digital Identity with Ben Cronin, Managing Director at UBO Service.

In episode 39 of LTADI, Oscar talks to Ben about UBO Service and the challenges it solves around verifying Ultimate Beneficiary Owners, how UBO Service is leveraging Legal Entity Identifiers (LEIs) for KYC and enhanced CDD, and the Global LEI Foundation’s validation agent (VA) framework.

“It was very obvious to us that adding the LEI to that identification piece was very powerful because you’re really identifying and verifying the entity to a very high standard. By using data that we get from official government registries – adding that to an LEI just makes complete perfect sense to us.”

Serial entrepreneur Ben Cronin founded GBR (Global Business Register) in 2008. GBR morphed into Kyckr over the following years, and Kyckr listed on the Sydney Stock Exchange in 2016, providing commercially proven products for the authentication of businesses globally. His roles at Kyckr included Managing Director and Chief Data Officer. Ben is a supporter of Max Schrems’ organisation, NOYB – European Centre for Digital Rights; the fight for data privacy is important for all citizens. Ben played professional rugby with Munster and Ireland in the ’90s. His other interests include tennis, whisky, science and family!

Find Ben on LinkedIn and on Twitter @Ben_Cronin.

Ben Cronin is currently Managing Director at UBO Service. UBO Service offers an innovative new solution for obliged entities to capture accurate Ultimate Beneficial Owner (UBO) declarations in real-time.

Find out more about UBO Service at www.uboservice.com.

UBO Service is in partnership with Ubisecure’s Legal Entity Identifier service, RapidLEI. Read more about the partnership in the press release: www.ubisecure.com/news-events/rapidlei-ubo-service-partnership-kyc/

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!

The post LEIs & corporate transparency for Obliged Entities with Ben Cronin, UBO Service – Podcast Episode 39 appeared first on Ubisecure Customer Identity Management.


PingTalk

What is Single Sign-on (SSO)?

What is Single Sign-on?

Single sign-on (SSO) allows a user to sign on with one set of credentials and gain access to multiple applications and services. SSO increases security and provides a better user experience for customers, employees and partners by reducing the number of required accounts and passwords and providing simpler access to all the apps and services they need.

Okta

Building and Securing a Go and Gin Web Application

Today, we are going to build a simple web application that implements a to-do list. The backend will be written in Go. It will use the Go Gin Web Framework which implements a high-performance HTTP server. The front end will use the Vue.js JavaScript framework to implement a single page application (SPA). We will secure it using Okta OAuth 2.0 authentication. Let’s get started!

PS: The code for this project can be found on GitHub if you’d like to check it out.

Prerequisites to Building a Go, Gin, and Vue Application

First things first, if you don’t already have Go installed on your computer, you will need to download and install the Go programming language.

Next, create a project directory where all of our future code will live:

mkdir ~/okta-go-gin-vue-example
cd ~/okta-go-gin-vue-example

A Go workspace is required. This is a directory in which all Go libraries live. It is usually ~/go, but can be any directory as long as the environment variable GOPATH points to it.
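
For example, to point GOPATH at a custom workspace directory in a Unix-like shell (the directory name here is just an example, not a requirement of the tutorial):

export GOPATH=$HOME/go-workspace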

Next, install the Gin package into the Go workspace using this command:

go get -u github.com/gin-gonic/gin

If you haven’t already got Node.js and npm installed, you’ll need them to use Vue.js. To install Node.js and npm, go to Downloading and installing Node.js and npm and install them.

Once you have Node.js and npm installed, you can now install Vue.js with this command:

npm install --global @vue/cli

How to Build a Simple Go/Gin Application

We will start by creating a simple Gin application. Create a file called simple.go containing the following Go code:

package main

import "github.com/gin-gonic/gin"

func main() {
	r := gin.Default()
	r.GET("/", func(c *gin.Context) {
		c.String(200, "Welcome to Go and Gin!")
	})
	r.Run()
}

Let’s explain what this code does. The import statement loads the Gin package from the Go workspace. The main() function is the program entry point. First, a default Gin server is created with the r := gin.Default() statement. The r.GET() function is used to register code with Gin that will be called when a matching HTTP GET request is called. It takes two parameters: the URI to match (/), and a callback function that takes a Gin context struct as a parameter. The String function is called on the context (c), passing it the response status and the response body. Finally, the Run function is called to start the server listening on port 8080 by default.

Next, run the server.

go run simple.go

You will get some warnings when you run the command above, which you can ignore for now.

Now, we can test the server using curl.

curl http://localhost:8080

You should see the Welcome to Go and Gin! welcome message.

How to Build a RESTful Application

First of all, we will build the RESTful backend server. We will later add a Vue.js frontend and add Okta authentication.

We are going to make a Go module called okta-go-gin-vue-example, which is the same name as the working directory. This will create a file called go.mod which defines the module and the version of Go.

go mod init okta-go-gin-vue-example

Next, create a file called main.go containing the following Go code.

package main

import (
	"net/http"
	"github.com/gin-gonic/gin"
)

var todos []string

func Lists(c *gin.Context) {
	c.JSON(http.StatusOK, gin.H{"list": todos})
}

func main() {
	todos = append(todos, "Write the application")
	r := gin.Default()
	r.GET("/api/lists", Lists)
	r.Run()
}

This contains some enhancements from the previous program.

A slice called todos has been created to contain the todo items.

The function Lists processes GET requests and returns a JSON object containing the items.

The net/http package has been imported so that we can use the more readable http.StatusOK as a status code rather than the opaque number 200.

The function c.JSON() serializes the structure into a JSON string in the response body. It also sets the Content-Type header to application/json.

The function gin.H is used to create JSON objects.

The GET function now calls the Lists function when the request URI is /api/lists.

Now we can run the server. The first time it is run Go will have to resolve the Gin library and add it to go.mod.

go run main.go

We can test that it’s working using this curl command in another terminal window:

curl http://localhost:8080/api/lists

You should get a response that looks like this:

{"list":["Write the application"]}

What Is a Path Parameter and How Do I Implement It in Gin?

A path parameter is a URI with a variable part. Any part of a URI can be a path parameter. In our example, we can get individual entries using URIs such as /api/lists/0, /api/lists/1 etc.

The implementation is quite simple. The variable part of the URI is prefixed with a colon when the URI to function mapping is defined.

r.GET("/api/lists/:index", ListItem)

The implementation of this is shown in the ListItem() function below:

func ListItem(c *gin.Context) {
	errormessage := "Index out of range"
	indexstring := c.Param("index")
	if index, err := strconv.Atoi(indexstring); err == nil && index < len(todos) {
		c.JSON(http.StatusOK, gin.H{"item": todos[index]})
	} else {
		if err != nil {
			errormessage = "Number expected: " + indexstring
		}
		c.JSON(http.StatusBadRequest, gin.H{"error": errormessage})
	}
}

The name specified in the URI after the colon can be extracted using the Param function on the Gin context.

Update your code in main.go to add strconv to the imports, include the ListItem function, and add the r.GET("/api/lists/:index", ListItem) statement to the main() function so that main.go looks like this:

package main

import (
	"github.com/gin-gonic/gin"
	"net/http"
	"strconv"
)

var todos []string

func Lists(c *gin.Context) {
	c.JSON(http.StatusOK, gin.H{"list": todos})
}

func ListItem(c *gin.Context) {
	errormessage := "Index out of range"
	indexstring := c.Param("index")
	if index, err := strconv.Atoi(indexstring); err == nil && index < len(todos) {
		c.JSON(http.StatusOK, gin.H{"item": todos[index]})
	} else {
		if err != nil {
			errormessage = "Number expected: " + indexstring
		}
		c.JSON(http.StatusBadRequest, gin.H{"error": errormessage})
	}
}

func main() {
	todos = append(todos, "Write the application")
	r := gin.Default()
	r.GET("/api/lists", Lists)
	r.GET("/api/lists/:index", ListItem)
	r.Run()
}

The server can be tested running the go run main.go command and then using curl. Verify that the error handling works using invalid URIs.

curl -i http://localhost:8080/api/lists/0
curl -i http://localhost:8080/api/lists/1
curl -i http://localhost:8080/api/lists/foo

Notice that the last two commands will display different error messages.

How Do I Handle a POST Request Using Gin?

A POST request simply requires using POST in place of GET.

Add the following statement to the main() function in main.go

r.POST("/api/lists", AddListItem)

The POST handler uses the PostForm method to extract the parameters. Add the code below to main.go:

func AddListItem(c *gin.Context) {
	item := c.PostForm("item")
	todos = append(todos, item)
	c.String(http.StatusCreated, c.FullPath()+"/"+strconv.Itoa(len(todos)-1))
}

main.go should look like this now:

package main

import (
	"github.com/gin-gonic/gin"
	"net/http"
	"strconv"
)

var todos []string

func Lists(c *gin.Context) {
	c.JSON(http.StatusOK, gin.H{"list": todos})
}

func ListItem(c *gin.Context) {
	errormessage := "Index out of range"
	indexstring := c.Param("index")
	if index, err := strconv.Atoi(indexstring); err == nil && index < len(todos) {
		c.JSON(http.StatusOK, gin.H{"item": todos[index]})
	} else {
		if err != nil {
			errormessage = "Number expected: " + indexstring
		}
		c.JSON(http.StatusBadRequest, gin.H{"error": errormessage})
	}
}

func AddListItem(c *gin.Context) {
	item := c.PostForm("item")
	todos = append(todos, item)
	c.String(http.StatusCreated, c.FullPath()+"/"+strconv.Itoa(len(todos)-1))
}

func main() {
	todos = append(todos, "Write the application")
	r := gin.Default()
	r.GET("/api/lists", Lists)
	r.GET("/api/lists/:index", ListItem)
	r.POST("/api/lists", AddListItem)
	r.Run()
}

It is important to note that a REST POST request creates a resource on the server and assigns it a URI. The proper response to a POST is a status of 201 Created and the response body should contain the URI of the new resource.

The server can be tested using curl. Restart the server using go run main.go and then use the curl commands below to test it out:

curl -i -X POST -d "item=Build Vue frontend" http://localhost:8080/api/lists
curl -i http://localhost:8080/api/lists

How to Build a Simple Vue Application

A number of files need to be created to build a Vue application. The Vue command-line interface (Vue CLI) can create the necessary files for us. Let’s use the Vue CLI to create an application called todo-vue.

vue create todo-vue

The creation process will ask a number of questions. Use the arrow keys to select the option required then hit the enter key.

It may ask if you want to use a faster registry: answer Y.
Please pick a preset: use the arrow keys to select Manually select features.
Check the features needed for your project: Babel, which is a JavaScript compiler, and Linter / Formatter should be selected.
Choose a version of Vue.js that you want to start the project with (Use arrow keys): select 2.x.
Pick a linter / formatter config: select ESLint with error prevention only. This will check for and catch common errors but will not enforce strict rules.
Pick additional lint features: select Lint on save. This will check for errors when files are saved in the project.
Where do you prefer placing config for Babel, ESLint, etc.? Select In dedicated config files. Several small files are better than one big file.
Save this as a preset for future projects? Type N.
Pick the package manager to use when installing dependencies: select Use NPM.

The Vue CLI will then create the project. This includes downloading components which can take some time.

Let’s see what got created in the new directory todo-vue. We will explain the content of the files in more detail as we build the application:

The directory node_modules contains the Node.js modules which are needed by Vue.

The file README.md contains instructions on building and running the Vue application.

The file package.json describes the Vue package and its dependencies.

The template for the application is public/index.html; it contains <div id="app"></div>, which gets replaced by the Vue content.

The src directory contains the core Vue application. The entry point for the application is in src/main.js which replaces the <div> in the HTML page with the content. The file src/App.vue contains the top-level Vue component. There is one component src/components/HelloWorld.vue which can be deleted as it is not required.

We will now simplify the file src/App.vue to have the following content:

<template>
  <div id="app">
    <h1>To-Do List</h1>
  </div>
</template>

<script>
export default {
  name: 'App',
}
</script>

<style>
#app {
  font-family: Avenir, Helvetica, Arial, sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  text-align: center;
  color: #2c3e50;
  margin-top: 60px;
}
</style>

Let’s talk about what this does. The template element contains the HTML which will replace the <div> in the HTML file. The script element exports the application and gives it a name. The style element is self-explanatory and can be modified any way you want to. We can now run a built-in development server to test the application.

cd todo-vue
npm run serve

This will build the application and start listening on port 8080 by default. You can access the page by pointing a web browser at http://localhost:8080. If you are curious, the port number can be changed by setting the environment variable PORT to another value.
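
For example, in a Unix-like shell you could start the development server on a different port like this (3000 is an arbitrary choice for illustration):

PORT=3000 npm run serve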

How to Add the To-Do List to the Application

First, we are going to add a placeholder for the actual to-do list. We need to modify App.vue. The style element will be omitted in the example as it doesn’t change.

<template>
  <div id="app">
    <h1>To-Do List</h1>
    <div>{{ todolist }}</div>
  </div>
</template>

<script>
const appData = {
  todolist: "More to do"
}

export default {
  name: 'App',
  data() {
    return appData;
  }
}
</script>

So, what has changed? The template has been modified to add the list. The {{ todolist }} expression gets replaced with the contents of a variable called todolist.

There have been two changes to the script element. An object called appData has been added which defines the variable todolist used in the template and gives it a default value. The default application now has a function called data() which returns the appData object.

Now, run the application and point a web browser at it. You should see the text “More to do” on the page.

How to Combine the Web Servers

When a Vue application is ready to use it can be converted into a static website. Run the command below to do that:

cd todo-vue
npm run build

This creates a directory called dist containing the static content.

Then, we change the Go server to deliver the static content. We will serve the todo-vue/dist directory as the root URI /. Modify main.go and add "github.com/gin-contrib/static" to the imports and add the following line to the main() function, after the r := gin.Default() line:

r.Use(static.Serve("/", static.LocalFile("./todo-vue/dist", false)))

Run the Go server with go run main.go and point a browser at http://localhost:8080 to see the Vue generated web page.

How to Make the Vue App Call the Server

The Axios JavaScript package is required to call the server. Make sure it is installed locally.

cd todo-vue
npm install --save axios

Next, we need to modify src/App.vue to make the Ajax call and render it.

<template>
  <div id="app">
    <h1>To-Do List</h1>
    <ul>
      <li v-for="item in todolist" v-bind:key="item">{{ item }}</li>
    </ul>
  </div>
</template>

<script>
import axios from "axios";

const appData = {
  todolist: ["More to do"]
}

export default {
  name: 'App',
  data() {
    return appData;
  },
  mounted: function() {
    this.getList();
  },
  methods: {
    getList: getList
  }
}

function getList() {
  axios.get("/api/lists").then( res => {
    appData.todolist = res.data.list
  });
}
</script>

<style>
#app {
  font-family: Avenir, Helvetica, Arial, sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  text-align: center;
  color: #2c3e50;
  margin-top: 60px;
}
</style>

Let’s go through the changes. At the start of the <script> tag, the Axios package is imported. At the end of the script the function getList() is defined. It makes an Axios get() call to the API. The response is a JSON object which gets transformed into a JavaScript object. The JSON object contains an object called list which is an array of item strings. The response is in res.data and res.data.list extracts the array of item strings that are assigned to the todolist variable.

In the export the methods object exports the getList() function as a JavaScript function with the same name which is visible to the application. The mounted entry defines a function that is called when the page has loaded. It calls getList() to get the list of items.

In the template, there is now an unordered list. The v-for attribute iterates over the array of strings contained in todolist, creating a <li> element for each entry. The v-bind:key attribute gives each rendered element a unique key so that Vue can track the list items efficiently, and {{ item }} renders each entry’s text.

Rebuild the Vue project, then start the server using these commands:

cd todo-vue
npm run build
cd ..
go run main.go

Then, load the web page. A single item from the API should be displayed.

How to Make a POST Request from Form Data

We are going to add a form that enables a new item to be added to the list.

First, add a form inside the <template> tag, just above the <ul> line in App.vue.

<form method="POST" @submit.prevent="sendItem()">
  <input type="text" size="50" v-model="todoitem" placeholder="Enter new item"/>
  <input type="submit" value="Submit"/>
</form>

The form contains a text input box and a submit button. The v-model attribute associates the text input with a variable called todoitem. The function sendItem() is called when the submit button is clicked. The @submit.prevent attribute prevents the page from being reloaded when the form is submitted.

Next, create the sendItem() function at the bottom of the <script> tag in the App.vue file:

async function sendItem() {
  const params = new URLSearchParams();
  params.append('item', this.todoitem);
  await axios.post("/api/lists", params);
  getList()
}

The function is made asynchronous so that it is non-blocking. The POST parameters are set using a URLSearchParams object. There is a single parameter called item and its value is the value of the text input passed into the variable this.todoitem which was defined by the v-model attribute on the text input. The POST request is made by calling axios.post(). The await makes the function wait until the POST response arrives. Finally, the getList() function is called to get the updated list.

You will also need to add the sendItem function to the list of methods that are exposed to Vue. Update the methods section of the export default part of the <script> tag so that it looks like this:

methods: {
  getList: getList,
  sendItem: sendItem,
}

Here is how your App.vue file should look:

<template>
  <div id="app">
    <h1>To-Do List</h1>
    <form method="POST" @submit.prevent="sendItem()">
      <input type="text" size="50" v-model="todoitem" placeholder="Enter new item"/>
      <input type="submit" value="Submit"/>
    </form>
    <ul>
      <li v-for="item in todolist" v-bind:key="item">{{ item }}</li>
    </ul>
  </div>
</template>

<script>
import axios from "axios";

const appData = {
  todolist: []
}

export default {
  name: 'App',
  data() {
    return appData;
  },
  mounted: function() {
    this.getList();
  },
  methods: {
    getList: getList,
    sendItem: sendItem,
  }
}

function getList() {
  axios.get("/api/lists").then( res => {
    appData.todolist = res.data.list
  });
}

async function sendItem() {
  const params = new URLSearchParams();
  params.append('item', this.todoitem);
  await axios.post("/api/lists", params);
  getList()
}
</script>

<style>
#app {
  font-family: Avenir, Helvetica, Arial, sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  text-align: center;
  color: #2c3e50;
  margin-top: 60px;
}
</style>

To test the changes, build the Vue application and start the server using these commands:

cd todo-vue
npm run build
cd ..
go run main.go

Then load the web page. Enter a new item and submit the form. The updated list should be displayed.

How to Add Okta Authentication to the Application

To add authentication to the Vue application, we first need to install two packages.

cd todo-vue
npm install @okta/okta-auth-js
npm install @okta/okta-signin-widget

To use Okta authentication you need to have a free Okta Developer account (https://developer.okta.com). Once you’ve done this, sign in to the developer console and select Applications > Add Application. Then select Single-Page App and hit Next. The next page is filled in with default values, most of which are sufficient for this application and don’t need to be changed. Add the URL http://localhost:8080 to the allowed Login Redirect URLs. Hit Done.

There are two pieces of information that you need to obtain from the Okta Developer Console. These are your Okta domain name and your client id. These values need to be passed into the Vue application. As these values are effectively secrets, they must not be hardcoded in the application. The way around this is to create environment variables in a file called .env in the todo-vue directory containing these values. Also add .env to the .gitignore file.

VUE_APP_OKTA_CLIENT_ID=my-client-id
VUE_APP_OKTA_DOMAIN=my-okta-domain.okta.com

PS: The environment variable names must start with VUE_APP_ otherwise they will not be visible in the application.

How to Add Okta Authentication to the Vue Client

We are going to use the Okta authentication widget to log the user in. A successful login will create an access token that can be used to verify that the user is logged in.

First of all, we will add a container for the login form to the template in src/App.vue. Add the code below just above the <form> tag:

<div id="widget-container"></div>

Next, we need to add the login code to the script section of src/App.vue, starting with importing the OktaSignIn object inside the <script> tag.

import OktaSignIn from "@okta/okta-signin-widget";
import "@okta/okta-signin-widget/dist/css/okta-sign-in.min.css";

Next, just below the import lines, modify the appData constant so that it can hold a token:

const appData = {
  todolist: ["More to do"],
  token: ''
}

Next, update the contents of the mounted: function() with the code below. This code creates the widget and extracts the token.

mounted: function () {
  var signIn = new OktaSignIn({
    el: "#widget-container",
    baseUrl: "https://" + process.env.VUE_APP_OKTA_DOMAIN,
    clientId: process.env.VUE_APP_OKTA_CLIENT_ID,
    redirectUri: window.location.origin,
    authParams: {
      issuer: "https://" + process.env.VUE_APP_OKTA_DOMAIN + "/oauth2/default",
      responseType: ["token", "id_token"],
      display: "page",
    },
  });

  signIn.showSignInToGetTokens({
    scopes: ['openid', 'profile']
  }).then(function(tokens) {
    appData.token = tokens.accessToken.accessToken;
    axios.defaults.headers.common["Authorization"] = "Bearer " + appData.token;
    signIn.hide();
  });

  getList();
},

Let’s go through the code to understand what is happening. First of all, we construct an OktaSignIn object. It takes a dictionary as a parameter. The el defines the ID of the template element in which the login form will be displayed. The baseUrl and clientId are extracted from the secrets in the .env file. The redirectUri is set to the current page. This will cause the current page to be reloaded on a successful login with the required access token passed in the URL.

The showSignInToGetTokens method causes the login widget to be displayed.

The .then() function is called on a successful login. The access token is extracted and then set as an Authorization header in the Axios default headers. This means that the header will be sent on subsequent Axios requests. The sign-in widget is hidden once the token has been obtained.

We are going to validate the token on the server and return an error response if it fails. We need to modify the sendItem() function in App.vue to handle the error.

async function sendItem() {
  const params = new URLSearchParams();
  params.append('item', this.todoitem);
  await axios.post("/api/lists", params)
    .then(function() {
      getList();
    })
    .catch(function (error) {
      appData.todolist = [ error.message ];
    })
}

Build the Vue application, start the server and load the web page:

cd todo-vue
npm run build
cd ..
go run main.go

The login form should be displayed. Log in using your Okta Developer Console credentials.

How to Validate an Access Token in Go

The access token which the client obtained is what is known as a JSON Web Token (JWT). To verify the token we will use an Okta JWT verifier. The Go package needs to be installed.

go get -u github.com/okta/okta-jwt-verifier-golang

First, add an import to main.go to load the Okta JWT verifier package.

jwtverifier "github.com/okta/okta-jwt-verifier-golang"

Next, add a verify() function to main.go.

var toValidate = map[string]string{
	"aud": "api://default",
	"cid": os.Getenv("OKTA_CLIENT_ID"),
}

func verify(c *gin.Context) bool {
	status := true
	token := c.Request.Header.Get("Authorization")
	if strings.HasPrefix(token, "Bearer ") {
		token = strings.TrimPrefix(token, "Bearer ")
		verifierSetup := jwtverifier.JwtVerifier{
			Issuer:           "https://" + os.Getenv("OKTA_DOMAIN") + "/oauth2/default",
			ClaimsToValidate: toValidate,
		}
		verifier := verifierSetup.New()
		_, err := verifier.VerifyAccessToken(token)
		if err != nil {
			c.String(http.StatusForbidden, err.Error())
			print(err.Error())
			status = false
		}
	} else {
		c.String(http.StatusUnauthorized, "Unauthorized")
		status = false
	}
	return status
}

The toValidate map defines the values of the audience and client ID claims for the verifier. The verify() function looks for an Authorization header containing a bearer token. If the token is found then the verifier is called to verify the token. If the token is verified successfully the function returns true. If the token is not found then a 401 Unauthorized response is sent and the function returns false. If the token is found but does not verify, then a 403 Forbidden response is sent and the function returns false.

Finally, modify the AddListItem function to call verify().

func AddListItem(c *gin.Context) {
	if verify(c) {
		item := c.PostForm("item")
		todos = append(todos, item)
		c.String(http.StatusCreated, c.FullPath()+"/"+strconv.Itoa(len(todos)-1))
	}
}

Build the Vue application, start the server, and load the web page. Add a to-do item and submit the form. You should get an error response. Log in and resubmit the to-do item to verify that token verification works.

Conclusion

The Gin package for Golang makes it very easy to create a web server. It is particularly easy to build a RESTful web server as there is a function for each of the request methods. A user-defined function is called whenever a request is received. Gin can also serve static content from a specified directory.

The Vue.js JavaScript framework makes it easy to build web front ends that generate dynamic content. The Vue CLI can create the basic directory structure and the files required for a Vue application. The files are validated for correct syntax, which helps avoid hard-to-diagnose JavaScript errors. Vue provides a development server that can be used to test code as you work. Building the code in production mode creates static content that can be delivered by any web server.

The Okta authentication widget makes the verification process very easy to implement. The OAuth 2.0 authentication process is quite complex. The widget hides the complexity securely. Once the access token has been obtained it can be sent to the API server in a header. The JWT verification process is very simple using the Okta Go JWT verifier.

If you enjoyed reading this post, you might also like these posts from our blog:

Offline JWT Validation with Go
Build a Single-Page App with Go and Vue
The Lazy Developer’s Guide to Authentication with Vue.js

As always, if you have any questions please comment below. Never miss out on any of our awesome content by following us on Twitter and subscribing to our channel on YouTube!

Tuesday, 16. February 2021

Meeco

EU Data Governance Act

The proposed European Data Governance Act is another progressive indication that the EU is seeking to develop a more equitable digital economy. However, where we go from here depends on how the European Union is able to use the Data Governance Act to strike a balance between the existing tech giants and data platforms alongside an entirely new range of services designed to enable the collection, protection and exchange of data.

Currently, a handful of global players enjoy a virtual monopoly on the exploitation of data. Unlocking these data silos and regulating for data mobility and interoperability will provide the vital infrastructure required for meeting the challenges of the next century, including timely and informed decision making.

At Meeco we believe that enabling citizens, students, patients, passengers and consumers to more equitably join the value chains fuelled by data will ultimately lead to greater trust and personalisation, resulting in a more prosperous society. However, this will require new commercial models, enforceable regulation such as the Data Governance Act and the digital tools to transform our connected society. We believe this will lead to significant benefits, including personalised health and education, increased financial literacy and better financial decisions, and more informed consumer choices which also contribute to protecting our environment.

Meeco is endorsing the Data Governance Act as a founding member of Data Sovereignty Now; a coalition of leading Europe-based technology companies, research institutions and not-for-profit organisations. We are working together to ensure that the control of data remains in the hands of the people and organisations that generate it, in order to play a key role in not only securing the rights of individuals over their data, but also providing significant stimulus for the digital economy.

Meeco is also a member of MyData Global and was amongst the first 16 organisations to be awarded the MyData Operator designation in 2020. We join in the goal towards developing interconnected and human-centric data intermediaries to meet the personalisation and equity challenges of open digital society. We welcome the regulation as a needed common ground for clarifying the role of data intermediaries, building trust in these intermediaries and setting the direction for data governance, including the emergence of digital human rights.

In this context we offer the following suggestions:

1. Explicitly include individuals as active participants in the definitions: define the key roles in data sharing (Art. 2 Definitions) so that data rights holders (data subject) and technical data holders (controller or processor) can be separated, and acknowledge the type of data sharing where individuals are active participants in the transactions.
2. Clarify the scope of the data sharing services (Art. 9 (2)) and extend it to include services that empower the data subject beyond compliance.
3. Foster the growth of intermediaries, which offer new technologies and have the greatest likelihood of success in Europe if supported by the Data Governance Act.
4. Open silos and implement soft infrastructure such as standards & open APIs to accelerate uptake and interoperability between data sharing services.
5. Foster eco-systems and demonstrate the value through practical use-cases. The EU data sharing ecosystem is formative; therefore, it is imperative to demonstrate utility and benchmark best practices that contribute to a more sustainable, healthy, resilient and safe digital society.
6. Create a level playing field for sustainable data sharing by providing funding to pioneers at the forefront of developing data eco-systems; this includes start-ups and scale-ups alongside established enterprises.

Included is a Meeco white paper detailing practical use-cases aligned to our response, including the barriers the Data Governance Act can address to make data work for all.

Meeco is a global leader in the collection, protection & permission management of personal data and decentralised identity. Our award-winning patented API platform & tools enable developers to create mutual value through the ethical exchange of personal data. Privately, securely and always with explicit consent.

Data Sovereignty Now is a coalition of partners who believe that Data Sovereignty should become the guiding principle in the development of national and European data sharing legislation. Data Sovereignty is the key driver for super-charging the data economy by putting the control of personal and business data back in the hands of the people and organisations which generate it.

The foundation members include aNewGovernance, freedom lab, INNOPAY, International Data Spaces Association, iSHARE, Meeco, University of Groningen, MyData Global, SITRA, The Chain Never Stops and TNO. MyData Global is an award-winning international non-profit. The purpose of MyData Global is to empower individuals by improving their right to self-determination regarding their personal data, based on the MyData Declaration. MyData Global has over 100 organisation members and more than 400 individual members from over 40 countries, on six continents.

The post EU Data Governance Act appeared first on The Meeco Blog.


1Kosmos BlockID

Why 1Kosmos is set to fill a void in cybersecurity & digital economy

Today, I’m pleased to announce that my new venture 1Kosmos is exiting stealth mode with a $15 million investment from ForgePoint Capital. With this Series A funding and our brand-new advisory board—which includes Chairman of the Board Admiral Mike McConnell, Former U.S. Secretary of Homeland Security Kirstjen Nielsen and other industry luminaries —we plan to accelerate 1Kosmos’ growth and product offerings.


Fission

Inside Fission’s Account Recovery Design

At first glance, account recovery might seem like a simple topic: just write a reset password function, send a challenge code, and you're done!

But, Fission’s commitment to prioritizing user control makes things a little more complicated. When the user is in control, how do we help them get back into their data if they get locked out? The main relevant factors are:

We’ve designed our account system around portable account identifiers, called Decentralized IDs (DIDs).
Your account holds encryption keys that keep your private content encrypted and private only to you — even the Fission platform doesn’t have access and can’t see your content.
We wanted an account recovery system that was as trust-minimized as possible. Meaning, you don’t have to trust us, and you would require as few third-party systems as possible to help you out if you lose account access.
Oh, and one more thing: we wanted to do all of this without passwords.

At Fission, being password-free is the default. Instead, we use technology built into modern browsers — including mobile browsers — that generates and keeps a private encryption key safely stored. This is called the Web Crypto API.
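
As a rough illustration (a minimal sketch, not Fission's actual code), the Web Crypto API can generate a keypair whose private key is marked non-extractable, so it can never be exported from the browser; the algorithm choice below is an assumption made for this example:

// Sketch: generate a browser-held keypair with the Web Crypto API.
// extractable = false means the raw private key can never leave the browser.
async function makeDeviceKey(): Promise<CryptoKeyPair> {
  return crypto.subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" }, // assumed algorithm for this example
    false, // non-extractable private key
    ["sign", "verify"]
  );
}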

By the way, if you haven’t tried this yet, head over to the default Fission Drive app and create a new Fission account. Notice that it asks you for a unique username and an email address, but it doesn’t ask you for a password.

To use your account on other devices — like your desktop browser, tablet, or phone — you get access by creating a key on your new device and linking it to another device that already has permission.

Now, the private keys built into the browsers on all your devices have permission to access your account and all of your encrypted, personal data — all without setting up or having to remember a password.

Apple is known for providing users with end-to-end encryption that keeps everyone, including Apple, out of any personal information. Fission is taking the same approach.

What about lost devices?

But what happens if you lose all your devices? Can you still get access to the encrypted content stored and synced online by your Fission account? Right now, the answer is no! We don’t have a copy of your key — and the Web Crypto API built into browsers is designed in such a way that we can’t get a copy for security reasons. If you lose access to all your devices, how can you recover your encrypted content? This is where Account Recovery comes in.

One of the design goals we have with Fission is to support a broad number of default use cases — we think this stuff should just work for everyone. We do rely on the most cutting edge browser standards, but we choose these standards based on the real world support available in all major browsers, including mobile browsers.

Using a single-device as a base case

Let’s consider the base case scenario of a person having just one mobile computing device — owning and using a single phone — meaning that losing that one mobile device means you’ve lost everything.

This is the common base case we need to keep in mind.

It’s crucial to remember that the typical software developer or early adopter is much easier to support: at least one mobile device and one desktop/laptop device. The private key stored on a desktop is much more likely to stick around for a really long time.

In fact, in some of our user research interviews, developers told us that the private key they used to securely connect to GitHub was often on their machine for 5 years. Basically, the entire lifetime of owning the machine before they got a new computer and set up a new key!

But since that isn’t the deal for our single mobile device base case, we have to plan for an account recovery process that works with just a single device, to make the system accessible and recoverable by the largest number of users.

Splitting up keys

Our basic principle involves making backup keys and splitting them into pieces. We then store some of those pieces on the Fission platform linked to the user account.

The other half is stored “offline” by the user, as recovery codes.

When the offline recovery codes are combined with the pieces on Fission, it creates the complete key that can access the person’s account.

Because of our commitment to privacy, Fission never has access to the whole key.

If someone breaks into Fission's database, there are no passwords and no secret keys to leak.
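
As a toy illustration of the splitting idea (a deliberately simplified sketch, not Fission's actual scheme), a two-part XOR split produces shares that reveal nothing individually but reconstruct the key together:

// Toy 2-of-2 split: XOR the key bytes with random bytes.
// Either share alone is indistinguishable from random noise.
function splitKey(key: Uint8Array): { serverShare: Uint8Array; recoveryShare: Uint8Array } {
  const recoveryShare = crypto.getRandomValues(new Uint8Array(key.length));
  const serverShare = key.map((b, i) => b ^ recoveryShare[i]);
  return { serverShare, recoveryShare };
}

// Combining the share held by the platform with the user's offline
// recovery share yields the original key.
function combineShares(a: Uint8Array, b: Uint8Array): Uint8Array {
  return a.map((x, i) => x ^ b[i]);
}

A production scheme would more likely use a threshold construction such as Shamir secret sharing, so pieces can be sized and distributed more flexibly.
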
Where should I store recovery codes?

On Apple iOS devices, basic iCloud file storage is built into the operating system — just like Fission accounts have the Web Native File System included. We’ll recommend that users download the recovery codes and store them in their Safari iCloud downloads folder — or somewhere else on their iCloud files.

On Android, the defaults are a lot more varied. Many Android phones have Google services built in by default, so we’ll recommend this as a similar approach to using iCloud in our iOS example.

Advanced users can store this file anywhere they feel comfortable - be it in Apple Notes with a password, printed out on paper, stored in their 1Password account, or even sent as a “Note to Self” in their Signal messenger app.

Signal's Note to Self feature

And in the future, based on the feedback, we may allow advanced security-conscious users to forgo the use of recovery codes entirely, allowing them to self-manage backup keys.

Get involved

You can check out the in-progress diagrams and notes of our Account Recovery design process in the forum.

We’re not done with our design. We are still working at the level of security, systems, and cryptographic principles, as well as streamlining the user experience and the flows for users.

The code is being developed out in the open in the Dashboard code repository. We welcome questions, suggestions, and contributions – drop by our Discord chat.


digi.me

Digi.me partners with Healthmark to enable Covid testing and verified result reporting

With effective vaccines not just on the horizon but being administered to millions, the path out of blanket lockdowns is suddenly clearer and closer.

As economies begin to open up again, focus is likely to shift to finding ways to allow those who are Covid-free to access much-missed elements of pre-pandemic life such as flights, sporting venues, office buildings and restaurants.

Crucially, this will involve being able to provide, quickly, easily and securely, evidence of a recent negative test result from an accredited provider.

Continue reading Digi.me partners with Healthmark to enable Covid testing and verified result reporting at Digi.me.


KuppingerCole

Overcoming Identity Governance Challenges with ForgeRock Autonomous Identity

by Martin Kuppinger

Most organisations see the value of Identity Governance and Administration (IGA), but recognise that it brings challenges in practice, because certain tasks are complex and cumbersome. Existing IGA solutions reach their limits as a result and are perceived as inadequate. Enterprises need new, more dynamic approaches with a high degree of automation. Artificial intelligence (AI) and machine learning (ML) carry the promise of delivering automation for complex tasks. ForgeRock Autonomous Identity implements this for IGA platforms from ForgeRock and third-party vendors, ensuring that more organisations can use ML-driven automation effectively and draw maximum value from it.


Provenance

We’re hiring a Freelance Digital Brand Designer!

Join leading SaaS company and B Corp, Provenance, to help us with our brand refresh. You’ll be joining a small but mighty team with skills in sustainability, marketing, content strategy, product and engineering. We work with global clients in the food, drink and personal care industries to enable transparency on the things that really matter about the goods we buy.

As a freelance designer, you will be helping with our new, refreshed brand concept for Provenance and our new website design, as well as supporting how our activations look (QR codes, embeds, social designs).

Skills we are looking for:

Outstanding brand-building design and concepting skills
Ability to bring a brand to life across a spectrum of channels, maintaining a consistent feel while also responding to each touchpoint’s specific requirements
Strong digital background in web design and social content for SaaS companies
Creativity and problem-solving skills, with the ability to translate complex ideas and data into clear, concise design cues
Understanding of and passion for social impact and sustainability is highly desirable
Knowledge of using Webflow is a plus

About you:

Eager to collaborate with the Provenance team to work on the brand, while bringing best-in-class knowledge and examples for how to build digital SaaS brands
Available for project kick-off ASAP in late February 2021 (for project running through April)

We believe in equal opportunities.

It takes a diverse and inclusive community of passionate, talented and committed people to build a system to enable commerce to be a force for good. We’re an equal opportunity employer, so we welcome applications from people of all backgrounds, with different outlooks and experiences. We are well set up for collaborative remote working, with the full team working from home in the UK and other cities in Europe during the current lockdown.

We look forward to hearing from you!

To apply: 

Please submit your cover letter, CV and portfolio via workable. Candidates will be assessed on a rolling basis.

Deadline 28 Feb 2021

The post We’re hiring a Freelance Digital Brand Designer! appeared first on Provenance News.


IDunion

IDunion: An open ecosystem for trusted identities

IDunion (formerly SSI for Germany) has completed the competition phase of the innovation competition “Schaufenster Sichere Digitale Identitäten” and is applying to the Federal Ministry of Economics and Energy (BMWi) for the next phase of the innovation competition. The use of decentralised, self-sovereign identities for natural persons, companies and things is to be tested in over 35 use cases from a wide range of sectors. The project involves 26 well-known public and private partners. More information can be found on our website idunion.org.

Frankfurt am Main, 24 November 2020 – IDunion (formerly “SSI for Germany”) creates an ecosystem for decentralised, self-sovereign identities for natural persons, companies, institutions and things. In a partnership of renowned public and private institutions, a distributed identity network will be established, which will be managed by a European cooperative. The partners develop software applications and components which leverage the IDunion network and enable the allocation, storage and exchange of verified identity information. The use of open source software and standardised data formats will create a vendor-neutral solution and transparency as well as promote interoperability between providers, networks and traditional identity solutions.

A distributed test network is currently operated by 12 partners and will be available as a production environment for numerous use cases also outside the consortium from 2021. A detailed legal governance structure for the network has been developed with the support of the law firm Jones Day. In addition, extensive reports were prepared on legal topics, including the GDPR and eIDAS regulations, as well as on security aspects and a freedom-to-operate analysis.

In addition, various software applications were developed for both citizens and companies and tested in extensive user acceptance tests. Currently, identity wallets from the Main Incubator GmbH (Lissi Wallet) and esatus AG (esatus Wallet) are already available for end users to download. All partners are currently evaluating or implementing applications or use-cases for the further integration of the IDunion network into their existing systems.

The use cases touch on a variety of areas and target a variety of stakeholders including citizens, (public) institutions, companies and things. The over 35 use cases are divided into seven clusters: education, e-commerce/mobility, e-government, e-health, finance, identity & access management (IAM) and industry/IoT. The consortium is open for further partners or use cases and is happy to provide all necessary software components for further pilots. You can find more information on our website at: www.idunion.org.

As consortium leader, Main Incubator GmbH is applying together with the following 25 partners for the implementation phase of the BMWi-funded project “Schaufenster Sichere Digitale Identitäten”.

Bank-Verlag GmbH
Berlin Partner für Wirtschaft und Technologie GmbH
Berliner Senatsverwaltung für Wirtschaft, Energie und Betriebe
Bundesamt für Migration und Flüchtlinge
Bundesdruckerei GmbH
BWI GmbH
Commerzbank AG
D-Trust GmbH
DB Systel GmbH
Deutsche Telekom Innovation Laboratories (T-Labs)
esatus AG
Festo SE & Co. KG
GS1 Germany GmbH
ING Deutschland
Ministerium für Wirtschaft, Innovation, Digitalisierung und Energie des Landes NRW (MWIDE)
R3 LLC
regio iT gesellschaft für informationstechnologie mbh
Robert Bosch GmbH
Siemens AG
Spherity GmbH
Stadt Köln
Technische Universität Berlin (Fachgebiet Service-centric Networking und Zentraleinrichtung Campusmanagement)
Verband der Vereine Creditreform e.V.
Westfälische Hochschule Institut für Internet-Sicherheit (if(is))
YES Payment Services GmbH

About main incubator

main incubator is the research & development unit and the early-stage investor of Commerzbank Group. It examines future technologies of relevance to business and society, as well as promotes and develops sustainable solutions.

Based on the technologies of additive manufacturing, artificial intelligence, cross reality, internet of things, networks, robotics and quantum computing, main incubator develops prototypes, often in cooperation with partners from industry and research. By doing this, main incubator is actively involved in the creation of sustainable products, solutions and infrastructures.

Through strategic investments in tech-driven start-ups, main incubator already supports innovations at an early stage and makes them available for Commerzbank and its customers.

Furthermore, main incubator promotes the tech ecosystem by participating in opinion-forming processes and committee work, or through its own events, such as the monthly tech start-up event series “Between the Towers”.

Main Incubator GmbH, or main incubator for short, is a wholly-owned subsidiary of Commerzbank AG headquartered in Frankfurt am Main.

Press contact

Main Incubator GmbH

Adrian Doerk & Kathrin Mateoschus

presse@main-incubator.com


Elliptic

Crypto Regulatory Affairs: Central Banks Continue the March Towards CBDC Design and Testing

In Elliptic’s 2021 Crypto Regulation and Predictions blog, David Carlisle, Head of Global Policy and Regulatory Affairs, noted that as the world’s first central bank digital currencies (CBDCs) gain traction, their money laundering implications will also start to attract serious attention. As central banks globally continue to research, consult, pilot, and experiment with various designs and applications of CBDCs, we note that the Asia Pacific region leads the way, and we don’t mean China this time. 


KuppingerCole

Apr 14, 2021: Balancing SAP Security: Access, Protection, Authorization

Join this KC Live event to hear experts from SAP, together with invited speakers and moderators, talk about key SAP security concepts and solutions, discussing how to navigate the path from traditional SAP Access Control towards the new SAP solutions, how to integrate SAP security and identity requirements into a broader perspective, and more.

Jun 23, 2021: Managing Digital Workflows with ServiceNow

Join senior practitioners from research and enterprise as they discuss the current state of ITSM, highlight business opportunities brought about by ServiceNow and demonstrate live use-cases enabled by the platform. IT professionals from enterprise, SMEs, and government institutions - be sure not to miss out on this exciting KCLive event!

Fission

Project Cambria Overview with Geoffrey Litt and Peter van Hardenberg

An overview of Project Cambria by Geoffrey Litt and Peter van Hardenberg: managing schema changes in distributed systems so that a version 1 app can read version 1000 app data.

The Cambria project is an exploration of how to manage changing data formats and schemas in decentralized software. Geoffrey Litt and Peter van Hardenberg presented the project, joined by Orion Henry.

The project is funded and operated by Ink & Switch, whose work on Local First Software we often reference. Cambria was in part inspired by the challenges involved with synchronizing data with these local first apps.

Enter Cambria: a Javascript/Typescript library for converting JSON data between related schemas, using lenses.

Translates all your data at run time, on the fly.

— FISSION (@FISSIONcodes) February 10, 2021

The output of the project so far is an experimental Typescript library:

…for effectively managing schema change in distributed systems. It aims to allow developers to express relationships between schemas using bidirectional lenses, and to avoid mixing compatibility code into application logic.
Project Cambria: Schema evolution in distributed systems with edit lenses. Direct link to video: vimeo.com/511271022
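For a flavor of what a lens does, here is a language-agnostic sketch of the idea in Python (Cambria itself is TypeScript, and the names below are illustrative, not Cambria's API):

    # A bidirectional lens: one transformation that can run forward
    # (old schema -> new schema) and backward (new -> old).
    def rename_lens(old_key, new_key):
        forward = lambda doc: {(new_key if k == old_key else k): v for k, v in doc.items()}
        backward = lambda doc: {(old_key if k == new_key else k): v for k, v in doc.items()}
        return forward, backward

    # v1 of a schema calls the field "title"; v2 renames it to "name".
    to_v2, to_v1 = rename_lens("title", "name")

    v1_doc = {"title": "Write docs", "done": False}
    v2_doc = to_v2(v1_doc)   # {"name": "Write docs", "done": False}
    back = to_v1(v2_doc)     # round-trips to the v1 shape

Composing such lenses is what lets a version 1 app read version 1000 data: each schema change ships with a lens, and documents are translated through the chain at read time.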

Thank you to Geoffrey and Peter for presenting! You can follow Geoffrey @geoffreylitt and Peter @pvh on Twitter.

Speaking of Twitter, we had a great set of attendees and captured some excellent quotes and comments along the way. The Twitter handles that were shared are available in the forum chat log.

Public schemas have the general property of being either too specific for your use case or too general for anyone's use case.
-@pvh

🤣

— Jess Martin (@jessmartin) February 10, 2021

"A v1 app can read a v1000 app data" @pvh

"Everyone writes their native format and leaves clues on how to read it"

— FISSION (@FISSIONcodes) February 10, 2021

Rumours about a sea shanty re-mix of some of Peter's phrases can neither be confirmed nor denied.

Resources:
 - Slides from the presentation on Pitch.com
 - Cambria Development Notes
 - Cambria Project on GitHub: https://github.com/inkandswitch/cambria
 - Fission forum post with chat log

Interested in more presentations like this? Browse the presentations here on the blog and sign up for upcoming events on the forum.

Monday, 15. February 2021

KuppingerCole

Privileged Access Management from a CISO Perspective

Privileged user accounts are significant targets for attack, as they have elevated permissions, access to confidential data, and the ability to change settings. If compromised, the damage to an organization can be disastrous. No wonder this is on the minds of chief information security officers. Join our CEO Berthold and Rob Edmondson, Technology Strategist at Thycotic, in this conversation!




Where to Start Your IAM Modernization Journey

Many enterprises are dealing with the modernization of their Identity & Access Management. Modernizing Identity Governance and Administration (IGA) as well as Access Management at the same time can become too complex. In this video blog post, Martin gives practical advice on how enterprises can get their priorities straight.




ShareRing

ShareRing (SHR) Launches Staking on Launchpool, in collaboration with Alphabit

On Monday 15th February, ShareRing will be launching staking on Launchpool, a new blockchain-based project that realigns stakeholder incentives to create a more fair and...


IBM Blockchain

A sustainable future: How your next home project could help end illegal logging

As part of staying home these last many months, I’ve been working on different projects around the house. When I buy new materials for these projects from the home improvement stores in my community, I of course look for the best combination of quality and price, but like so many others today, I’m looking […]


KYC Chain

KYC-Chain and unFederalReserve Announce Partnership

KYC-Chain and fintech company unFederalReserve announce partnership.

Space Elephant \ Unikname

How to secure and better manage your fleet of websites?


WEBINAR: Thursday, 18 February at 10:30 AM

Do you manage several websites built on different technologies? Then this webinar is for you! In this new episode, you will discover our security secrets and, above all, good tips to simplify your day-to-day management. Let's be honest: maintaining a fleet of websites is not always easy…

Who better to talk about it than our trusted partner, Cyllène! Olivier Poelaert, President of the Group, will honor us with his presence and share his experience, since Cyllène currently administers a fleet of more than 200 sites across several environments.

Prepare your questions; we are ready to answer them live!

Olivier Poelaert
President – Groupe Cyllène

Damien Lecan
Technical Director – Unikname

On the program: 35 minutes of presentation + 5 minutes of questions. Discover all our secrets of multi-site management! This webinar will cover:

Market – an introduction to the principles of multi-site management

Experience report – a testimonial from Cyllène

3 secrets for administering several sites at once while maintaining a high level of security



Tokeny Solutions

Security Tokens Get the Green Light in Luxembourg


Last week, Luxembourg’s regulator, the CSSF, adopted a bill that explicitly recognised the possibility of using distributed ledger technology for the dematerialisation of securities. See the market spotlight below for the panel I spoke on with the CSSF and some other friends.

The regulation is moving quickly across Europe: Tokenized securities now fall under the same rules and regulations as traditional financial instruments in many other European countries including France, Germany, Italy, the Netherlands, Romania, Spain and the UK.

What will this new bill in Luxembourg mean for tokenized securities?

The Bill is a continuation of the law passed on 1 March 2019. Perhaps unsurprisingly, after it was introduced we saw a surge in demand from financial institutions in the region. Curiosity was sparked in Luxembourg and in wider European markets. Most importantly, it gave market actors the confidence they needed to start utilising the technology operationally. In the coming months, we expect the same to happen across the industry due to this recent regulatory development.

How can market actors adopt the technology?

For market actors to adopt the technology and benefit from the automation and transferability improvements, they need to do so in a compliant manner. Control and regulation still need to be applied. Therefore, proven and open-source security token standards like the T-REX protocol are essential as they allow assets to be represented on the blockchain compliantly. They enable agents appointed by the issuer to exercise control over assets and ensure the securities can only be held by eligible investors.
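As a conceptual sketch only (T-REX itself is a set of Ethereum smart contracts; this Python model merely illustrates the eligibility check that runs on every transfer):

    # Toy model of a permissioned security token: transfers succeed only
    # if an identity registry marks the recipient as an eligible investor.
    class PermissionedToken:
        def __init__(self, identity_registry):
            self.registry = identity_registry   # address -> bool (eligible?)
            self.balances = {}

        def transfer(self, sender, recipient, amount):
            if not self.registry.get(recipient, False):
                raise PermissionError("recipient is not an eligible investor")
            if self.balances.get(sender, 0) < amount:
                raise ValueError("insufficient balance")
            self.balances[sender] -= amount
            self.balances[recipient] = self.balances.get(recipient, 0) + amount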

What do we expect to happen in the year ahead?

We believe the regulation will give even more confidence to the financial services sector in Luxembourg and beyond. So far, Luxembourg has been one of the most proactive jurisdictions in Europe with its first law in 2019. However, this needed to be developed and last week’s law will bring more clarity and more confidence to asset owners. Real estate closed-end funds are currently the asset class that benefits the most from this new form of securities and we expect this to accelerate in 2021.

Due to this regulatory progression, we are seeing a wave of interest and projects from the fund industry. Our customers WeInvest, Finimmo, and Digibrixx have already begun building their projects, with more on the way.

Market Spotlight

FOCUS ON TOKENIZATION PANEL

Organized by Luxembourg For Finance 

The event brings together the regulator CSSF and experts from across financial services to provide insights on the topic.

Market Insights

Bitcoin Powers to New High as Tesla Takes It Mainstream

Bitcoin extended gains on Tuesday to a record high as the afterglow of Tesla Inc’s investment in the cryptocurrency had investors reckoning it would become a mainstream asset class.

Reuters


BNP Paribas Securities, Prudential Partner for Blockchain Derivatives Solution

Yesterday BNP Paribas Securities Services announced they implemented a blockchain solution for exchange-traded derivatives (ETD) developed by Singapore fintech Hashstacs.

Ledger Insights


Visa Prepares for Crypto Future

Visa CEO Alfred Kelly says the card scheme is preparing its payments network to handle a full range of cryptocurrency assets.

Finextra


Swiss Bank Seba Issues Series B Equity as Security Tokens

Swiss digital asset bank Seba is set to issue its Series B equity to shareholders as security tokens on a blockchain.

Finextra


Compliance In Focus

Fine Wines Become First Tokenized Securities Under New Swiss Blockchain Law

Sygnum, a digital-asset finance firm with a Swiss banking license, has tokenized its first set of assets under the nation’s new law addressing the use of distributed ledger technology (DLT).

Coindesk


Secretary Yellen’s First Action on Bitcoin Will Set the Tone for the Next Four Years

If there is one thing that criminals crave, it’s anonymity. The ability to cloak their illicit activities behind layers of privacy — or to blend in with everyone else — gives bad actors an edge in every arena.

The Hill


Luxembourg Gives the Green Light for the Native Issuance of Security Tokens

Last week, Luxembourg’s regulator, the CSSF, adopted a bill that explicitly recognised the possibility of using distributed ledger technology for the dematerialisation of securities.

Tokeny Solutions





Infocert (IT)

Sistema Scuola Impresa: three female role models at InfoCert for future generations

UNESCO's World Science Report described women's professional careers as a "leaky pipe," referring to the fact that, after graduation, not all women continue their path into the world of work.

Worldwide, the percentage of women engaged in research does not exceed 29%, while in Italy:

The percentage of women researchers varies greatly by discipline: in biology, medicine, and the life sciences in general, the numbers are very encouraging. […] The latest data on education levels show a higher share of women graduates (22.4%) than men (16.8%). […] The percentage of women researchers drops, however, when considering only the STEM subjects (Science, Technology, Engineering and Maths), particularly mathematics, computer science, physics, and engineering, but also economics and statistics. In these fields, women have greater difficulty continuing on the path of scientific research after completing a doctorate or even a postdoc position.

Lucia Votano – Research Director Emerita, Istituto Nazionale di Fisica Nucleare

Given this context, InfoCert has enthusiastically joined ELIS's Sistema Scuola Impresa project: an educational program aimed at students in Italian schools, with specific attention to girls and their access to professions in male-dominated sectors.

The project aims to present teachers and students with role models in whom they can see themselves and recognize unexpressed talents and possible paths. The Role Models identified by the project are business professionals who have proven capable of making an impact in their field with enthusiasm and determination.

Three InfoCert employees with degrees in scientific subjects have been selected to take part in the project as Role Models.

After a series of training sessions, they will meet students from Italian schools in a new edition of the program; the previous edition reached over 18,600 students across Italy, 97% of whom would recommend the experience to other students.

To learn more about the project: sistemascuolaimpresa.elis.org



KuppingerCole

May 27, 2021: Enabling the Future of Identity and Access Management

Join the KCLive Event on the future of identity and access management to learn how to implement an interconnected IAM architecture designed for the post-pandemic era.

May 11, 2021: Modern IGA Capabilities for Identity-Centric Security

Join the KCLive Event on IGA capabilities for identity-centric security to challenge legacy IGA and get insights on reducing security risk, strengthening compliance, and improving efficiency with a modern, future-oriented approach.

auth0

How to Read and Remove Metadata from Your Photos With Python

Smartphones include EXIF metadata in their photos. Here’s how to read, write, and erase it using Python.
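As a minimal sketch of the technique (not the article's own code), reading and stripping EXIF with the Pillow library might look like this:

    # Requires: pip install Pillow
    from PIL import Image
    from PIL.ExifTags import TAGS

    def print_exif(path):
        exif = Image.open(path).getexif()       # tag-id -> value mapping
        for tag_id, value in exif.items():
            print(TAGS.get(tag_id, tag_id), value)

    def strip_exif(src, dst):
        img = Image.open(src)
        clean = Image.new(img.mode, img.size)   # fresh image, no metadata block
        clean.putdata(list(img.getdata()))      # copy only the pixel data
        clean.save(dst)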

Sunday, 14. February 2021

KuppingerCole

KuppingerCole Analyst Chat: The Need For New Drivers to Improve Cybersecurity

The press, security vendors, politicians, and analysts alike currently focus on the recent SolarWinds security incident and its exceptional features and effects. While this is an extremely important topic to learn from and to clean up, in the shadow of this hype it is often neglected that even very basic cybersecurity aspects are poorly addressed in many organizations. Alexei and Matthias look beyond the hype and discuss the need for new initiatives to achieve actual adoption of proper measures to improve basic cybersecurity hygiene in essentially all organizations.




Identosphere Identity Highlights

Identosphere Weekly #19 • The Flavors of Verifiable Credentials • Intaba Roundtable • Mydata on DGA

⑲ The cream off of the weekly news cycle: Upcoming Webinars, Videos, Podcasts, & other News on Verifiable Credentials, Decentralized Identity and Personal Data ⑲
Welcome to another issue of identity highlights!

Much thanks to our Patrons who support this publication.
If you haven’t already, consider making a contribution of your choice at https://patreon.com/identosphere/.

Upcoming EVENTS Digital Sovereignty in eID-Solutions – Self-sovereign, Centralised or Privatised? Part 2

NGI Forward Salon • 24 February (free)

Internet Identity Workshop XXXII (#32)

If you are new to the SSI space, this is the place to be. Take advantage of the virtual event this April 20 - 22, 2021, our 3rd virtual event via QiqoChat. We are planning to be in person (if possible) for IIW #33 in Mountain View, October 12-14, 2021.

Thoughtful Biometrics is March 8,10,12

creating a space to dialogue about critical emerging issues surrounding biometric and digital identity technologies.

Company News Meeco is Hiring

Graduate or Junior UX/UI Designer for our Australian team, where you can help shape the API-of-Me.

Technical Team Lead for our Australian team, where you can help shape the API-of-Me.

From Trinsic  New Tools to Support Production Deployments

Status.trinsic.id

View historical uptime: Using the status page, you can see the last 90 days of uptime of all our externally-facing services. You can also inspect individual incidents and view incident reports.
Be notified of incidents: By clicking the “subscribe” button in the upper-left of the screen, you can have any downtime or incidents trigger a notification to your email or Slack workspace.

Trinsic Community Slack

As Trinsic has grown in popularity among the SSI developer community, several Trinsic User Groups have started organically. While we encourage this, we also want to give these communities an official home. That’s why we’ve created a Slack workspace just for the Trinsic community.

Company Culture & Trinsineers

Trinsineers are people who’ve agreed to take the journey to make the world more accessible to people everywhere. We’re a team of people who happen to be working together inside a legal entity called Trinsic. This journey is not a casual stroll, but an expedition. As Trinsineers, we’re developing a culture that is not only helping us accomplish our goals but bringing fulfillment and enjoyment along the way.

SSI and Music in Web 3.0

while our primary goals of financial disintermediation and inclusion are being realized in our existing projects, a greater long-term goal remains: to return to musicians and artists the control of their own data. Music publishing companies, record labels, performance rights organizations, and other industry intermediaries have had too much power for too long.

Ditto Music developing Opulous on Algorand

More on the company who wants to bring SSI (and DeFi) to independent artists.

we’ve helped more than 250,000 artists get their music out to the world independently.

Our business has expanded from distribution to providing record label, publishing and management services. Every move we’ve made has been based on our mission to help artists take control of their own music careers.

That’s why I’m so excited about our latest product Opulous, which we’re developing with Algorand. It’s our first step into the world of DeFi.

If you read nothing else this week, get these 2 papers

Critical for understanding interoperability issues with Verifiable Credentials formats.

The Flavors of Verifiable Credentials

is complete and published on the Linux Foundation Public Health Blog

It lays out the differences between the flavors of VCs for technically inclined readers, elaborating on the differences between JSON and JSON-LD and between the two implementations of ZKP-style credentials. The ‘Journey of a VC’ section walks through all the steps where VCs are active and highlights how the different VC flavors behave.
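To make the flavor difference concrete, here is an illustrative, non-normative sketch of the same claims expressed in the JSON-LD credential form and as a VC-JWT payload (all values are placeholders):

    # The same credential in two "flavors" (placeholder values throughout).
    vc_jsonld = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": "did:example:issuer",
        "issuanceDate": "2021-02-01T00:00:00Z",
        "credentialSubject": {"id": "did:example:holder", "degree": "BSc"},
    }

    vc_jwt_payload = {            # JWT flavor: registered claims carry the metadata
        "iss": "did:example:issuer",
        "sub": "did:example:holder",
        "nbf": 1612137600,        # issuanceDate as a NumericDate
        "vc": {"type": ["VerifiableCredential"],
               "credentialSubject": {"degree": "BSc"}},
    }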

Decentralized identity discussed: An INATBA roundtable round-up

We pointed to this before it happened. It was great. Here is a round up from Jolocom. It is highly recommended.

The paper, Decentralised Identity: What’s at Stake?, answers its core question by laying out three essential scenarios:

Ideal – full convergence of SSI technology with interoperability by default.

Functional – partial convergence resulting in detached ecosystems.

Dysfunctional – no convergence and isolated, locked-in ecosystems.

Blogs A Unified Theory of Decentralization

The good news is that centralization isn’t the only source of convenience. As you will see in the discussion farther down, decentralized solutions for the problems of distributed systems are more robust and designed to operate in the worst of conditions.

Windley Writes Passwords are Ruining the Web 

Passwords are ruining the web with awful, lengthy, and inconsistent user experiences. They're insecure and lead to data breaches. The good news is there are good ways for web sites to be passwordless. If you hate passwords, build the world you want to live in.

Persistence, Programming, and Picos

Picos show that image-based development can be done in a manner consistent with the best practices we use today without losing the important benefits it brings.

About: PICOS

The project name, PICOS, is an abbreviation of “Privacy and Identity Management for Community Services”. The objective of the project is to advance the state of the art in technologies that provide privacy-enhanced identity and trust management features within complex community-supporting services that are built on Next Generation Networks and delivered by multiple communication service providers. The approach taken by the project is to research, develop, build trial and evaluate an open, privacy-respecting, trust-enabling identity management platform that supports the provision of community services by mobile communication service providers.

Learn more about the motivation, the objectives, tasks and achievements of PICOS, and get to know the PICOS exemplary communities.

Kaliya has two cool new jobs.

Working as the Ecosystems Director at Covid Credentials Initiative and heading the Verifiable Credentials Policy Committee of the Blockchain Advocacy Committee. 

Organization News Jolocom’s latest contributions to DIF

Over the course of 2020, Jolocom added support for an off-chain element based on KERI. This is in addition to the Jolocom DID method (did:jolo and did:keri), which supports the Jolocom-Lib, our own SDK, and the Jolocom SmartWallet.

Jolocom focused on the Rust KERI implementation, which we donated to DIF last fall.

An example of the KERI DID registrar/resolver integrated in our library can be found here. This is also included in the Jolocom SmartWallet via the SDK integration. (KERI is currently being worked on in the Decentralized Identity Foundation’s Identifiers and Discovery Working Group.)

We at Jolocom strongly believe that DIDComm is a crucial infrastructure element for the broader and future-proof SSI stack. Current work on DIDComm v2 includes Jolocom’s implementation of the specification with authcrypt (authenticated encryption) and most of the low-level protocol.

DIF F2FJan21 - DIDComm Demo Session with Ivan Temchenko, Tobias Looker, and Oliver Terbu

During the live demo he showed the message lifecycle in various setups using the new, open-source didcomm-rs library on GitHub.
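For context, a DIDComm v2 plaintext message, before it is encrypted (authcrypt) for the recipient, has roughly this shape; field names follow the DIDComm v2 draft, and the values here are placeholders:

    message = {
        "id": "1234567890",                                      # unique message id
        "type": "https://didcomm.org/basicmessage/2.0/message",  # protocol message type
        "from": "did:example:alice",
        "to": ["did:example:bob"],
        "created_time": 1613300000,
        "body": {"content": "Hello, Bob"},
    }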

Sovrin ANN: Compliance & Inclusive Finance Working Group (CIFWG)

Since 2019, Sovrin has hosted the Compliance and Payments Task Force (CPTF), an open group of traditional bank and non-bank financial institutions, regulators, policymakers, technologists, ethicists, and legal experts. The CPTF has developed and promoted the Rulebook, an innovative best practices framework that extends traditional banking compliance and payments guidance to emerging fintech and VASP processes.

MyData Global response to Data Governance Act, Feb 8th 2021 Towards interconnected and human-centric data intermediaries

We believe that the Data Governance Act can influence global norms on sustainable data governance in the same way as the GDPR pushed the data protection norms beyond the EU.

Our top picks for potential improvements are:

1. Explicitly include individuals as active participants in the definitions

2. Clear and comprehensive scope

3. Moderate requirements

4. Interoperability between the data sharing services

LFPH Calls for Coordination of Digital Vaccination Records Using Open Standards

The CCI community collaborated with Linux Foundation Public Health to write a letter to the Biden Administration about how Verifiable Credentials could be used to support re-opening the economy. 

Some states and other countries have started to pilot this approach, as have various industries like film and aviation. But, the inconsistent use of standards and varying implementations have already led to confusion and public concern. An effort coordinated at the federal level would lead most quickly to uniform adoption and true inter-state and cross-domain interoperability.

LFPH and our partner organizations are ready to collaborate with you on this.

Covid Vaccinations ‘Data Donor’ Program – A Proposal for the Scottish Government

“The Scottish Government must invest in data, digital and technology in health and social care to help Scotland recover from Covid-19. Closing the data gap in the sector could be worth £800m a year and deliver savings of £5.4bn to NHS Scotland. SCD said better data would help to build resilience against future public health challenges, which in turn will drive a healthy economy.” - Scottish Council for Development and Industry

Our solution provides a platform for achieving exactly this, both in terms of equipping Scotland with a powerful integrated data environment and also through a framework where developers can further build on this with other apps for a myriad of other use cases. It could be tied in with the vaccination scheduling system as an immediate step for example.

On Tuesday, the Good Health Pass Collaborative (GHPC) launched.

ID2020 announced the launch of the Good Health Pass Collaborative along with more than 25 leading individual companies and organizations in the technology, health, and travel sectors — including the Airports Council International (ACI), Commons Project Foundation, Covid Credentials Initiative, Evernym, Hyperledger, IBM, International Chamber of Commerce (ICC), Linux Foundation Public Health, Lumedic, Mastercard, Trust Over IP Foundation, and others.

Working Together on What “Good” Looks Like - Hyperledger

This initiative is intended to define, in the context of test results and vaccination records for opening up borders for travel and commerce, a high bar for implementations of identity and credentialing systems to meet with regards to privacy, ethics and portability. They will also work with the implementers of such systems to converge towards common standards and governance.  

DID Method Onion Specification

🧅 part of the torgap technology family

DIDs that target a distributed ledger face significant practical challenges in bootstrapping enough meaningful trusted data around identities to incentivize mass adoption. We propose using a new DID method that allows them to bootstrap trust using a Tor Hidden Service's existing reputation.

we'd like to review more with our community how close we want to keep did:onion to did:web, whether we want to incorporate some elements of did:peer or KERI, or whether to leverage services like Open Time Stamps.
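If did:onion does end up close to did:web, resolution could amount to fetching a DID document from the hidden service over Tor. A hypothetical sketch, where the well-known path and the local Tor SOCKS port are assumptions rather than details confirmed above:

    # Requires: pip install requests[socks]; assumes Tor listening on port 9050.
    import requests

    def resolve_did_onion(did):
        onion = did.removeprefix("did:onion:")                # Python 3.9+
        url = f"http://{onion}.onion/.well-known/did.json"    # assumed did:web-style path
        proxies = {"http": "socks5h://127.0.0.1:9050",
                   "https": "socks5h://127.0.0.1:9050"}
        return requests.get(url, proxies=proxies, timeout=60).json()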

Torgap architecture & products

Torgap is the Blockchain Commons security and privacy architecture model for creating gaps between connected apps and microservices. It supports privacy, service anonymity, identity pseudonymity, non-correlation, censorship-resistance, and separation-of-interests, and reduces single points of failure. This emerging architecture is supported by QuickConnect and Blockchain Commons' Gordian system, while our Airgapped Wallet community and our research papers are charting its future.

PodCasts Self-Sovereign Identity Authors Alex Preukschat & Shannon Appelcline Discussing

In this discussion of decentralized digital identity and verifiable credentials, the authors explain what Self-Sovereign Identity (SSI) is, why it's important, and provide examples of practical applications for individuals and organizations.

Kaliya appeared on the Mint and Burn 

An academic nerdy podcast, out of RMIT, Australia

Episode 6: 'Digital Identity & Blockchain' with Kaliya Young, Prof. Jason Potts, & Prof. Ellie Rennie

PSA Today #33: Kaliya & Seth talk Synthetic Data with Harry Keen, CEO and co-founder of Hazy.com

Originally a UCL AI spin out, London-based Hazy was initially incubated by Post Urban Ventures and CyLon cybersecurity accelerator. Our startup began trying to fix the flaws of traditional data redaction and then data anonymisation. We soon discovered anonymised data will always pose a risk to re-identification.

NOT SSI BUT IDENTITY

Ministry of Economy, Trade and Industry and OpenID Foundation in Liaison Agreement on eKYC & IDA for Legal Entities

The OpenID Foundation (OIDF), the international standards development organization which maintains the OpenID Connect for Identity Assurance (OIDC4IDA) standard, and the Japanese Government’s Ministry of Economy, Trade and Industry (METI) have signed a liaison agreement to work together.

Under the agreement, METI will lead policy efforts to implement identity assurance frameworks for legal entities in Japanese Government and private sector while the OIDF’s eKYC & Identity Assurance (eKYC & IDA) Working Group continues to advance the technical standards that enable many digital identity solutions. The agreement:

Provides a mechanism to collaborate on Authentication and Identity Assurance for Legal Entities: mutually approved white papers, workshops, podcasts, and other outreach activities;

Allows participation of each party’s staff and members in the other party’s meetings, as mutually agreed;

Provides for direct communications (without obligation and only to the extent each party chooses) about new work and upcoming meetings;

Supports common goals including, where appropriate and mutually agreed, specifications for Authentication and Identity Assurance for Legal Entities.

2021 OpenID Foundation Board Update

Nat Sakimura and John Bradley were re-elected to new two-year terms as community member representatives. Nat and John’s well-known technical expertise and global thought leadership ensures continuity across working groups and as the Foundation transitions to new leadership in 2021.

What's New in Passwordless Standards, 2021 edition! (Microsoft)

The Web Authentication API (WebAuthn) Level 2 specification is currently a Candidate Recommendation at the W3C. "Level 2" essentially means major version number 2.

Version 2.1 of the Client to Authenticator Protocol (CTAP) specification is a Release Draft at the FIDO Alliance, meaning the spec is in a public review period before final publication.

We think you might want to hear about what we think is especially fun about WebAuthn L2 and CTAP 2.1.

Identity Ownership and Security in the Wake of the Pandemic

Highlights from Ping Identity's Andre Durand and Richard Bird on an episode of Ping's new podcast Hello User

we explore how the pandemic has opened up an opportunity to shape the future of personal identity.

Takeaway #1: We digitized much of our economy during the pandemic but neglected one important aspect: identity.

Takeaway #2: Third parties have much more control over digital identity than individuals.

Takeaway #3: We’re on the cusp of a tectonic shift in the notion of digital identity.

Takeaway #4: The pandemic has accelerated the changes needed to shape the future of digital identity security.

Takeaway #5: Moving control of digital identity to the individual will dramatically change our current identity and access management systems.

Thanks for Reading

We’ll be back next week with the latest news and developments.

In the meantime, you can read previous issues, share and subscribe via newsletter.identosphere.net.

If you are interested in getting a corporate subscription please contact us.

We ask you to consider supporting this publication with a monthly contribution of your choice at https://patreon.com/identosphere.

Saturday, 13. February 2021

ShareRing

KardiaChain and ShareRing Conclude the First Virtual Treasure Hunt!

The first Virtual Treasure Hunt, hosted by KardiaChain and ShareRing, has now concluded. Congratulations to Minimoog for unlocking the chest and claiming $500 in SHR...

Friday, 12. February 2021

Fission

Startup Vancouver Interviews Boris Mann about Fission

CEO & Co-founder of Fission, Boris Mann, is interviewed by Reza Varzidehkar as part of the Startup Vancouver Episode 44. Boris covers some of the vision and mission behind what Fission is building.

I was invited to join Episode 44 of Startup Vancouver's #WhatKeepsYouUpAtNight show. I was interviewed by Reza Varzidehkar. The Q&A format was a really great chance to talk about some of the vision and "why" behind what we're doing, as well as explain some of the technical building blocks of how Fission works.

Here's the 17 minute clip of my interview:

Direct link to video vimeo.com/511387825

The co-hosts of the show are Maryam Mobini, Colin Weston and Reza Vee. Thanks Reza for interviewing me, and Colin for inviting me!

Other guests included Melanie Ewan of Volition Advisors, and James Basnett of Shape.

The full episode is available on YouTube. The events are free to attend, but donations are accepted on behalf of jack.org, "Canada's only charity that trains and empowers young leaders to revolutionize mental health in every province and territory".


KABN Network

Valuations in our sector are rocking — with no signs of slowing down.


Best-in-class, scalable technology solutions that exceed customer expectations are in high demand.

Précis

2020 saw a rush of adoption and evolution in eCommerce and online presence, not created by but fuelled in pace by the advent of the COVID-19 pandemic.

By "sector" we are referring to technology, and to its fast-growing subsectors: digital identity and verification, the financial technology space, and all eCommerce interactions in this next internet era. Fintechs are now consistently part of challenger banking, or neobanking, upending oligopolistic traditional financial services. This sector is bringing along fast-moving partners and symbiotic cutting-edge solutions that meet and drive customer expectations and behaviour. For clear reasons, the need for safe and trusted digital tools, including verified identity, is quickly spreading to all arenas.

Safe Harbour

Tom Kennedy has been remunerated by KABN Systems NA in the past twenty-four months. He holds a long position in KABN (CSE:KABN) at the time of publication.

His research is intended for the sophisticated investor to assess market developments and company performance and make insights. His comments are in no way intended as a solicitation to trade in any securities. All comments are subject to Risks and Uncertainties outside of the control of the author.

Please see the Disclosure Statements at the end of document.

The US and Canadian tech sectors are in favour right now, sped up by the pandemic

In Canada and the US, and in fact across the globe, technology is a sector in significant favour and flux, and is attracting a lot of interest right now.

Dozens of new companies over the past few months have found private investor groups in multiple seed rounds, and many are hitting the public markets via IPOs and RTOs for more exposure and access to capital. The M&A cycle is seeing a lot of activity too. Incumbent players are moving fast on new developments to bolster their product offerings and keep up with the customer and market demands. Financial investors are searching for the next unicorn, or more likely a puzzle piece positioned to be gobbled up at handsome valuations by leveraging vertical integrations or amalgamating user bases into their own.

Internet deals in the US led the way in Q4/20, with almost double the dealflow of second-place Healthcare. While we do not expect this pace to continue its linear growth, we note that 1) Canada typically lags the US markets, and 2) the horse has already left the barn. While KABN's timing appears to have been excellent, demands on execution will only increase, as these players are expected to demonstrate they are worth premium valuations.

Figure 1: Q4/20 was all about internet deals

Source: PwC/CB Insights MoneyTree™ Report Q4 2020.

The stock market is supportive of valuations

The stock market is very useful for valuation as buyers and sellers commit to pricing every day trading is open. Over the past two years, we have seen a cyclical shift away from certain sectors like commodities, consumer goods, and traditional financial services towards technology.

There are many ways to illustrate this point. In Figure 2, we compare the Dow Jones Industrial Average, a metric of the broad US markets, against the Technology subsector over the past two years:

Figure 2: Tech Stocks are just beginning to outperform

Source: Investing.com; Tom Kennedy.

The pandemic didn’t create this upcoming Tech Boom — but it didn’t hurt

While the COVID-19 pandemic has undoubtedly had huge exogenous impacts on markets and consumer habits, we contend it did not create this evolution. That said, it has clearly sped things up.

One anecdotal way to explain adoption: the customer will eventually embrace a more convenient, safer, cheaper, or better alternative brought on by technology; the exact timing, however, is often difficult to pinpoint. During the pandemic, many people chose, or were strongly encouraged, to purchase food online. This may be an activity they had never engaged in before, and they might not even have had the technology. They end up buying groceries that arrive at their door in good time and good order, or a prepared meal, they enjoy it, and they are now committed online shoppers. According to 451 Research, a boutique eCommerce research firm in the US, eCommerce spending rose by 23% year over year in 2020. These kinds of numbers represent rapid adoption that will not reverse to pre-COVID levels.

We would note the broader market has also done very well during the pandemic, which is helpful to enhancing access to capital for any subsector.

A different way to look at the relative performance of tech versus the broader market is to compare the value of $1 invested (all figures USD for this point) from 23 March 2020 through 22 January 2021. In the tech subsector, that dollar would be worth $1.95 now, relative to $1.67 for a buck in the broader Dow Jones Industrial Average.

This represents an alpha, or relative outperformance, of 16.8% for the tech subsector over that timeframe of approximately 10 months. A couple of caveats: we have chosen the US market for its more robust data, but we believe the US market is an excellent proxy for Canada. We have not normalized for the exchange rate, as everything from elections to interest rate movements and virtually infinite economic minutiae affects currency fluctuations of all durations. In addition, Canada typically lags the US, particularly in market developments; if true, this strengthens our point and further contributes to the benefits of obtaining a wider US investment audience, from both a valuation and an adoption perspective.
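That alpha figure can be sanity-checked directly from the two dollar values:

    tech = 1.95   # value of $1 in the tech subsector, 23 Mar 2020 - 22 Jan 2021
    djia = 1.67   # value of $1 in the Dow Jones Industrial Average
    alpha = tech / djia - 1
    print(f"{alpha:.1%}")   # -> 16.8%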

Figure 3: Value of $1 invested at pandemic market bottom

Source: Investing.com; Tom Kennedy.

The Customer will ultimately drive adoption — and that’s where we believe KABN can excel

Global professional services firm Accenture surveyed 47,000 customers on their digital banking practices in its Accenture 2020 Global Banking Consumer Study. Our interpretation is that customers can be segmented logically by their range of adoption, from Traditionalists to Pioneers, and that they are rapidly adopting digital interactions with a focus on humanized elements (“Congratulations, your loan is approved!”), with security and online threats a major theme. Note that ‘humanized’ and ‘personalized’ are two distinct, important factors. Humanized means counteracting the user’s feeling that their requests are being served by a computer program, while personalized refers to a customized online experience that feels tailored to their needs, attitudes, and utility, rather than fees for a seemingly beige product suite with few alternatives. Well, guess what, incumbents: there are fast-moving disruptors with better and better mousetraps.

Trust in interactions has also fallen precipitously, as seen in the survey’s results, led by banking institutions and insurance but broadly across all major categories. Trust in banking fell to 37% in 2020 (from 51% of customers trusting their bank in 2018), insurers fell to 32% (from 40% in 2018), and online payments to 21% (from 30% in 2018).

KABN is uniquely placed to play a role in all these personalized, customer-centric interactions. The customer stands to gain online safety and confidence, the customer owns and can transparently monetize their own identity, and Liquid Avatar stands to be a customized key, wallet, and armour to open empowered and expanded online presence and functionality.

Two recent transactions — the announced sales of Kount and SoFi — indicate takeout valuation goalposts

Two relevant transactions have been announced thus far in 2021. The first was Kount, whose acquisition by Equifax (EFX:NYSE) for US$640 MM in cash was announced. The second was the acquisition of fintech startup SoFi for US$8.65 BB by one of Social Capital’s special purpose acquisition corporations.

So to valuation: in each of these cases the takeout premiums were impressive, de-emphasizing revenues in favour of what SoFi refers to as “WINNER TAKES MOST.” We know what this means. My thesis is that the firms with best-in-class products positioned to respond to the massive growth in online eCommerce are going to share a huge pie, not take the whole thing. And the value they are positioned to create is driving buyers to pay up for years of as-yet unrealized performance.

Take Kount, a leading provider of anti-fraud solutions to thousands of North American brands.

They have two significant assets: a robust proprietary data library, and a “secret shopper”-type process whereby they test your company’s system for fraud protection. Kount then operates as a high-end security consulting firm with bespoke solutions for clients’ eCommerce suites.

This drove a valuation we estimate at >13x trailing-twelve-months’ (TTM) revenue, a number that demonstrates the value ascribed to their non-revenue performance. They appear cash-flow breakeven for at least another full year, as Equifax points to the value Kount will add over two or more years of future performance as part of the basis of the purchase price. Equifax runs the risk of becoming a dinosaur with every passing day and needs the shot in the arm found in new and promising technology. That said, they have the liquidity to acquire new products and teams rather than build them in-house, so they can quickly leverage them for existing customers and cross-sell for internal use and customer acquisition alike. Equifax is no stranger to security challenges among its own customers, having been dealt a blow in 2017 by a security breach that exposed over 150 million customers’ private data.

Social Capital’s offer to acquire SoFi represents a significant going-public premium. SoFi is widely recognized as a central market disruptor in financial services right now. This transaction represents an acquisition by a Special Purpose Acquisition Company, not dissimilar to an RTO (reverse takeover) method of going public. This is effectively a financial transaction with a lofty premium in spite of the lack of any synergies or strategic development other than going public.

SoFi is a fintech pioneer that is disrupting the banks, being vastly more customer friendly and customer focused, and offering a far lower cost model given the internet infrastructure rather than bricks and mortar banking. The valuation being paid by venture capital firm Social Capital implies 13.6x 2025E forecast Adjusted Net Income on figures released by Social Capital. To provide a nearer term perspective, this implies a valuation of 13.9x 2020E Adjusted Net Revenue.

This means the buyer has already pre-paid for five full years of success, culminating in projected results to be announced in 2026, at a 50%+ premium to current banking institutions’ one-year-forward earnings. In the case of SoFi, the SPAC shell saw its value soar, meaning investors see more room above and beyond the accomplishments baked into the premium. We would also note there are considerable growth milestones yet to be accomplished by SoFi on which its growth relies, notably the receipt of a federal banking license.

Summary

In summation, we believe the market is entering a new period of non-standard valuations for best-in-class technology that delivers on customer expectations. We expect this will continue all through 2021 and into the foreseeable future.

We will leave with a parting thought: Merriam-Webster Dictionary has added the word “crowdfunding” for 2021. Things are developing so fast even the dictionary is trying to stay relevant.

KABN is potentially well-positioned to thrive in this fast-paced environment.

© 2021 Thomas Kennedy. All rights reserved. See Disclosure Statement

Disclosure Statements: Please view the Disclaimer

This report has been prepared on behalf of KABN Systems NA Holdings Corp. and its subsidiary, KABN Systems North America Inc (together, “KABN” or the “Companies”), and is confidential and proprietary. It does not purport to contain all the information that a prospective investor may require in connection with any potential investment in the Companies or related program(s). You should not treat the contents of this report, or any information provided in connection with it, as financial advice, financial product advice or advice relating to legal, taxation or investment matters.

This report does not include all available information in relation to the business, operations, affairs, financial position or prospects of the Companies. No representation or warranty (whether express or implied) is made by the Companies or any of its shareholders, directors, officers, advisers, agents or employees as to the accuracy, completeness or reasonableness of the information, statements, opinions or matters (express or implied) arising out of, contained in or derived from this report or provided in connection with it, or any omission from this report, nor as to the attainability of any estimates, forecasts or projections set out in this report.

This report is provided expressly on the basis that you will carry out your own independent inquiries into the matters contained in the report and make your own independent decisions about the business, operations, affairs, financial position or prospects of the Companies. The Companies reserve the right to update, amend or supplement the information contained in this report at any time in their absolute discretion (without incurring any obligation to do so) and without any obligation to advise you of any such update, amendment or supplement. The delivery or availability of this report shall not, under any circumstance, create any implication that there has been no change in the business, operations, affairs, financial position or prospects of the Companies or that information contained herein is correct after the date of this report.

Neither the Companies nor any of their shareholders, directors, officers, advisors, agents or employees take any responsibility for, or will accept any liability, whether direct or indirect, express or implied, contractual, tortious, statutory or otherwise, in respect of the accuracy or completeness of the information contained in this report, for any errors, omissions or misstatements in or from this report, or for any loss howsoever arising from the use of this report. Any such responsibility or liability is, to the maximum extent permitted by law, expressly disclaimed and excluded.

This report does not constitute, or form part of, any offer or invitation to sell or issue, or any solicitation of any offer to subscribe for or purchase, any securities of the Companies, nor shall it form the basis of or be relied upon in connection with, or act as any inducement to enter into, any contract or commitment whatsoever with respect to such securities. Under no circumstances should this report be construed as a prospectus, advertisement or public offering of securities.

Future Matters

This report may contain reference to certain intentions, expectations, future plans, strategy and prospects of the Companies. Those intentions, expectations, future plans, strategies and prospects may or may not be achieved. They are based on certain assumptions, which may not be met or on which views may differ and may be affected by known and unknown risks. The performance and operations of the Companies may be influenced by a number of factors, many of which are outside the control of the Companies. No representation or warranty, express or implied, is made by the Companies, or any of its shareholders, directors, officers, advisers, agents or employees that any intentions, expectations or plans will be achieved either totally or partially or that any particular rate of return will be achieved.

Given the risks and uncertainties that may cause the Companies actual future results, performance or achievements to be materially different from those expected, planned or intended, you should not place undue reliance on these intentions, expectations, future plans, strategies and prospects. The Companies do not represent or warrant that the actual results, performance or achievements will be as intended, expected or planned.

The information contained in this report includes some statement that are not purely historical and that are “forward-looking statements.” Such forward-looking statements include, but are not limited to, statements regarding our and their management’s expectations, hopes, beliefs, intentions or strategies regarding the future, including our financial condition, results of operations. In addition, any statements that refer to projections, forecasts or other characterizations of future events or circumstances, including any underlying assumptions, are forward-looking statements. The words “anticipates,” “believes,” “continue,” “could,” “estimates,” “expects,” “intends,” “may,” “might,” “plans,” “possible,” “potential,” “predicts,” “projects,” “seeks,” “should,” “would” and similar expressions, or the negatives of such terms, may identify forward-looking statements, but the absence of these words does not mean that a statement is not forward-looking.

The Companies seek Safe Harbor.


1Kosmos BlockID

About Gartner's 5 Key Predictions for IAM and Fraud Detection

Long before the COVID-19 pandemic, employees used their work devices to check their social or news notifications while on break. Now, the lines between personal and work uses are even more blurred with most employees working remotely. Unsurprisingly, this has caused extra challenges for IAM leaders. Cyber attackers use phishing methods to gain personal information from employees, and when employees are on their work computers nearly all day, there are more opportunities for phishers to attack them. Now, IAM leaders must learn new ways to mitigate various issues such as identity proofing in a 100% remote environment.


Elliptic

One of the World's Most Prolific Cybercriminals Has Retired - And May Well Be a Bitcoin Billionaire

Hundreds of millions of cards have been stolen from online retailers, banks and payments companies before being sold for cryptocurrency on dozens of online marketplaces. According to Elliptic’s analysis, the founder of one of the most popular carding marketplaces, Joker’s Stash, has retired having amassed a fortune of over $1 billion.

Every time you make a purchase with a credit or debit card, your card details are transmitted and stored in computer systems. Many of the major hacks of retailers and other companies are motivated by getting hold of these card credentials.

Stolen cards have value because they can be used to purchase high-value items or gift cards, which can then be resold for cash. This process is known as “carding”, and has become a key part of the cybercriminal’s playbook. Carding is very profitable in its own right, but it is also used to help launder and cash-out cryptocurrency obtained through other types of cybercrime.


Joker’s Stash - the King of Carding

Over the past six years, Joker’s Stash rose to become the largest online seller of stolen credit cards and identity data. It is an example of a carding AVC (automated vending cart), which allows large volumes of cards to be sorted, filtered and purchased for Bitcoin with immediate delivery. In the image below you can see these cards categorised and searchable by country, bank, expiry date and other attributes. The going price for a single card on Joker’s Stash ranges from $1 to $150, with those at the upper end coming complete with the cardholder’s name, address and social security number.


Ocean Protocol

Ocean Product Update || 2021

What We’re Building in 2021, and Why

Summary

This post describes Ocean’s 2021 product roadmap. It reflects our shift from achieving early product-market fit (0 →1), to that of traction (1 →N). The roadmap includes fixing gas costs via multiple chains, Data Farming to further incentivize data consumption, fine-grained permissions, and streaming data services. There will be continued improvements in the core Ocean developer tools, Ocean Market, and OceanDAO.

Ocean 2021 Roadmap Summary

Outline

The rest of this post is organized as follows. We describe Ocean’s maturation from targeting product-market fit (0→1) towards traction (1→N). We then describe how we think about traction, then drill down in further and further detail from key performance indicators, to traction hypotheses, and finally to the 2021 product roadmap by quarter. Let’s get started!

From 0→ 1 (The Past)

From Ocean’s inception until the end of 2020, the focus for Ocean was getting product-market fit (“0 → 1”) around unlocking data [1]. In early 2020, we published our 2020 roadmap, and by the end of 2020 we had met it.

Let’s review the major items. In fall 2020, we released Ocean V3. It was a huge release for us. It included two products to publish / buy / consume / stake on datasets / services:

- Ocean Market webapp — for data publishers, consumers, and stakers
- Ocean developer tools — core software libraries & contracts for developers to integrate with other apps or build wholly new apps

In Dec 2020, we released:

OceanDAO — a community-curated grants DAO for growth & long-term sustainability.

The reception to all three products has been enthusiastic. There are tens of teams building on Ocean core software, Ocean Market has hundreds of datasets with more than €5M TVL [2], and OceanDAO has a steady cadence of monthly funding cycles to teams which promise positive value. There’s also steady chat in Telegram and Discord channels surrounding all three products, and a vibrant weekly videochat town hall.

The most important outcome was: People started to treat data as an asset.

We’d designed for this, by wrapping each dataset as an ERC20 token (datatoken), for use in DeFi. MetaMask became a data wallet, and Balancer a data DEX. People launched new data assets as Initial Data Offerings. People bought data assets not just to consume, but also to speculate. People staked on data AMM pools, not just to curate but also to provide liquidity. The perennial question “how do you price data?” got answered definitively: the market prices the data. Trading volume is over 100 times the consume volume; this is a good thing because trading and liquidity are preconditions for markets to form. We just witnessed the birth of the open Web3 Data Economy.
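Since each datatoken is a standard ERC20 token, general-purpose Ethereum tooling can already read it. Below is a minimal, hypothetical sketch using the ethers.js library (v5) rather than Ocean's own developer tools; the RPC URL and addresses are placeholders, not real deployments.

```ts
// Minimal sketch: inspecting a datatoken like any other ERC20 token.
// Assumes ethers.js v5; the RPC URL and addresses are placeholders.
import { ethers } from "ethers";

// Human-readable ABI fragments for the standard ERC20 read methods.
const ERC20_ABI = [
  "function name() view returns (string)",
  "function symbol() view returns (string)",
  "function balanceOf(address owner) view returns (uint256)",
];

async function inspectDatatoken(tokenAddress: string, holder: string): Promise<void> {
  const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.org");
  const token = new ethers.Contract(tokenAddress, ERC20_ABI, provider);

  const [name, symbol, balance] = await Promise.all([
    token.name(),
    token.symbol(),
    token.balanceOf(holder),
  ]);

  // formatEther assumes the common 18-decimal ERC20 convention.
  console.log(`${name} (${symbol}): ${ethers.utils.formatEther(balance)}`);
}
```

This ERC20 compatibility is precisely why MetaMask can act as a data wallet and Balancer as a data DEX without any Ocean-specific changes.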

To 1→ N (The Future)

In short, Ocean has a first toehold in product-market fit (“0→1”). Data is being treated as an asset; this is the foundation going forward. Many users are able to get real value from Ocean’s core product offerings. There’s still room to improve each of these products, for more users. Let’s focus on “more users”. The overall goal is ubiquity — we want Ocean to be used everywhere as key IT infrastructure for the planet, side-by-side with DNS, Bitcoin, and other emerging blockchains and protocols. To achieve ubiquity, we need sustained growth over long periods of time. The label ”traction” encapsulates the goal of growth over time. Therefore the focus for Ocean from 2021 onwards is traction (“1→ N”). Importantly, $OCEAN is engineered to grow as traction grows.

We’ve already started on the traction path, by having rapid iterations with users, to get feedback and translate that into improvements in each of the products (and all the planning, design, construction, etc that happens in between). For example, Ocean Market has seen remarkable evolution in just the few months it’s been live, due to these rapid cycles — from bug fixes to handling fraudulent actors to better-aligning incentives of publishers with stakers.

That’s a near-term approach to improve traction. We can structure this more rigorously for longer time periods.

Ocean Traction Strategy

Here’s our traction strategy (inspired by the book Traction):

1. Have target key performance indicators (KPIs)
2. Have hypotheses of how to grow those KPIs
3. Try the most promising hypotheses, and observe how they do
4. Double down on the hypotheses that do the best and prune the rest
5. Once those experience diminishing returns, repeat the process with new hypotheses

The next three sections outline (i) the KPIs, (ii) hypotheses, and (iii) specific plans respectively.

2021 KPIs

In Dec 2020 and early Jan 2021, the Ocean core team met several times for planning 2021. Traction was the name of the game.

The following traction-oriented KPIs emerged. We aim to grow each of these indicators.

Data consumer/provider traction

- Value of data consumed in USD. This is particularly important — it’s a good indicator of value being created in this new data economy
- Number of datasets / services offered

Staking & DeFi traction:

- Total Value Locked of OCEAN & datatokens
- Number of DeFi protocols with OCEAN & datatokens as first-class citizens

Developer & community traction:

- Number of active high-quality dapps using Ocean
- Number of GitHub contributors, external maintainers & support, active ambassadors
- Number of community members, and content creators
- Value of monthly grants distributed by OceanDAO, while retaining quality

Enterprise/government traction:

- Number of engagements. Specifically, for a dedicated handful of partners, maximize their success towards real-world production deployment

2021 Traction Hypotheses

With these KPIs in hand, we brainstormed on hypotheses of ways to drive them. (And of course we also included many ideas from the past too.) There were many great ideas! We then filtered to our favorites. Here’s the distillation of that, grouped according to three pillars: (I) Bottom-up, (II) Top-down, and (III) DeFi. Let’s elaborate on each pillar.

I. Bottom-up: Ocean Market users, developers, startups. Main activities:

- Drive growth of Ocean Market. Improve user experience (UX), including reducing gas costs. Further optimize incentive structure. Add virality.
- Help data publishers, consumers, and liquidity providers make money. Provide arbitrage opportunities.
- Help people build with Ocean libraries. Improve developer experience (DX). Help teams make money.
- Fund teams for building, outreach, unlocking data. Via OceanDAO (community-curated grants), Shipyard (core-team curated grants), and Data Economy Challenge (prizes). It’s especially important to grow OceanDAO, since it will decentralize Ocean further and position Ocean for long-term sustainability. Details here.
- Integrations. Integrate more datatoken / service types: streaming data, oracles, etc. Deploy on more chains: Eth → many (details here). Integrate to more crypto apps: wallets, DEXes, DAOs, and more. Integrate to existing AI tools like scikit-learn, Anaconda, TensorFlow, OpenMined, and OpenML.

II. Top-down: enterprises, governments. Main activities:

- Business development and services support, starting with POCs but leading to live deployments at massive scale. Help enterprises turn their data into value, including data “on their books” rather than as a liability. Help governments implement data policies with the help of Ocean.
- Service partners. Growing a network of service partners to support these enterprise and government activities (to offload reliance on the core team).

III. DeFi. Main activities:

- Data Farming. It’s similar to liquidity mining but emphasizes consuming data (versus pure speculation). Details here.
- Work closely with DeFi projects. For each, get OCEAN as a first-class citizen, then follow up with datatokens. Conduct many experiments, from stablecoins to insurance, and double down on what works best.
- Data DeFi “snowballs”. Incentivize DeFi applications that make more money with more data, leading to a positive feedback loop on Ocean DeFi usage (“snowball effect”). This includes arb bots, trading systems, and yield farming tools that use AI for decision making.

These traction pillars complement each other. All three have real-world use cases. All three involve users that have data, and want data. Bottom-up has low latency in feedback (hours or days), enabling rapid improvements in the products. Its exponential nature promises huge impact over the long term. Top-down demonstrates use cases on large brand names, typically has large numbers attached, and the added credibility helps influence policymaking around data. While top-down may not scale as well, doing non-scalable things can really help traction in early days. DeFi has some of the best aspects of the others: like bottom-up it’s low latency and exponential in nature, and like top-down it has large numbers attached. And, it’s fun.

2021 Ocean Product Roadmap, by Quarter

With the traction-oriented KPIs and traction hypotheses in hand, the Ocean core teams for product, ecosystem, etc did extensive planning. We reconciled this with the needs of our enterprise collaborators.

Below is what we arrived at, for our 2021 plans, organized quarterly. These plans support the three traction pillars, and in turn (we hope) help to drive the KPIs. We also reserve the right to change our plans based on learnings in the rapidly evolving crypto environment.

The plans focus on Ocean products; dovetailing this there are extensive activities in ecosystem, enterprise, and communications towards the 2021 traction goals.

Q1/2021 Targets

We aim to complete the following by the end of Q1/2021. The highlights are fixing gas costs and Compute-to-Data in Ocean Market.

- Fix gas costs. Gas costs on Ethereum mainnet have become exceedingly high and may not abate. We are addressing this by deploying Ocean contracts and running Ocean Market on >1 chains. Moonbeam and other chains are in the works. Each chain will include an OCEAN bridge to Ethereum mainnet. At least one will be ready by end of Q1.
- Ocean Market: has Compute-to-Data. This enables buying & selling of private data from the Ocean Market webapp itself.
- Ocean Market: allow more than one pool for each datatoken. This helps migrating from one pool to another, and is a first step towards Ocean Market linking to other AMM pools, order book based exchanges, and more.
- Ocean Market: fewer bugs, faster. This includes continued bug fixes, using TheGraph for Total Value Locked (TVL), and more.
- OceanDAO: streamline flow to reduce friction for people creating proposals and voting. Grow the amount of funding, while ensuring quality projects.
- Ecosystem, enterprise and communications activities that dovetail the products. This includes experiments in DeFi initiatives using OCEAN or datatokens.

Q2/2021 Targets

We have the following goals for Q2/2021. The highlights are the first deployment of Data Farming, an OCEAN-backed stablecoin in Ocean Market, and supporting more than one datatoken type.

- Data Farming: first deployment. This program incentivizes data consume volume. We will start by allocating a modest amount of OCEAN, then allocate more in each cycle as data consume volume grows. As we learn, we will refine the program.
- OCEAN-backed stablecoin in Ocean Market. Having prices and liquidity set (optionally) in a stablecoin reduces variance due to fluctuations in OCEAN. A stablecoin backed by OCEAN, like $OceanO by OpenDAO, helps align interests between Ocean Market users and the broader Ocean community.
- Architecture for more than one datatoken type. This refactor includes smart contract changes and a security audit. It allows us to move metadata to the ERC20 contract itself, which we will also do. The added flexibility will make possible a variety of datatoken types, including more fine-grained permissions, fee rate flexibility, and more.
- Multi-chain: improve support by deploying to more chains, and enabling Ocean Market users to see data assets across several chains at once. Over time (not necessarily this quarter) support bridges for datatokens and cross-chain liquidity by leveraging developments in cross-chain technology.
- OceanDAO, Ecosystem activities: continue growing and improving.

Q3/2021 Targets

We aim to complete the following by the end of Q3/2021. The highlights are ramping up Data Farming, fine-grained permissions, and supporting more data services.

- Ramp up Data Farming. We will build on the learnings from Q2 to further scale up the OCEAN allocated, to further catalyze data consume volume.
- 3rd Party Markets: fine-grained permissions using Role-Based Access Control (RBAC) while retaining the datatoken ERC20 mental model. Helps marketplace instances to control which actors can publish assets in that marketplace instance, view assets, and consume data assets. This is for 3rd party markets; Ocean Market will remain as open-ended as possible.
- Data Services: support two more data services as data assets. Possibilities include WebSockets (streaming data), Chainlink, TheGraph, Streamr, GraphQL streaming data, and more; we will prioritize as we go.
- Multi-chain, OceanDAO, Ecosystem activities: continue growing and improving.

Q4/2021 Targets

We aim to complete the following by the end of Q4/2021. The highlights are better staking and helping the community to monetize better.

- Better staking. People can stake OCEAN in data pools without having to buy the data asset in the process. This will use Balancer V2. In general, it will better align the incentives of publishers, consumers, and liquidity providers / stakers / curators.
- Help community monetize. For people running their own marketplaces (as dedicated apps, or in wallets, etc), enable potential for fees in publishing, swapping, staking, and consuming, with flexibility on setting the fees.
- Market: trustless & convenient URLs, to retain Ocean Market’s current convenience while removing its brief custody of URLs.
- Data Farming, Data Services, Multi-chain, OceanDAO, Ecosystem activities: continue growing and improving.

Conclusion

Up until the end of 2020, our goal was to get initial product-market fit for Ocean (0→1). Now that we have a foothold on that, our focus going forward is traction (1→N). This article presented our KPIs, our hypotheses of how to achieve those, and finally specific product plans towards implementing hypotheses.

Appendix: Roadmap with Dependencies

The following image illustrates the roadmap presented above, but adding dependency information. It highlights critical paths, and streams of work by color.

Notes

[1] “0 to 1” is a concept popularized by Peter Thiel in his book Zero to One: Notes on Startups, or How to Build the Future.

[2] For an up-to-date value of Total Value Locked, please refer to the stats at the bottom of the https://market.oceanprotocol.com/ page.

Updates

Feb 15, 2021. Added “Roadmap with Dependencies” section.
Feb 17, 2021. Removed “Ratings & Comments” Q3 feature. Re-prioritized for later.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

Ocean Product Update || 2021 was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KYC Chain

How to Use KYC-Chain for Your Upcoming Token Sale

As many regulators are now beginning to treat all tokens as securities, choosing the right KYC tech is an important step for any company considering an ICO. This article outlines the benefits of implementing the right KYC tools and what to look for in your KYC provider.

The post How to Use KYC-Chain for Your Upcoming Token Sale appeared first on KYC-Chain.

Peer Ledger

More secure than Personal Health Records

Encryption and anonymized data are just a few of the features that make MIMOSI Health more secure than Personal Health Records and Electronic Medical Records.

One year into the COVID-19 pandemic and the healthcare system is still plagued by technical glitches, fragmented systems and difficult to use software. Enter: MIMOSI Health.

MIMOSI Health was developed with the goal of preventing community spread of COVID-19. It is a digital platform providing real-time data on disease spread, testing, recovered cases and vaccinations. MIMOSI Health unifies stakeholders in public and private sectors by bridging disparate systems and providing the big picture data needed to make timely, informed decisions.

Focused on privacy and security

MIMOSI Health supports safe return to work by providing digital, verifiable proof of testing or vaccination. The simple, intuitive design makes it easy for you to book appointments for testing and vaccinations, or upload the results of at-home tests. Simply access your test results via email using 2-factor authentication.

MIMOSI Health only collects data relevant to COVID-19 testing and vaccination, meaning your personal medical history is never at risk. All demographic data is immediately anonymized before being aggregated into dashboards. Role-based access control (RBAC) limits the data visible to authorized users based on their job functions. User roles are customizable to accurately reflect testing and vaccination workflows.

MIMOSI Health is not a Personal Health Record (PHR) or Electronic Medical Record (EMR). It is a standalone solution that can easily integrate with existing health management systems. MIMOSI Health supports a holistic approach to the fight against COVID-19 by creating a seamless and secure way to keep everyone informed and protected.

MIMOSI Health vs. PHRs & EMRs

*Authorized users include:

Regulators can access visual analytics for current/recovered cases, number of appointments booked and much more. Regulators can also access the MIMOSI Health Heat Map, which provides a concise overview of where positive test cases are self-isolating.

Nurses and Contact Tracers can access the patient’s contact information and testing history.

Community Testers record patient information and schedule testing appointments, and are authorized to access this information.

Lab Technicians can access a patient’s testing history.

Patients can securely book appointments for testing or vaccinations and record the results of at-home tests on the system. Patients access test results and vaccination status in real-time via protected 2-factor authentication.
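To make the role model above concrete, here is a minimal, hypothetical RBAC sketch in TypeScript. The role and permission names are illustrative only; they are not MIMOSI Health's actual data model.

```ts
// Hypothetical sketch of the role-based access control described above.
// Role and permission names are illustrative, not MIMOSI Health's real model.
type Permission =
  | "view:analytics"
  | "view:heatmap"
  | "view:contact-info"
  | "view:test-history"
  | "record:patient-info"
  | "book:appointment";

const rolePermissions: Record<string, Permission[]> = {
  regulator: ["view:analytics", "view:heatmap"],
  nurse: ["view:contact-info", "view:test-history"],
  contactTracer: ["view:contact-info", "view:test-history"],
  communityTester: ["record:patient-info", "book:appointment"],
  labTechnician: ["view:test-history"],
  patient: ["book:appointment", "view:test-history"],
};

// Returns true only if the role's job function includes the permission.
function canAccess(role: string, permission: Permission): boolean {
  return rolePermissions[role]?.includes(permission) ?? false;
}

// Example: a lab technician can see testing history but not contact details.
console.log(canAccess("labTechnician", "view:test-history")); // true
console.log(canAccess("labTechnician", "view:contact-info")); // false
```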

Stay connected. Stay immune. Stay safe.

Contact us to request more information or schedule a demo. We are ready to help you!


digi.me

TechUK’s plan to accelerate innovation in healthcare is a welcome way forward

The last 12 months have spurred innovation in healthcare like no other as the world battles to get on top of the coronavirus pandemic.

The scale of achievement, from vaccine development through to contact-tracing apps, has highlighted the potential of technology to transform healthcare – and pointed to even greater possibilities in the future.

Recognising this, and to build on this existing momentum, trade association techUK has published a Ten Point Plan for Healthtech, which sets out ten key steps to improve individual experience and outcomes, as well as the UK’s reputation in this area.

Continue reading TechUK’s plan to accelerate innovation in healthcare is a welcome way forward at Digi.me.


Microsoft Identity Standards Blog

What's New in Passwordless Standards, 2021 edition!

Hi everyone and welcome to chapter 14 of 2020! It’s been a little while since we talked about standards for passwordless, so we’re excited to tell you about some new enhancements and features in FIDO2 land that you'll start seeing in the wild in the next few months!


Specification Status


The Web Authentication API (WebAuthn) Level 2 specification is currently a Candidate Recommendation at the W3C. "Level 2" essentially means major version number 2.


Version 2.1 of the Client to Authenticator Protocol (CTAP) specification is a Release Draft at the FIDO Alliance. This means the spec is in a public review period before final publication.


These new draft versions are on their way to becoming the next wave of FIDO functionality (as of the writing of this blog, we support Level 1 of WebAuthn and CTAP version 2.0). Here’s what we think is especially fun about WebAuthn L2 and CTAP 2.1.


Enterprise Attestation (EA)


Enterprise Attestation is a new feature coming as part of WebAuthn L2 and CTAP 2.1 that enables binding of an authenticator to an account using a persistent identifier, similar to a smart card today.


FIDO privacy standards require that "a FIDO device does not have a global identifier within a particular website" and "a FIDO device must not have a global identifier visible across websites". EA is designed to be used exclusively in enterprise-like environments where a trust relationship exists between devices and/or browsers and the relying party via management and/or policy. If EA is requested by a Relying Party (RP) and the OS/browser is operating outside an enterprise context (personal browser profile, unmanaged device, etc.), the browser is expected to prompt the user for consent and provide a clear warning about the potential for tracking via the persistent identifier being shared.


Authenticators can be configured to support Vendor-facilitated and/or Platform-managed Enterprise Attestation. Vendor-facilitated EA involves an authenticator vendor hardcoding a list of Relying Party IDs (RP IDs) into the authenticator firmware as part of manufacturing. This list is immutable (aka non-updateable). An enterprise attestation is only provided to RPs in that list. Platform-managed EA involves an RP ID list delivered via enterprise policy (ex: managed browser policy, mobile application management (MAM), mobile device management (MDM)) and is enforced by the platform.
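To illustrate, here is roughly what requesting an enterprise attestation looks like from the RP side in the browser. This is a hedged sketch only: the RP ID, user fields, and algorithm list are placeholder values, and in a real deployment the challenge is generated and the response verified server-side.

```ts
// Sketch: a WebAuthn registration ceremony requesting enterprise attestation.
// "acme.example" and the user fields are placeholders for a managed deployment.
const credential = await navigator.credentials.create({
  publicKey: {
    rp: { id: "acme.example", name: "ACME Corp" },
    user: {
      id: new TextEncoder().encode("user-1234"),
      name: "alice@acme.example",
      displayName: "Alice",
    },
    challenge: crypto.getRandomValues(new Uint8Array(32)), // server-generated in practice
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
    // New in WebAuthn L2: request the uniquely identifying attestation.
    // Outside a managed enterprise context, the browser should warn the user.
    attestation: "enterprise",
  },
});
```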


Spec reference:

CTAP 2.1 - Section 7.1: Enterprise Attestation
WebAuthn L2 - Section 5.4.7: Attestation Conveyance Preference


Authenticator Credential Management and Bio Enrollment


Credential Management is part of CTAP 2.1 and allows management of discoverable credentials (aka resident keys) on an authenticator. Management can occur via a browser, an OS settings panel, an app or a CLI tool.


Here's an example of how the Credential Management capability is baked into Chrome 88 on macOS (chrome://settings/securityKeys). Here I can manage my PIN, view discoverable credentials, add and remove fingerprints (assuming the authenticator has a fingerprint reader!) and factory reset my authenticator.


Clicking on "Sign-in data" shows the discoverable credentials on the authenticator and allows me to remove them. This security key has an Azure AD account and an identity for use with SSH.


Bio Enrollment allows the browser, client, or OS to aid in configuring biometrics on authenticators that support them. This security key has one finger enrolled. I can either remove the existing finger or add more.


Here's an example of authenticator credential management via a CLI tool, ykman from Yubico.


Spec references:

CTAP 2.1 - Section 5.8: Credential Management

CTAP 2.1 - Section 5.7: Bio Enrollment


Set Minimum PIN Length and Force Change PIN


CTAP 2.1 allows an RP to require a minimum PIN length on the authenticator. If the existing PIN does not meet the RP’s requirements, a change PIN flow can be initiated.


An authenticator can also be configured with a one-time use PIN that must be changed on first use. This is an additional layer of protection when an authenticator is pre-provisioned by an administrator and then needs to be sent to an end user. The temporary PIN can be communicated to the end user out of band. We see this being used in conjunction with Enterprise Attestation to create a strong relationship between an authenticator and a user.
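On the RP side, the related minPinLength WebAuthn extension can be requested at registration so the authenticator reports its current minimum PIN length inside the attestation; the RP can then reject registrations from authenticators configured below its policy. A hedged sketch follows: the RP details are placeholders, and the extension is only honored where policy permits the RP to receive it.

```ts
// Sketch: requesting the CTAP 2.1 minPinLength extension at registration.
// The reported value is returned inside the attestation object, so it is
// checked server-side rather than in getClientExtensionResults().
const options: CredentialCreationOptions = {
  publicKey: {
    rp: { id: "acme.example", name: "ACME Corp" }, // placeholder RP
    user: {
      id: new TextEncoder().encode("user-1234"),
      name: "alice@acme.example",
      displayName: "Alice",
    },
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
    // Cast needed where lib.dom typings lag behind CTAP 2.1 extensions.
    extensions: { minPinLength: true } as any,
  },
};
const credential = await navigator.credentials.create(options);
```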


Spec reference:

CTAP 2.1 - Section 7.4: Set Minimum PIN Length


Always Require User Verification (AlwaysUV)


AlwaysUV is part of CTAP 2.1 and allows the user to configure their authenticator to always prompt for user verification (PIN, biometric, etc), even when the Relying Party does not ask for it. This adds an extra layer of protection by ensuring all credentials on the authenticator require the same verification method.
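As a quick illustration, consider a sign-in where the RP explicitly opts out of user verification; with AlwaysUV enabled, the authenticator still collects the PIN or biometric locally. A minimal sketch with a placeholder RP ID:

```ts
// Sketch: an authentication ceremony where the RP does not ask for user
// verification. An AlwaysUV-configured authenticator verifies the user anyway.
const assertion = await navigator.credentials.get({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // server-generated in practice
    rpId: "acme.example", // placeholder relying party
    userVerification: "discouraged", // RP opts out; AlwaysUV overrides on-device
  },
});
```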


Spec reference:

CTAP 2.1 - Section 7.2: Always Require User Verification


Virtual Authenticator DevTool


This one is not tied to updates of either specification but we love it and wanted to share! Chrome and Edge (version 87+) now include a virtual authenticator as part of DevTools. It started as a Chromium extension back in 2019 and is now native! Oh, and the code is on Github!


It is a great tool for testing, debugging and learning! Try it with one of the awesome WebAuthn test sites: Microsoft WebAuthn Sample App, WebAuthn.io, Yubico WebAuthn Demo.


To access the tool, open Developer Tools (F12 or Option + Command + I), click the Menu icon on the top right (…), then More tools and WebAuthn.


Enabling the virtual authenticator environment will allow you to create a new authenticator by picking a protocol (CTAP2 or U2F), transport (USB, Bluetooth, NFC or internal), resident key (discoverable) and user verification support.


As new credentials are created, you’ll see them listed and the sign count will increase as the credential is used.


Want to know more? Here’s a great blog post by Nina Satragno from the Chrome team over at Google, who created this amazing DevTool!

How we built the Chrome DevTools WebAuthn tab


Wrap Up

That rounds out the major features we believe will have the most impact. Here are a few other enhancements and features that are important to mention!

- cross-origin iFrame usage
- Apple’s anonymous platform attestation format
- Large blob storage extension to support storing a small chunk of encrypted data with a credential, to support use cases like SSH certificates


If you’d like to hear more about any of these enhancements/features (or anything else identity related, let's be honest), leave us a note in the comments.


Thanks for reading!


Tim Cappalli | Microsoft Identity | @timcappalli


Thursday, 11. February 2021

Finicity

Why Financial Data Aggregators Should Comply with FCRA Requirements

Consumers can leverage their financial data to improve their financial health, gain access to financial services, and enhance control over their finances. In order to benefit from these financial outcomes, consumers provide access to their financial data to end users through data access providers. If data access providers truly want to empower consumers, they must protect that data and ensure that control of the data remains firmly in the hands of the consumer. And a commitment to protect data in word only isn’t enough. 

Data access providers that share financial data they have assembled for the purpose of lending, insurance, and employment must adhere to the Fair Credit Reporting Act (FCRA). Operating as a Consumer Reporting Agency (CRA) is the best way to ensure a consumer-first lending approach that both protects and empowers borrowers. Anything less is just words.

Let’s look at this in greater detail. 

The FCRA and Financial Data Sharing

The Fair Credit Reporting Act protects consumers by establishing and protecting the right for individuals to dispute inaccurate data in consumer reports and get those errors fixed. In order to maintain transparency and ensure accuracy, the Act also requires that consumers have access at any time to any personal financial data provided for credit or insurance eligibility decisioning. 

While FCRA has historically applied to credit bureaus and certain specialty CRAs, like tenant screening and medical information companies, the Act has a broad scope of coverage and is not limited to traditional credit reporting or a narrow set of CRAs. Third-party data access providers that power financial data permissioning between lenders and consumers use a different type of data sharing model for which FCRA compliance provides an equally critical component of protecting consumers when they access credit and other financial services.

As it stands, the FCRA applies to consumer reports that CRAs provide to third parties for certain permissible purposes described in the FCRA, such as determining a consumer’s eligibility for credit or insurance. Consumer reports can only be provided and used for these permissible purposes. If data access providers, also known as data aggregators or data agents, are assembling a consumer’s financial data and sharing it for the purpose of credit or insurance decisioning, shouldn’t those providers be considered CRAs, and shouldn’t the FCRA apply to their data sharing?

In order to protect and empower consumers, we think so.

Empowering Consumers with Compliant Data Sharing

FCRA compliance is the only sure way to guarantee fairness, accuracy, and transparency when data access providers assemble consumer financial data and provide it to lenders or insurers to make credit or insurance decisions. 

A current regulatory interpretation of the FCRA suggests that an organization does not become a CRA when it forwards financial data to a third party at a consumer’s request because they are simply engaging in “permission-based sharing” on behalf of the consumer. Finicity believes this interpretation of the FCRA was meant to address different circumstances, such as where a mortgage broker forwards a consumer’s application and credit report to prospective mortgage lenders at the consumer’s request. This and other “conduit”-like functions fall outside the FCRA. 

We do not believe, however, that this interpretation was intended to cover situations where a data access provider or other party “assembles” consumer data to provide to financial data users. Such a broad reading of the “permission-based sharing” interpretation would run counter to the purpose of the Act and undermine the protections the FCRA was created to uphold. 

Why can’t data access providers simply promise protections to consumers? Such assurances are of course important, but can vary from provider to provider. Consumers are best served when all data access providers are held to a common standard of consumer protection. Operating as a CRA requires that data access providers adhere to the FCRA and provide specific dispute and disclosure processes that enable consumers to access and view their data, dispute any errors, and understand how their data is being used. 

If a data access provider is delivering consumer data it has assembled to creditors for use in credit decisioning, and is not functioning as a CRA, it is not adhering to the FCRA and not protecting consumers as well as it could.

With open banking and digital financial services continuing to pick up speed, it’s more crucial than ever that the industry demonstrate that the consumer-permissioned data sharing process is conducted fairly, accurately, and with transparency for the consumer. Those positive outcomes follow when a data access provider, in appropriate circumstances, is functioning as a CRA and is legally required to adhere to the FCRA. 

Only when such protections are in place can consumers reliably enjoy the empowerment and improved financial outcomes they deserve. And the financial services industry can similarly benefit from the growth and innovation that comes from the increased acceptance of leveraging consumer-permissioned financial data for the benefit of consumers. To learn more about the benefits of FCRA compliance for data access providers, check out our whitepaper.

The post Why Financial Data Aggregators Should Comply with FCRA Requirements appeared first on Finicity.


Finledger: Finicity talks data aggregators’ treatment under the Fair Credit Reporting Act

Rebecca Ayers sits down with Finicity CEO Steve Smith who explains how FCRA regulation simultaneously protects the consumer and open banking innovation.

Read the full article. 

The post Finledger: Finicity talks data aggregators’ treatment under the Fair Credit Reporting Act appeared first on Finicity.


Finicity Promotes FCRA Accountability for Financial Data Aggregators

Finicity releases detailed analysis affirming that data aggregators should be deemed CRAs and subject to FCRA requirements

SALT LAKE CITY, Utah – February 11, 2021 – Finicity, a Mastercard company and leading provider of open banking solutions, has produced a detailed whitepaper with analysis showing why the Fair Credit Reporting Act (FCRA) should apply to providers of data aggregation services (“data agents”) in certain circumstances. 

The FCRA protects consumers by establishing and protecting the right for individuals to dispute inaccurate data in consumer reports and get those errors fixed. The growth of consumer-permissioned data sharing in lending, insurance, and employment means that FCRA compliance will become a critical component of consumer protection. 

“As Open Banking puts consumers in control of their financial data and gives lenders a more complete view of consumers’ creditworthiness, it’s critical that the sharing of financial data is conducted fairly, accurately, and with the utmost transparency,” said Steve Smith, Co-founder and CEO of Finicity. “Leveraging the protections of the FCRA will provide a vital regulatory framework that gives everyone in the financial ecosystem confidence in all of these areas, as well as manageable, familiar compliance standards for the data agents behind growing aggregation capabilities for underwriting use cases.”

Lenders have increasingly relied on new forms of financial data, such as information about a consumer’s bank account transactions, which can enable better, more effective credit-decisioning for lenders. This data supplements the traditional credit score to expand the lenders’ view of a borrower’s financial health, and ultimately stands to increase financial inclusion. Not only does this financial data give lenders a more thorough credit review process, it also empowers consumers with control over their own financial data, which they can grant or revoke access to. 

“With the emergence of third-party data aggregators and their ability to provide financial data for the purposes of credit decisioning, FCRA compliance is essential to protecting the consumer,” said Chi Chi Wu, Staff Attorney with the National Consumer Law Center.

Finicity believes this new credit process also necessitates that it function in concert with fair reporting principles: data should be accurate, consumers should have access to their own data, and they should know when and how their data is used. 

In December 2019, the Consumer Financial Protection Bureau (CFPB) and the federal banking agencies released an Interagency Statement on the Use of Alternative Data in Credit Underwriting. In the Interagency Statement, the agencies recognized the consumer benefits of considering cash flow data in credit decisions and noted how consumers “can expressly permission access to their cash flow data”—through data agents—to enhance transparency and consumer control. 

The CFPB is expected to propose new rules around consumer-authorized access to financial data in early 2021. 

Finicity is releasing a detailed white paper “FCRA and Data Agents” with analysis of the mechanics and benefits of FCRA compliance for data aggregators. The full white paper can be downloaded at:  https://info.finicity.com/fcra-data-agents/

To learn more about Finicity, its data services, and its commitment to fast, reliable and high-quality data, visit www.finicity.com

About Finicity

Finicity, a Mastercard company, helps individuals, families, and organizations make smarter financial decisions through safe and secure access to fast, high-quality data. The company provides a proven and trusted open banking platform that puts consumers in control of their financial data, transforming the way we experience money for everything from budgeting and payments to investing and lending. Finicity partners with influential financial institutions and disruptive fintech providers alike to give consumers a leg up in a complicated financial world, helping to improve financial literacy, expanding financial inclusion, and ultimately leading to better financial outcomes. Finicity is headquartered in Salt Lake City, Utah. To learn more or test drive its API, visit www.finicity.com.

The post Finicity Promotes FCRA Accountability for Financial Data Aggregators appeared first on Finicity.


Tokeny Solutions

Focus on Tokenization

Our CEO Luc Falempin spoke at ‘Focus On Tokenisation’, organized by Luxembourg For Finance. The event brings together experts from across financial services. The Luxembourg regulator, Commission de Surveillance du Secteur Financier (CSSF), gave their opening remarks on the new regulation regarding security tokens. Watch this insightful panel to find out more.

The post Focus on Tokenization appeared first on Tokeny Solutions.


Authenteq

The 2021 Landscape: What’s Happened So Far

The post The 2021 Landscape: What’s Happened So Far appeared first on Authenteq.

Trinsic (was streetcred)

New Tools to Support Production Deployments

Since launching the first production-grade platform for verifiable credentials, we’ve seen over 1,000 innovators from all over the world build solutions leveraging the verifiable credential standard. These innovators collectively have deployed production verifiable credentials to over 80,000 people across five continents. These teams trust Trinsic to maintain a robust platform upon which they can depend. Today we’re announcing two new resources to support these and future production deployments: an API status page and a community Slack channel.

Status page

Dependability, or being as secure, scalable, and available as we can be, and transparency, or being open with our partners about our platform’s capabilities and limits, are two of our core values. The best way we know to demonstrate our dependability in a transparent way is to host a public API Status page.


A status page has been a highly-requested feature for some time. You can find the new status page at: status.trinsic.id. We began testing the status page a few months ago using several tools that interact with various API endpoints across our multiple APIs. This ensures that each kind of endpoint (for example, endpoints that access wallet storage and ones that write to the ledger) is sufficiently covered.


View historical uptime: Using the status page, you can see the last 90 days of uptime of all our externally-facing services. You can also inspect individual incidents and view incident reports.


Be notified of incidents: By clicking the “subscribe” button in the upper-left of the screen, you can have any downtime or incidents trigger a notification to your email or Slack workspace.

Recent outages at Slack, Google, Facebook, and more remind us that no company is immune to incidents. However, our partners generally expect a few things from us: 1. that Trinsic will proactively address potential issues to keep incidents from occurring; 2. that if incidents do occur, we respond immediately; 3. that we are transparent and communicative throughout the process; and 4. that we remain within our SLA.

Slack community

As Trinsic has grown in popularity among the SSI developer community, several Trinsic User Groups have started organically. While we encourage this, we also want to give these communities an official home. That’s why we’ve created a Slack workspace just for the Trinsic community. It’s free for anyone to join and get support from the community. Our team is also present to support teams on paid plans in private channels where we can offer more dedicated support for production deployments.

Click here to join the Trinsic Community on Slack!

Earning trust

In order to enable trust online, we know we must first earn the trust of our customers. We are committed to doing all we can to be the best and most reliable option for any organization wishing to implement verifiable credentials, and we will continue to build the tools you need to be successful in production deployments.

The post New Tools to Support Production Deployments appeared first on Trinsic.


KuppingerCole

Overcoming Identity Governance Challenges with ForgeRock Autonomous Identity

by Martin Kuppinger

Most organizations see the value of identity governance and administration (IGA). However, they recognize that it has its challenges in practice, with certain tasks being complex and cumbersome. As a result, existing IGA solutions are faltering. Dynamic businesses require new approaches with a high degree of automation. Artificial intelligence (AI) and machine learning (ML) bear the promise of delivering such automation to complex tasks. ForgeRock Autonomous Identity implements such capabilities for ForgeRock and third-party IGA platforms, ensuring that more organizations can use them effectively and gain maximum benefit from them.


HID Global Authentication Platform

by John Tolbert

HID Global offers robust and highly secure solutions for identity and access management, including physical access controls, smart identity card manufacturing and credential issuance, biometric authentication, and mobile/remote identity proofing. HID Global’s Authentication Platform combines each of these elements into a packaged service that is suitable for B2B, B2C, B2E, and G2C use cases.


Privileged Access Management for DevOps

by Paul Fisher

Privileged Access Management (PAM) is an important area of access risk management and identity security in any organization. Privileged accounts have traditionally been given to administrators to access critical data and applications. But, changing business practices, hybrid IT, cloud and other aspects of digital transformation has meant that users of privileged accounts have become more numerous and widespread. One area in sharp focus is DevOps support which has become essential to many organizations looking to become more responsive and innovative. Application developers and other agile teams increasingly need privileged access to essential tools, and several PAM vendors are responding to this demand.


My Life Digital

MyLife Digital MD, J Cromack, Named as Top Influencer in Data and Analytics

Press Release

J Cromack, Managing Director of MyLife Digital, has been named as one of DataIQ’s top 100 most influential people in data. Cromack was acknowledged for his ongoing dedication to protecting data rights.

The DataIQ power table, developed in partnership with data visualisation expert Tableau, has been used to track the rise of data scientists, data governance professionals, and chief data and analytics officers since 2014.

Cromack was recognised as a key influencer in the data space, ranking amongst the most prominent data enablers on the country’s first and only fully-curated list of the most powerful data and analytics practitioners. “As an enabler, my focus is on helping my clients maintain the utility of, and build, their first party data assets and ensure the data is fully compliant with global data protection regulations,” says Cromack of his role in data protection and privacy.

Together, both Cromack and MyLife Digital are fighting to change the way that consumer data is handled, advocating for greater transparency and control on behalf of customers. As part of this, MyLife Digital has created a scalable consent and preference management platform that empowers consumers to make smarter, more informed decisions over how their data is used, while also supporting businesses in their efforts to build stronger relationships and instil trust in consumers through secure data management.

He said, “I’m delighted to be included in the DataIQ 100, it is an incredible accolade and excellently reflects the amazing work that the team at MyLife Digital have been doing to help organisations adopt more ethical frameworks to use personal data. We’re very proud to have built great technology that helps businesses grow trust through data understanding while empowering consumers to make decisions on how their data can and can’t be used.”

The coveted spot on the DataIQ 100 list comes just months after Cromack’s nod at the 2020 DataIQ awards, where the data expert was named Privacy and Trust Champion of the year. He was recognised for his ongoing support in protecting individual data rights in the digital world, and for his appreciation of the importance of building trust between organisations and their customers. Cromack was also a runner up in the Data for Good category.

As one of the country’s top data influencers, Cromack is planning to use his status to better promote the importance of privacy UX and data ethics, which the MyLife Digital team predict will be hot topics throughout 2021. Data ethics is a key area of interest, and Cromack and his team are currently developing plans to educate businesses on this subject, shifting away from tracking technologies and moving closer towards authenticated journeys. MyLife Digital is on track to release a Data Ethics whitepaper in the spring.

The post MyLife Digital MD, J Cromack, Named as Top Influencer in Data and Analytics appeared first on MyLife Digital.


ValidatedID

VIDsigner electronic signatures integrated with Sage

Available for Sage X3, Sage Connected Offices, and Sage 200c. VIDsigner electronic signatures allow to send documents to be signed without leaving Sage.

Improve productivity with electronic signatures and SAP

The integration of VIDsigner e-signatures with SAP allows you to sign your documents within SuccessFactors, Business One, By Design, S/4HANA and ECC.

auth0

Auth0 Launches Virtual Summer Internships and Junior Engineer Program

Introducing these two new programs to bring new and diverse talent to our organizations.

Rethinking Your Workforce Strategy: A Look at Culture in 2021

Auth0’s SVP of People on how to foster a people-driven culture

Meeco

Our next UI/UX Designer could be you!

Before we dive into explaining this amazing opportunity, we don’t want to waste your time. So, if you don’t have Australian citizenship or permanent residency, or if you are representing a recruitment agency or an offshore/provider, then this is not for you. We understand you may be an awesome candidate, or have access to great candidates, but we will not consider such applications at this time. Meeco is looking for a Graduate or Junior UX/UI Designer for our Australian team, where you can help shape the API-of-Me. What is the API-of-Me you might ask? It is an emerging platform that enables the collection, protection and exchange of personal data. Our main objective is to provide a suite of developer tools and APIs to companies and organisations to allow them to build privacy first solutions We are passionate about building a new economy where data is not refined, as is done with oil, for power and profit, of a few mega corporations. Rather, we are committed to finding a new equilibrium where people and organizations enter a win-win relationship and build a new, more meaningful relationship. Read our manifesto to learn more. Don’t hesitate to apply, no matter what your background is. We would like to build a diverse team, and we welcome everyone. Our team is spread across Australia and Belgium, and collectively we come from diverse cultural backgrounds, speaking a range of languages, each with a unique personal data story. About the Role This is an exciting opportunity to start your UX career with a technology start-up working to improve the world! Meeco is a growing team of 25 people based in Australia, Belgium and the UK and we are looking for a junior UX/UI designer to join our Adelaide based design team. This role is ideally suited to someone who doesn’t necessarily have commercial experience in the UX field, but is excited and passionate about user experience and how design can help humanise technology. You will be working closely with our Chief of Design, and you will be involved in the full design process. The following will help you understand what you’ll be doing on a day to day basis: Help to gather and evaluate user requirements Conduct UX research Brainstorm and workshop design solutions Contribute to developing user personas, customer journey maps, storyboards and user flows Contribute to wireframing and high-fidelity designs Contribute to creating design-based prototypes We use a variety of industry standard tools including Figma. It is not a requirement that you have experience in using any of these tools, but a willingness to learn and self-directed education is required. About You We are looking for someone who is either studying or has recently finished a course in UX or similar fields. You will have worked on at least two UX projects that you are proud of and be willing to present them and explain your decisions (these can be personal projects or student projects). We build applications for the internet of tomorrow, a place where personal data is placed right at the center, protected start to finish. This means we often tackle questions that have no established technical or design solutions. We don’t expect you to know everything from the start and we have great mentors to show you the ropes. An open mind, a desire to learn and being able to incorporate constructive feedback is essential. Meeco is an environment with more questions than answers. If you are drawn to original problems, which require imaginative solutions, you’ll feel right at home. 
This does require you to be open for and comfortable learning new things and develop proof-of-concepts on short timelines, identifying the tasks that matter the most. The following is our wish-list. It is not necessary that you have all this experience, butthe more you have or can pick up quickly will increase your chances of success in the role: Be able to demonstrate a genuine interest in UX/UI, either through your study or personal projects Experienced in using design tools such as Figma, Sketch, Affinity or Adobe design suite You have experience in using design tools such as Figma, Sketch, Affinity or Adobe design suite Keeping up to date with the latest UI/UX design and usability trends and techniques Understand what a design system is Understand and have experience with web development technologies such as Javascript and CSS This is a job where we support remote work. However, we do have an office in Adelaide. Given the mentoring aspects of the role, we would like you to be in the office at least three days a week. At other times you are free to work where you work best. We have team members in Adelaide, Melbourne and Sydney in Australia, London (UK) and spread out in Belgium. An important reality is that this can often mean early mornings or long days.  We have flexible hours, so you can design your day around being an early bird or night owl. For six months of the year the time difference with Europe can feel brutal, and then the other six months when Europe is on summertime it is much easier. However, being able to work across time zones and be self directed is a critical aspect of this role. We value people who can work independently, are able to manage themselves, are supportive to each other and not afraid to ask for help when needed. We love you asking for help! What We Offer We pay a competitive start-up package based on your experience and seniority in line with the existing members in the team. As we are still building our foundation team, additional incentives such as Options are available for the right person. What We Have Achieved We are early pioneers in the personal data economy and have been campaigning for digital rights, data portability and data control since 2012. You can read more on our achievements and announcements on our blog.  Funding; our founder bootstrapped the company for the first two years before raising private equity. We have runway and revenues, so the focus for us now is building awesome products Brand & Reputation: We launched our first product into the market in 2014 and have built a strong brand in the personal data community.   Search for Meeco on-line and you will see our consistent focus in helping build a global personal data economy. If you would like to know more about our crazy, wonderful, roller-coaster start up journey check out this recent keynote at the MyData Conference in Helsinki, or these recent podcasts such as this on entrepreneurship or this on identity featuring our founder, Katryna Dow. Awards: We have won eight international awards in Australia, UK, Netherlands and Germany for personal data management, identity, innovation and FinTech services. Market: In December 2019 we launched our secure data vault inside KBC’s (Belgium) banking app and next month we are launching a media platform for kids called mIKs-it These are just two great examples of how enterprises and other start-ups are using our technology to developer better Privacy and Security by Design personal data solutions. 
Innovation: through 2020 we doubled our engineering team and took on some really interesting blockchain/distributed ledger projects, which we are currently productising to launch throughout 2021.

How to Apply

By reading this you have reached the end, so if you feel you are the right person for the job, all you have to do is contact us at jointheteam@meeco.me. Please include what you like about the role, why you want to join, and any questions, along with your resume and confirmation of your Australian citizenship or residency.

Last time we advertised we had hundreds of applications. Whilst we were delighted, it took us a few weeks to consider the long list and juggle interviews between Australia and Europe. If you meet the criteria, we will be in touch as soon as possible. We’re all looking forward to meeting you soon!

The post Our next UI/UX Designer could be you! appeared first on The Meeco Blog.


Spruce Systems

Credible v0.1 is live


Credible v0.1 pushed to GitHub with its core functionality enabled, documentation, and is moving towards stability for white-labeling.

Today, Spruce is open-sourcing under the Apache 2.0 license an early developer release of Credible (GitHub repository here), a native mobile wallet for W3C Verifiable Credentials and Decentralized Identifiers built on DIDKit and Flutter. We were able to package the DIDKit library written in Rust into a Flutter application running on both Android and iOS, using C bindings and Dart’s FFI capabilities. A special thank you to brickpop/flutter-ffi-rust for paving the way in the use of this technique, which inspired our own implementation’s architecture.

DIDKit’s Architecture


Who this is for

We believe that there will be no “one size fits all” credential wallet because the user experience must be appropriately tailored to each use case. Workflows involving trusted information tend to be extremely sensitive to context, with vast differences in the kinds of information used, methods of verification, and user interfaces from situation to situation. For example, the following use cases will demand completely different interfaces:

• A hospital wants to allow its qualified medical staff and pharmacists to digitally sign and verify prescriptions, respectively.
• A trading desk wants to set position limits based on a trader’s digital authorizations issued by the firm.
• A user wishes to demonstrate their history of transactions with a merchant, granting them special status and benefits at another venue through a commercial partnership.

Comparing just these three, there’s a lot of variety that no single user experience can truly satisfy. They all share a need to make important data verifiable and portable, but the difference in their user experiences shows how unlikely it is for one overarching “wallet interface” to rule them all.

Instead, we are striving to build a solid, general-purpose technological foundation suitable to be tailored for specific use cases and workflows. This is the wallet counterpart to the rich, growing toolkit supplied by DIDKit, the two pillars of a reference architecture for creating trusted interactions at scale.

Credible’s core technologies: VCs and DIDs

Verifiable Credentials (VCs) are a cryptographically-signed, contextualized, and highly portable data format used to represent beliefs about reality, such as someone’s educational certifications, employment history, and licenses. They can also be used to represent beliefs about non-person entities (NPEs) such as code repositories, IoT devices, and data sets. Credible brings the ability to receive, safely store, and present VCs from your smartphone, making them truly yours.

Decentralized Identifiers (DIDs) are a kind of URI that can be universally resolvable across the Internet, public blockchains, and other data sources. They are used in conjunction with a DID method to establish public key-based control authority of a “DID controller” and enable service discovery of “service endpoints.” People may manage several DIDs each to keep different aspects of their life separate and private. DIDs may also be managed by non-person entities, such as software agents, industrial equipment, and computer servers. Credible currently supports the use of a single DID from your smartphone, with multi-DID support on the roadmap.

VCs and DIDs may be used together in powerful patterns, built from the fundamental issuance, storage, verification, and revocation workflow. DIDs can be used to cryptographically sign VCs, and because most DIDs can be resolved to their public keys, anyone can then verify the produced VC as authentic and tamper-proof, thereby increasing their trust of the information inside. Credible currently supports the use of DIDs to request, store, verify, and present VCs. Future versions of Credible will also allow for issuance and revocation.

Credible ships out of the box with support for the Tezos DID method, which we have jointly authored with the Tezos ecosystem. The development of this DID method, as described in our DIDKit release post, will be able to take full advantage of future protocol upgrades on the Tezos public blockchain to provide best in class decentralized identity-specific support across the whole stack, down to the VM bytecode. We especially look forward to the use of shielded transactions to increase privacy while retaining the availability of DID resolution, and the use of tickets to allow for sophisticated on-chain permissioning while exercising DID document control.

Currently, the only server interactions that Credible will make are through URLs scanned from QR codes. The user must explicitly consent to dereferencing the URL before any connection is made.

What’s currently supported

Credible:

• Connections over QR codes containing URLs

Verifiable Credentials

• Fetch and Send CredentialOffers over HTTPS
• Receive Verifiable Credentials over HTTPS
• Store Verifiable Credentials
• View Verifiable Credential details
• Send Verifiable Presentations with embedded Verifiable Credentials over HTTPS

User Profile

• Basic information
• Placeholders for notices, privacy policies, etc.

Via DIDKit:

Verifiable Credentials

• Issue Verifiable Presentations containing Verifiable Credentials
• Verify Verifiable Credentials

DID-based signing

• Key generation (via ring)
• Key export (via BIP39)
• did-tezos (1 layer of resolution for tz1)

DID method resolution

• did-tezos (1 layer of resolution for tz1)
• did-key

What’s next for Credible

Committed

• Cross-platform continuous integration testing for Android and iOS
• TestFlight and Play Store Beta releases
• Full did-tezos implementation (3 layers of resolution and smart contract-based management across tz1/tz2/tz3)
• did-web support on Credible (we must either package openssl to Flutter or allow did-web to use ring via rustls)
• Support for multiple DID management

Research

• Flutter Web through DIDKit in WASM
• Flutter Desktop with a desktop-native user interface
• DIDComm or TorGap support to allow two Credible instances to broker and maintain secure communications based on DIDs
• OpenID Connect SIOP v2 support to serve as the IdP in a SIOP interaction
• Support for Presentation Exchange
• Support for Universal Wallet Interop Spec

We hope you enjoy this release, and if you would like an invitation to the iOS TestFlight or Android Play Beta store builds of Credible, please enter your email to receive an invite soon.

If you would like to discuss how to deploy Credible for a specific use case, please take 30 seconds to leave us a message, and we will respond within 24 hours.

Follow us on Twitter

Follow us on LinkedIn

Wednesday, 10. February 2021

Evernym

SSI Roundup: February 2021


In February's SSI Roundup, we’re sharing updates on several COVID credential initiatives and the launch of a new cross-sector collaboration, Good Health Pass.

The post SSI Roundup: February 2021 appeared first on Evernym.


Ocean Protocol

OceanDAO — Round Two Grant Results

Highlights from OceanDAO’s Second Funding Round

Introduction

Hello, Ocean community!

OceanDAO Round Two officially concluded on February 5, 2021.

For those unfamiliar with OceanDAO, you can learn more about Ocean’s monthly community curated funding process and get involved here!

Thank you to all of the participants, voters & proposers.

Proposal submissions for OceanDAO Round Three are officially open.

Highlights

OceanDAO Rounds are monthly, with project proposals due on the first of every month.

In Round Two, the nine highest-voted grant proposals from the snapshot ballot were each selected to receive 10,000 OCEAN to foster positive value creation for the overall OCEAN ecosystem.

The Ocean ecosystem becomes self-sustainable as the Web3 data economy builders leverage Ocean Protocol to create products, services, and resources that the community finds valuable.

Funding Category Types:

• Build / improve applications or integrations to Ocean

• Outreach / community (grants don’t need to be technical in nature)

• Unleash data

• Build / improve core Ocean software

• Improvements to OceanDAO

Proposal Vote Results:

• 16 proposals submitted

• 81 Voters

• 1,195,649.10 $OCEAN voted

Voter Turnout:

• Voting opened on Feb 1st at 23:59 GMT

• Voting closed on Feb 4th at 23:59 GMT

Recipients

Congratulations to the top 9 vote recipients. These projects will be receiving an OceanDAO grant in the form of $OCEAN tokens.

Research the Data Economy

This community project actively reaches out to leading German think tanks and research institutions that are active in the following research areas: economics, data economy, open science, open data, research data management, web science, science 2.0, AI, and machine learning.

Videowiki — A collaborative content creation platform

VideoWiki is a platform for publishing video content that can be automatically generated using AI from text-based content. They aim to be a fully decentralized and censorship-resistant video publishing platform. VideoWiki is an open, collaborative content editing platform that enables rapid creation, modification, protection, and monetization of immersive content.

Maintain Ocean Protocol core component documentation

As the Ocean core components that power the Ocean economy are continually evolving, this project proposes to maintain their documentation for three months.

Documentation is a crucial part of any cutting-edge technology like Ocean Protocol: it is the entry point for all developers willing to contribute to the new technology. Ill-maintained docs cause more frequent developer churn, while well-maintained, up-to-date documentation saves developers’ time and brings more devs and dApps to Ocean Protocol.

The Data Whale alga. Development Sprint: Phase 2

In Phase 2 of the alga. development, we are looking to integrate the following functions: mobile data-token swaps and buys, mobile staking, and premium access for users who have previously consumed our datasets, providing them with a fully customizable analysis tool highlighting important information about the data economy.

Ocean Academy: Project Oyster

Ocean Academy is a comprehensive project encompassing all competencies relevant to the success of Ocean Protocol. The Academy is different from tutorials and publications, which are specific to short-term needs or development milestones. It is different from reference documentation, which is more complete, often developer-focused, and not delivered in a learning-focused fashion.

Liquidity Pool Handling 2.0

Dissolve a liquidity pool and send fair shares back to its participants, by assessing the tokens/liquidity pool shares each participant owns, splitting up the OCEAN in the pool accordingly, and sending it back to them. This will cost the dissolver ETH and possibly some of their own tokens.

Rebalance a liquidity pool by minting new tokens and sending them to the liquidity pool as well as the token holders. This will also cost the invoker ETH for the transactions. But this could then be used by, e.g., the Swash pool or old pools to fix their token:OCEAN ratio.

German AI/ML goes Ocean

Together we aim to kick-start the German Ocean-based data economy by introducing Ocean to the leading German AI/ML companies and the German AI Association.

Data Brokers and Evotegra will introduce Ocean and the Ocean Market to the German AI Association, representing 250 leading German AI/ML companies. Our goal is to give Ocean visibility to the industry, onboard new publishers, and potentially launch a German Ocean Market for AI/ML publishers and data service consumers.

RugPullIndex.com

Rugpullindex.com attempts to rank data sets by measuring their markets’ performance. We crawl all of Ocean Protocol’s markets daily and rank them by their liquidity and shareholder equality (Gini coefficient).

Governor DAO — Data Aggregation

Governor DAO comes as the community-run, grassroots reboot following the CBDAO exit scam in October of 2020. The project is founded upon the same vision that many CBDAO supporters shared: to create a comprehensive ecosystem to ease the development or transition of a project to a working DAO framework.

Future Rounds

OceanDAO proposal submission deadlines are the first of every month. Round Three information:

• Project proposal deadline: March 1, 2021, at 23:59 GMT
• Voting wallet balance snapshot: March 1, 2021, at 23:59 GMT
• Voting closes: March 4, 2021, at 23:59 GMT
• Funding: funds will be disbursed within 24 hours after voting ends

The goal is to grow the DAO each round. We encourage the $OCEAN ecosystem to apply or reapply AND vote!

See the OceanDAO wiki for up-to-date round information and any links you will need to get involved as a project team member or a community curator.

Onward. Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO — Round Two Grant Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


BLOK

BLOK joins Good Health Pass Collaborative


As the first covid-related health pass certified by ID2020, we’re proud to offer our support to this initiative to create an interoperable blueprint for #identitysolutions and restore our economies and lives.

Starting today, BLOK BioScience – as part of the BLOK Solutions family – will work towards building widely accepted, secure, privacy-first standards for health passes, in the good company of a growing number of leading global organisations.

Of the initiative, our CTO Areiel Wolanow says:

Several companies – including our own – have already brought health passport solutions to market, and many more are in the process of doing so. At the same time, most of the world’s countries are trying to figure out what they need from a health passport, or if they want one at all. With both requirements and solutions in such a state of flux – and likely to remain so – the key to having solutions that remain viable and actually help manage this pandemic is for those solutions to be interoperable. Most of the health passport solutions currently on the market claim to be interoperable — as long as you accept their own solution as the base framework for any such collaboration.

 

What we really like about ID2020’s Good Health Pass is that it enables true interoperability, setting a basis for collaboration that is truly independent of any one solution while enforcing preservation of the values that led us to build a solution in the first place. We are very happy to endorse Good Health Pass, and commit its adoption to our product roadmap.

You can read the original release on the launch from the GHPC Medium channel below.

Original Good Health Pass Collaborative release

Today, ID2020 announced the launch of the Good Health Pass Collaborative along with more than 25 leading individual companies and organizations in the technology, health, and travel sectors — including the Airports Council International (ACI), Commons Project Foundation, COVID-19 Credentials Initiative, Evernym, Hyperledger, IBM, International Chamber of Commerce (ICC), Linux Foundation Public Health, Lumedic, Mastercard, Trust Over IP Foundation, and others.

The Good Health Pass Collaborative is an open, inclusive, cross-sector initiative to create a blueprint for interoperable digital health pass systems that will help restore global travel and restart the global economy.

The COVID-19 pandemic has impacted every segment of the global economy, but none as profoundly as travel and tourism. Last year, airlines lost an estimated $118.5 billion USD with related impacts across the economy in excess of $2 trillion USD.

In conjunction with the announcement, the Collaborative also released its first white paper, entitled Good Health Pass: A Safe Path to Global Reopening.

Collaboration Among a New Ecosystem of Players

“There’s one thing the world agrees on — we need to address the health concerns today to support a return to normalcy,” said Ajay Bhalla, President of Cyber & Intelligence at Mastercard. “Delivering a global, interoperable health pass system can only happen if we come together in a way that meets the needs of everyone involved. This Collaborative will be critical in helping to define how we connect the pieces that will bring travel back safely, spark job creation and jumpstart the world’s economic engine.”

Various efforts are currently underway to develop digital health credential systems — both vaccination and test certificates — for international travel. Yet, despite this race to market, it is unlikely that a single solution will be implemented universally — or even across the entire travel industry. Thus, it is critical that solutions are designed from the outset to be interoperable — both with one another and across institutional and geographic borders.

The Good Health Pass Collaborative is not intended to supplant existing efforts but rather to help weave them together, fill gaps where they may exist, and facilitate collaboration among a new ecosystem of stakeholders, many of whom have never worked together before.

“Fragmentation is a risk we simply cannot ignore,” said ID2020 Executive Director Dakota Gruener. “To be valuable to users, credentials need to be accepted at check-in, upon arrival by border control agencies, and more. We can get there — even with multiple systems — as long as solutions adhere to open standards and participate in a common governance framework. But without these, fragmentation is inevitable, and travelers — and the economy — will continue to suffer needlessly as a result.”

Global Travel & Digital Health Credentials

COVID-19 test results are already required for entry at some airports and at international borders. But existing paper-based certificates are easy to lose, unnecessarily expose sensitive personal information, and are prone to fraud and counterfeiting.

By contrast, digital health credentials can be printed (e.g., as a QR code) or stored on an individual’s mobile phone. They enhance user privacy and “bind” an individual’s identity to their test result or vaccination certificate, thus enabling real-time, fraud-resistant digital verification.

“Our health data consists of the most sensitive personal information, deserving of the strongest privacy,” said Dr. Ann Cavoukian, Executive Director of the Global Privacy & Security By Design Centre. “Release of our health data must be under our personal control. The Good Health Pass does just that: With Privacy by Design embedded throughout, you control the release of your digital health data, and to whom; all de-identified and decentralized. Privacy and functionality: Win/Win!”

The World Health Organization recently convened the Smart Vaccination Certificate Consortium to establish standards for vaccination certificates, but no analogous effort currently exists for test certificates. Given that it is expected to take years for vaccines to be universally available globally, widespread testing will remain an essential public health tool — and one that must continue alongside vaccination to ensure a safe and equitable return to public life.

The Good Health Pass Collaborative has defined four primary requirements that digital health credential systems for international travel must satisfy:

Cross-border: Solutions must work at airports, airlines, ports-of-call, and borders worldwide and comply with international and local regulations.

Cross-industry: Solutions will require the collaboration of the travel, health, government, and technology sectors.

Secure & privacy-protecting: Solutions must comply with all relevant security, privacy, and data protection regulations and must be able to bind the presenter of the credential to the credential itself at the required level of assurance.

Frictionless: Solutions must seamlessly integrate into testing and travel processes, thus enhancing and streamlining the experience for individuals and airlines alike. Solutions must not add new material costs for travelers. Optimally, validation processes will be contactless to maintain or enhance hygiene.

The Collaborative welcomes the participation of policymakers and representatives of government agencies; companies in the health, technology, and travel sectors; and civil society organizations who share a commitment to safely restoring international travel and economic activity while simultaneously ensuring that equity, privacy, and other civil liberties are protected.

If you are interested in learning more, please visit the Good Health Pass website at goodhealthpass.org.

Endorsing Organizations

Affinidi, Airport Council International (ACI), Airside, analizA, AOKpass, Bindle Systems, BLOK Solutions, CLEAR, The Commons Project Foundation, Covid Credential Initiative (CCI), Daon, Evernym, Global Privacy & Security by Design Centre, Grameen Foundation, Hyperledger, IBM, idRamp, International Chamber of Commerce (ICC), iProov, Linux Foundation Public Health, Lumedic, Mastercard, MIT SafePaths, National Aviation Services (NAS), Panta, PathCheck Foundation, Prescryptive Health, SITA, STChealth, Trust Over IP Foundation, ZAKA

The post BLOK joins Good Health Pass Collaborative appeared first on BLOK BioScience.


KuppingerCole

Mar 24, 2021: Identity Is Not an End in Itself – Securing Business-Critical Applications with Identity Security

The Covid pandemic has driven digitalization efforts in the DACH region, but has also exposed their limits. Employees’ digital identities are increasingly the focus of interest – for security officers and hackers alike. Identity Security enables compliance across platforms and applications, and helps keep complexity and costs under control.

Mar 18, 2021: Maturing a Zero-Trust Strategy for the Extended Enterprise

In the digital era, a 20th century perimeter-based approach to security is no longer appropriate or effective in securing the modern extended enterprise. Instead, a more flexible, identity-based approach is required that can be implemented at every layer of IT, from devices and networks to applications and data.

Provenance

5 key themes at Edie’s Sustainability Leaders Forum 2021


This year is especially crucial to our national sustainability efforts, with big decisions to be made in the wake of Covid-19 and COP26 approaching in November. So it was particularly important for business, government, investors and NGOs to come together at this year’s Edie Sustainability Leaders Forum – to share insight and help drive the collective fight for a sustainable future. Below are the key themes that I saw emerge over the last three days.

1. Sustainability is not a bolt-on

Time and again this week, experts denounced the approach of treating sustainability strategy as separate to core company strategy. James Bidwell (Re_Set, Springwise) reminded us on Wednesday afternoon of that time when e-commerce was in a separate bucket. His message was plain:

“Sustainability is the new digital, and businesses that embed sustainability and diversity in their business models are the ones that will win in the future.”

Dan Botterill (Founder & CEO, Rio AI) expressed similar sentiment at a panel on the future of reporting: “sustainability reporting isn’t going to be called that – it’s just going to be called reporting.”

The same idea returned on Thursday morning, when Edie’s Luke Nicholls asked whether the ultimate goal of sustainability professionals is to make themselves redundant. In response, Chris Cook (ICI Pension Fund) encouraged sustainability professionals to ask themselves, “how can this business continue to do the right thing when I’m no longer at the table?”

2. Shoppers want to know about the impact of products, not brands

As citizens, we want to be the hero of our own sustainability story. This week provided a reminder that brands shouldn’t put themselves at the centre of their sustainability narrative. In a fascinating CSR campaigns workshop on Wednesday, Futerra’s Solitaire Townsend and Hannah Phang explained how this is especially the case with Gen Z, who are more interested in the product in their hand than the company that made it. 

Given that 90% of a product’s emissions can come from the supply chain, it’s unsurprising that company-level impact is a secondary priority to shoppers. The consensus was that you need to be talking about sustainability at product-level, which is exactly what we help brands with here at Provenance. 

Brands should also decentre themselves from sustainability communications by talking about customers, employees and supply chain actors. In time, people will associate your brand with any progress made. This is also an approach which minimises the risk of greenwashing.

3. Business must tell the simple truth on sustainability

This week’s speakers reminded brand leaders of the importance of moving away from green ‘spin’ towards warts-and-all honesty – which is exactly what we’re helping brands do here at Provenance.

In Wednesday’s panel on sustainability reporting, Dan Neale (World Benchmarking Alliance) spoke of how “the traditional world of CSR comms is driven by messaging, rather than what we need to see, which is boring, chunky data – it’s disclosures and reporting, it’s not ‘reports’.” As he put it, “the gloss is dross. What we need is ease of access.”

Image: Ecologi’s Crista Buznea called for simplicity around carbon targets.

In a workshop on effective CSR campaigns, there was also a fascinating discussion about how ‘born again’ corporate brands are adopting the honest, human tone of voice normally associated with ‘born good’ challenger brands. These legacy brands are increasingly honest and vulnerable, and are becoming advocates for change.

I also heard a number of calls for simplicity when it comes to sustainability communications. On Wednesday, Rebecca Burgess, Chief Executive of City to Sea, argued convincingly that people are already overwhelmed; she encouraged change campaigners to “take the hype away” and ask “what’s the action that’s needed?”. Ecologi’s Crista Buznea similarly called for simpler messaging around environmental goals: “We really need to debunk this whole net zero terminology and the complexity around it – the focus needs to be on action and making it very clear and accessible to all.”

4. Togetherness is key

‘Togetherness’ emerged as another big theme at this week’s conference. Thursday morning’s plenary session provided a reminder of the importance of intersectionality in the very white ‘world’ of sustainability, and challenged us to bring new voices to the fore.

‘Togetherness’ was also discussed in terms of collaboration between brands and their supply chain actors. “We’re all each other’s keeper,” said Richard Griffiths MP, as he argued that better supplier collaboration can help us bring change faster. His thoughts were echoed by Tor Burrows (Grosvenor); she advised brands to both “choose suppliers that share your ethos/commitments,” or else to “show suppliers that there is customer demand for sustainable products.”

5. We have reason for both optimism and urgency

In the opening keynote, Tom Rivett-Carnac (Global Optimism) set out the case for us to adopt a “stubborn optimism” in the face of the reality that is coming at us. He explained that this wasn’t blind hope – it was about “accepting that genuine failure is possible” and “acknowledging that real success is also possible”. This call to arms saw Solitaire Townsend take on a new role – tired of laying out the problems to skeptics, she is now embracing optimism as ‘Chief Solutionist’.

Jonathon Porritt (Forum for the Future) acknowledged some progress in the form of the government’s ban on the sale of petrol and diesel cars by 2030. But his keynote on Thursday focused more on the need for urgency. He described today’s model of capitalism as “essentially suicidal”, reminding us that every year, $4-6 trillion of taxpayer money goes towards activities that are undermining or destroying life on earth.

By his calculations, the lag effects on scientific, governmental, corporate and consumer progress mean we’re at least 10 years off the pace. He criticised the focus on 2050 – “a wonderfully convenient date” – and called on brands to adopt an emergency response. “Whatever you’ve settled for as your net zero target, bring it forward. And make it happen.” 

It was a sobering conclusion, but incredibly motivating. One thing is clear to me: this must be a year of solutions. At Provenance, we want to meet this need for urgency by helping brands navigate their impact and communicate progress honestly.

 

Interested in how Provenance can help you communicate your brand’s impact in 2021? Get in touch.

The post 5 key themes at Edie’s Sustainability Leaders Forum 2021 appeared first on Provenance News.


Ontology

Ontology Weekly Report (February 1st- 8th, 2021)


Highlights

Ontology has had a great first week of February: we launched multiple campaigns involving NFTs, inviting friends to send and receive red packets through ONTO, which made for a fun celebration of Chinese New Year and will help continue growing ONTO’s ecosystem. Ontology continues to make strides in the decentralized digital identity space; this week the company attended Polkadot AMAs, published bylines, and released a new branding vision.

Latest Developments

Development Progress

We completed 30% of the Ontology EVM-integrated design, which, once complete, will make Ontology fully compatible with the Ethereum smart contract ecosystem.

Product Development

Released ONTO v3.7.2. This version added support for Kusama, support for adding custom assets, an “Invite Friends” feature, and a “Guides” column in the News section, in addition to an optimized “Discover” section and Npay features.

dApps

• 112 dApps launched in total on MainNet
• 6,397,738 dApp transactions completed in total on MainNet
• 42,849 dApp-related transactions in the past week
• $108m total value locked on Wing
• $9m total value locked on Unifi

Community Growth

This week we onboarded 533 new members across Ontology’s global communities.

Global Events

• Li Jun, Founder of Ontology, was invited to speak at “Web 3.0 TALKS” on the topic of decentralized identity. The discussion touched upon how Ontology is developing the building blocks for a better web.
• This week Li Jun published a byline in Global Banking and Finance titled “Why Privacy Should Be ESG Investors’ Next Big Priority”.
• ONTO supported Binance Blockchain Week’s online broadcast. Users were able to scan the QR code on their screens for a chance to win one of a limited run of 2,000 DEGO NFTs.
• ONTO launched its “Invite Friends” campaign. From February 4th to 7th, users could invite new users to register on ONTO; participants who sent and accepted invites earned a bonus reward.
• Ontology uploaded an original video called “Decoded Decentralized Identity” to share our opinions about data and identity.
• Ontology shared our new branding vision — “A blockchain for self-sovereign ID and Data”.

Industry News

Why Privacy Should Be ESG Investors’ Next Big Priority

By Li Jun, Founder of Ontology

Investors are increasingly integrating ESG (Environmental, Social, and Governance) factors into investment decisions in the hope that sustainable businesses offer less risk and long-term return on investment. Against this backdrop, data privacy and data management remains one of the most pressing issues facing organizations today.

Ontology focuses on the development of public blockchain technology, while building a decentralized digital identity protocol (ONT ID) and decentralized data exchange protocol (DDXF) on top, using the blockchain’s distributed ledger to help users properly manage these identities. On the Ontology platform, users have a decentralized way to manage their data and apply different data selectively to different scenarios. You can choose who to share your data with and set up separate data profiles for different people and organizations. No centralized organization can interfere with your right to own your data.

A Big Year for Decentralized Identity and Infrastructure

The COVID-19 crisis has increased consumer demand for identity solutions that don’t compromise individual privacy and freedoms. What does it mean to anchor on digital identity as a reaction to a very specific set of circumstances (namely an unprecedented global pandemic)? Digital identity systems are relevant for everything from financial services to re-skilling workforces, which each have distinct requirements.

We are gearing up for another exciting year: decentralized identity is going to be a major talking point, hopefully leading to more usage and development in this sector.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (February 1st- 8th, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Revoke

Award-Winning Technology

Revoke won the Digital Jersey Security / Privacy award of the year!

The post Award-Winning Technology appeared first on Revoke.


Cozy Cloud

Tech - How we improved our apps performances through indexes

Overview of the most common mistakes we saw at Cozy Cloud about data indexing and querying. The concepts we discuss here are rather generic and can be applied to many database engines, both SQL and No-SQL.
Introduction

Cozy is a versatile open-source personal cloud used to gather your personal data in a safe place. You can develop any client-side application to manipulate existing data types, called doctypes, or simply create your own.
As a result, many types of data can be stored in a single database instance. It is then crucial to be able to efficiently query data so the user can smoothly retrieve it.

Over the years, we witnessed many different use-cases of data manipulation at Cozy Cloud and gained some experience about scalability issues and common traps developers might fall in when dealing with databases.

This article aims to give a short overview of the most common mistakes we saw (and - let’s admit it - sometimes made!) about data indexing and querying. The concepts we discuss here are rather generic and can be applied to many database engines, both SQL and No-SQL. In the following, our examples primarily assume a CouchDB database, as this is the one used in Cozy, but we provide pointers to the equivalent concepts for PostgreSQL and MongoDB, two of the best-known database engines in SQL and document-oriented No-SQL, respectively.

Our examples use cozy-client, a JavaScript library that provides an API simplifying interaction with the Cozy backend for authentication, realtime, offline support, etc. Most importantly, it provides interfaces to query the database and deal with indexes.

Indexes: the what and the why

Index: why?

Indexing data is useful as soon as your database grows and data cannot be retrieved in a reasonable time through full-scan queries, i.e. by scanning the whole database.

Most of the time, you don’t want to query all your data at once: you use filters to restrict the selected data. This is typically done through a where clause in SQL, a find method in MongoDB or a selector in CouchDB.
You might also want to retrieve data in a particular order, by sorting it, or selecting only certain fields to be returned.

While it is possible to do such tasks on the client side, they are generally delegated to the database server to ensure scalability. After all, this is its job!
However, there is no magic here: you cannot expect a database to automatically deal with large amounts of data in a way perfectly suited to your needs.
This is where indexes come in: they tell the database how to organize data so that your queries can run efficiently, even when the database contains millions of elements.

ℹ️ Indexes are obviously not the only thing to keep in mind when dealing with databases, but we focus exclusively on them in this article, for the sake of conciseness.

Index: what?

An index is a way to organize documents in order to retrieve them faster.
To give a simple, non-database comparison: a recipe book might have an index listing all the recipes by name with their page number. If readers are looking for a particular recipe, this index saves them from scanning the full book to find it.

The principle is the same for database indexes: we want to avoid scanning the whole book (which can be a table, a collection, etc.) each time we are looking for specific data. Even though computers are way faster than humans at scanning data, this task can be very time-consuming when dealing with thousands or millions of documents.

In more formal terms, the time complexity of a query run on a database with n documents will be O(n) without an index, as it requires scanning all documents sequentially.
There are many types of indexes in the literature. One of the most common index structures is the B-Tree: it is the default in PostgreSQL as well as in MongoDB. This structure performs in O(log n) on average, which guarantees very fast access times even for very large databases.

From a practical point of view, B-trees, therefore, guarantee an access time of less than 10 ms even for extremely large datasets.
—Dr. Rudolf Bayer, inventor of the B-tree

In CouchDB, the indexes are actually B+ Trees, a structure derived from the B-Tree.
The indexed data are document fields; once indexed, they are called keys and are organized as a tree, with the documents stored in the leaf nodes. Finding a particular document thus means browsing the tree from the root to a leaf.
The keys are organized by value: for a node with two keys k1 and k2 and three children, the leftmost child node must have all its keys less than k1, the rightmost child’s keys greater than k2, and the middle child’s keys between k1 and k2.

In the figure below, borrowed from the CouchDB guide, each node has three keys, but in the CouchDB implementation a B+ Tree node can actually store more than 1600 keys. With such a fan-out, a tree only three levels deep can already index over four billion documents (1600³ ≈ 4.1 billion).

This design is very efficient for range queries: in the example above, a query looking for documents having an indexed field between 1 and 3 will browse the tree to find the first document position (with a key of 1) and then get the next two contiguous documents.

ℹ️ In the following, we say “B-Tree” to refer indifferently to B-Trees and B+ Trees: the main concepts are the same in both cases and their differences are not relevant to this discussion.

Traps with indexes

Indexing data is not as easy as it might look. There is no single way to index: it depends on your data model and the queries your application needs to run. But, as you will see in this article, keeping in mind how indexes (here, the B-Tree) work in the background helps a lot!

ℹ️ Each database engine has its own way to index data, with its own specificities. Therefore, we do not focus on the indexing itself, but rather on the common traps you might encounter when designing indexes for an application, which apply to most databases, as long as a B-Tree is involved.

Pagination

When there is too much data to handle at once on the client side, it is good to implement pagination. This consists of splitting a query Q into several sub-queries Qi so the client can progressively retrieve data and update the UI accordingly.

ℹ️ From a UX perspective, there are different strategies to implement pagination. Typically, if your data is rendered as a list, you can either have a ‘Load more’ button at the bottom or automatically fetch more data when the user reaches the end of the list.

We focus here on how to split Q and the consequences for performance.

A simple way to do it is to combine limit and skip parameters in the query.

💡 Those parameters are available in CouchDB, as well as in MongoDB and PostgreSQL (in SQL, skip is called offset).

Let’s assume Q retrieves 1000 documents. You can implement pagination by running Q 10 times, setting a limit of 100 and skipping the number of already-retrieved documents each time.
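As a rough sketch with cozy-client (assuming a `client` instance and an illustrative io.cozy.todos doctype, neither of which comes from the original article), skip-based pagination could look like this:

```js
import { Q } from 'cozy-client'

// Skip-based pagination: fetch 1000 documents in pages of 100.
// Each call re-runs the query and discards the first `skip` rows,
// so the scanning cost grows with every page.
const fetchAllWithSkip = async client => {
  const docs = []
  for (let skip = 0; skip < 1000; skip += 100) {
    const resp = await client.query(
      Q('io.cozy.todos').limitBy(100).offset(skip)
    )
    docs.push(...resp.data)
  }
  return docs
}
```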

⚠️ This pagination method does not scale and can be very bad for performance! Let’s see why.

This method actually breaks the B-Tree indexing logic: there is no way to efficiently skip a fixed number of documents in such a tree, so skipping consists of running the query Q normally and then discarding results until the skip count is reached, up to the limit.
Thus, instead of benefiting from the tree traversal to efficiently start each Qi, the skip forces each query to start from the first document of the tree and perform a sequential scan from there.

The schema below represents the documents scanned for Q1, Q2 and Q3, assuming limit=2 to simplify the drawing.
Q1 needs to scan 2 docs, Q2 4 docs and Q3 6 docs.
The grey boxes are the documents scanned by each Qi and the green boxes the documents actually returned. Each Qi with i > 1 scans useless documents, and this behaviour gets worse with every new query.

Fortunately, there are simple and efficient ways to perform pagination.

In CouchDB, you can use bookmarks, which you can see as a tree pointer indicating where to start the query. A bookmark is simply a string returned by each Qi that you pass as a parameter to the next query:
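A minimal sketch, again with the illustrative io.cozy.todos doctype, and assuming cozy-client's offsetBookmark helper together with the bookmark and next fields returned by the stack:

```js
import { Q } from 'cozy-client'

// Bookmark-based pagination: each response carries a bookmark pointing
// at the position in the index where the next page should start.
const fetchAllWithBookmark = async client => {
  const docs = []
  let bookmark = null
  let hasMore = true
  while (hasMore) {
    let query = Q('io.cozy.todos').limitBy(100)
    if (bookmark) {
      query = query.offsetBookmark(bookmark)
    }
    const resp = await client.query(query)
    docs.push(...resp.data)
    bookmark = resp.bookmark
    hasMore = resp.next // false once the last page has been fetched
  }
  return docs
}
```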

By doing so, each Qi starts at the end of the previous query, avoiding unnecessary scans:

💡 This bookmark system does not exist in MongoDB or PostgreSQL. However, you can still benefit from efficient pagination by using the last document key returned by the previous query. As the tree is organized by keys, Qi will be correctly positioned in the tree, that is, at the end of the previous query. See this article for more details about a SQL approach and this article for a MongoDB approach.

Non-contiguous rows

When designing indexes, it helps to keep in mind how B-Trees are structured, notably when dealing with multi-field indexes.
Indeed, if you need to query several fields of the same index, you might perform sub-optimal queries if you are not careful.

Let’s take an example to illustrate how such a situation can occur. Q is a query for a To-do application that retrieves tasks by their creation date and category, respectively created_at and category.

To do this, you first create an index on those two fields, in this order. Then you perform a query, for instance to get all the tasks made in 2019 in the "work" category. With cozy-client, this can be expressed like this:
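The original snippet is not reproduced here, but such a query could be sketched as follows (io.cozy.todos is an illustrative doctype):

```js
import { Q } from 'cozy-client'

// Index on created_at first, then category: the range field comes
// first, which is the sub-optimal order discussed below
const tasks2019Work = Q('io.cozy.todos')
  .where({
    created_at: { $gte: '2019-01-01', $lt: '2020-01-01' },
    category: 'work'
  })
  .indexFields(['created_at', 'category'])
  .sortBy([{ created_at: 'asc' }, { category: 'asc' }])

// const resp = await client.query(tasks2019Work)
```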

Below is a sample of the scanned index data: the scan starts at 2019-01-01 and stops at 2020-01-01, as the data is sorted by its first field, created_at.
Then, for each scanned row in this interval, the entry is returned if it has a “work” category. The rows returned by the query are represented in green.

The problem here is that you might scan unnecessary rows: in the example above, you can see a “sport” task created on 2019-01-03. This is not a big deal if the "work” category represents the majority of the tasks, but it is a pity if there are only a few of them. Imagine you have 1000 tasks for the year with only one in the "work” category: you’ll have to scan all 1000 tasks to find a single element, so 99.9% of the scanning is unnecessary.

In order to make this query efficient, you need to make an index that will organize data in contiguous rows.

Let’s go back to our example, but this time with the index order changed: you first index category and then created_at.
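Sketched with the same illustrative doctype, only the field order changes:

```js
import { Q } from 'cozy-client'

// Equality field (category) first, range field (created_at) second:
// all the "work" rows are now contiguous in the index
const tasks2019WorkFast = Q('io.cozy.todos')
  .where({
    category: 'work',
    created_at: { $gte: '2019-01-01', $lt: '2020-01-01' }
  })
  .indexFields(['category', 'created_at'])
  .sortBy([{ category: 'asc' }, { created_at: 'asc' }])
```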

The index data is now organized as follows:

Now the very same query can be processed efficiently: the "work” category is first evaluated in the B+ Tree, which returns all the relevant rows, sorted by created_at. The range is then easily found, since all the rows are contiguous, and only the returned rows are scanned.

💡 When creating an index, think carefully about how your data will be organized and which queries you will need to perform. Performance can vary dramatically depending on your design, by several orders of magnitude.

💡 When dealing with a query combining an equality and a range, you should index the equality field first. If you are interested in this topic, you can learn more here for a SQL approach.

However, it is not always possible to keep this design: you might need a more advanced concept called a partial index.

Partial index

In the previous section, we saw that a query should run on contiguous index rows to maximize performance. However, this is not always possible when dealing with more complex queries.

Here we introduce partial indexes. They are used to express predicates that are evaluated at indexing time: data is then indexed only if it matches the predicates.

This can actually be useful in several situations:

• Queries without contiguous rows
• Queries with existence predicates
• Queries with constant values

Let’s dig into these cases and see how partial indexes can be useful!

Queries without contiguous rows

Let us consider a query looking for tasks created after 2019 that are NOT in the “work” category:
Let us consider a query looking for tasks created after 2019, that are NOT in the “work” category:

⚠️ The rows cannot be contiguous in the index because of the negative predicate, $ne (not-equal). The index will have to scan all the rows to find those without the “work” category and filter out the ones outside the date range.
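For concreteness, the problematic query might be sketched like this (illustrative doctype and fields, not the article's original code):

```js
import { Q } from 'cozy-client'

// Tasks created after 2019-01-01 that are NOT in the "work" category.
// The $ne predicate prevents the matching rows from being contiguous
// in a regular [category, created_at] index.
const notWorkTasks = Q('io.cozy.todos')
  .where({
    category: { $ne: 'work' },
    created_at: { $gt: '2019-01-01' }
  })
  .indexFields(['category', 'created_at'])
```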

By using a partial index with the negative predicate, this query becomes efficient, as only the tasks without a “work” category are indexed, by the created_at field.

The query is now:
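A sketch of the rewritten query, assuming cozy-client's partialIndex helper:

```js
import { Q } from 'cozy-client'

// The $ne predicate moves into the partial index: only non-"work"
// tasks get indexed, and the index itself only covers created_at
const notWorkTasksFast = Q('io.cozy.todos')
  .where({ created_at: { $gt: '2019-01-01' } })
  .partialIndex({ category: { $ne: 'work' } })
  .indexFields(['created_at'])
```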

💡 Note it is no longer necessary to index the category field: the index will therefore be smaller, more efficient, and will take less space on disk.

Queries with existence predicates

The category field might be optional in your To-do application. In this case, it might be useful to check its existence:
The category field might be optionnal in your To-do application. In this case, it might be useful to check its existence:

⚠️ This query will typically return nothing in CouchDB. This is because a document is indexed only if the indexed fields exist in the document, so a task without a set category will never be indexed.
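The naive query in question would look something like this (illustrative doctype):

```js
import { Q } from 'cozy-client'

// Find tasks that have no category yet. On a regular index this
// returns nothing: a document only enters the index if the indexed
// field exists, so uncategorized tasks are never indexed at all.
const uncategorizedTasks = Q('io.cozy.todos')
  .where({ category: { $exists: false } })
  .indexFields(['category'])
```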

A partial index solves the issue, as it simply evaluates the existence of the field at indexing time:
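Sketched with the same assumed partialIndex helper, the existence predicate moves to indexing time:

```js
import { Q } from 'cozy-client'

// Documents without a category are now indexed by created_at;
// the $exists: false predicate is evaluated when indexing.
// The { $gt: null } selector is the CouchDB idiom for "any value".
const uncategorizedTasksFast = Q('io.cozy.todos')
  .where({ created_at: { $gt: null } })
  .partialIndex({ category: { $exists: false } })
  .indexFields(['created_at'])
```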

Queries with constant values

You might need to retrieve documents based on a field value that you know will never change. For instance, you might have a status field that is set to “ARCHIVED” when the task is archived.
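A sketch, where status and the doctype are illustrative:

```js
import { Q } from 'cozy-client'

// The constant 'ARCHIVED' value lives in the partial index predicate
// rather than in the index itself, so status is never indexed and
// the index only ever contains archived tasks.
const archivedTasks = Q('io.cozy.todos')
  .where({ created_at: { $gt: null } })
  .partialIndex({ status: 'ARCHIVED' })
  .indexFields(['created_at'])
```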
To retrieve these tasks, you can directly use a partial index:

In this case, a partial index is useful to reduce the index size, as you avoid indexing the status field.

💡 Lightweight indexes take less space on disk and are more efficient both for lookups and updates, so this optimization could be worth it if you have a lot of data to handle.

💡 Partial indexes are useful to keep contiguous rows in indexes, to use existence predicates, and to decrease index size. Depending on your use cases and data volume, you could benefit a lot from partial indexes!

💡 In SQL, you can create a partial index simply by specifying a WHERE clause in the index definition. See how to do it with PostgreSQL. In MongoDB, the principle and syntax are quite similar to CouchDB.

Cozy Contacts: a real-world use-case

Theory is great, but let us illustrate what we discussed with a concrete example, and show how performance dramatically improved in our Contacts application thanks to partial indexes.

In Cozy, there are several ways to import contacts: you can import them directly from a vCard file, or use the Google konnector to automatically synchronize contacts with your Google account. And because Cozy is an open-source platform, anyone can come up with a new way to import contacts, potentially with their own contact format.
Because of this, we implemented an asynchronous service that periodically checks for newly added contacts and migrates them to the format expected by the Cozy Contacts app.

However, because this service is asynchronous, we cannot expect all contacts to have been migrated when the user opens the application. So we ended up with two queries to retrieve contacts:

• A query to retrieve migrated contacts (Q1)
• A query to retrieve non-migrated contacts (Q2)

Both these queries checked the existence of a special document field on each contact, to know whether the contact had been migrated or not. However, as previously discussed, a missing field cannot be indexed in CouchDB. Therefore, Q2 was actually performing a full index scan to evaluate the non-existence of this field. As a result, this query took a long time to complete as soon as the user had a lot of contacts.
And the worst part is that Q2 actually returns nothing most of the time, as contacts are migrated only once!

So Q2 was an excellent candidate for a partial index: we implemented Q2.1, which finds contacts with the missing field through a partial index; this way, the document is still indexed even when the field is absent. You can find the actual implementation here.
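The shape of Q2.1 can be sketched as follows; `migrated` and `displayName` are stand-ins for the app's real fields, so check the linked implementation for the actual code:

```js
import { Q } from 'cozy-client'

// Q2.1: contacts whose migration marker is missing are still indexed,
// because the $exists: false predicate is evaluated at indexing time.
// Most of the time this index is simply empty, making the query cheap.
const notMigratedContacts = Q('io.cozy.contacts')
  .where({ displayName: { $gt: null } })
  .partialIndex({ migrated: { $exists: false } })
  .indexFields(['displayName'])
```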

The diagram below shows the performance impact by comparing the execution times of the three queries in the most common scenario: Q1 returns 1000 contacts while Q2 and Q2.1 return nothing.
Q2’s complexity is O(n), with n the total number of contacts, while both Q1 and Q2.1 are O(log n). Note that Q2.1 does not even appear in the diagram because it only takes a few ms to run.

Finally, using a partial index improved overall query performance by a factor of 2 for 1K contacts, and by a factor of 8 for 10K contacts. Neat!

Conclusion

We hope this little index tour gave you some insight into performance and data indexing.
You can learn how to start developing a Cozy app here: https://docs.cozy.io/en/tutorials/app/
To learn more about how to manipulate data, see: https://docs.cozy.io/en/tutorials/data/

By the way, if you're located in France and you're interested in Cozy, we're hiring!


auth0

The CSA STAR Program: A Business Guide to Certification

Learn how to accelerate your sales cycle through the CSA STAR Program.

KuppingerCole

The Next Level of Zero Trust: Software Security and Cyber Supply Chain Risk Management


by Martin Kuppinger

The recent SolarWinds incident has shed a light on an area of cybersecurity that is not frequently in focus. Better said, it has “again shed a light”, if we remember the Heartbleed incident that happened back in 2014. Back then, my colleague Alexei Balaganski wrote in a blog post that “software developers (both commercial and OSS) […] should not rely blindly on third-party libraries, but treat them as a part of critical infrastructure”.

What we need is a defined approach to, and consistent enforcement of, what in a slightly awkward manner is called C-SCRM, or Cyber Supply Chain Risk Management. This concept includes enforcing software security from the very beginning of the development cycle. It is not only about cloud services and COTS (commercial off-the-shelf) software, but applies to any type of software that is procured or self-developed, including software in things and devices as well.

Consequently, we must extend the Zero Trust paradigm beyond networks, security systems, and identities and apply it to all types of software. Don’t trust software, period.

What does Zero Trust stand for?

Zero Trust was defined over 10 years ago as a concept focused on removing implicit trust from network architectures and enforcing strict identity verification and access controls for every user or device. “Trust but verify” might be a popular saying, but the motto of Zero Trust has always been “Don’t trust. Verify”. This is the quintessence of the Zero Trust approach, which has since evolved to other areas of security beyond the network, including identity and device security.

Zero Trust is a paradigm, not a tool. It helps in architecting cybersecurity infrastructures and in defining concepts for operations. Given that security needs to be verified and enforced in multiple places – as the concept states – it involves a range of technologies to implement such a Zero Trust paradigm in an IT infrastructure.

What happened in the SolarWinds incident?

The SolarWinds incident was an attack that became public in December 2020 but had been running for at least several months beforehand. The installers for SolarWinds’ Orion monitoring platform had been backdoored. Unfortunately, that software is (or has been) running in many organizations, including major cloud data centers. Several other vendors, such as Microsoft and FireEye, have reported that they were breached through the use of SolarWinds products.

What this attack demonstrated is that even large software vendors with a strong cybersecurity background can be compromised due to the inherent software and cybersecurity supply chain risks. Unfortunately, most organizations trust the software they procure and use instead of verifying its security. And, unfortunately, verification is not easy to do and has its limits. But it can be done, it must be done, and it results in increased security. However, some products you can purchase might go as far as to explicitly demand that customers disable their antimalware tools before installing them. This is the complete opposite of the idea of Zero Trust.

What is Cyber Supply Chain Risk Management?

Before looking at the specifics of software security (secure software, securing software), let's consider the broader theme of C-SCRM. This discipline applies broader Supply Chain Risk Management practices to software and IT services and addresses the inherent cybersecurity risks. This is not only about using SaaS services or COTS software, but also about software in connected devices and things such as sensors, machines, control devices, etc.

Supply Chain Risk Management looks both at risks to the reliability of supply chains and at risks and damages originating from suppliers, from misconduct (e.g. child labor) to quality issues.

C-SCRM tries to assess and mitigate the risks that come from IT, particularly software, specifically regarding the availability of IT and – as the name implies – security.

What does C-SCRM include?

Going into detail would be beyond the scope of this post. However, you may want to listen to this talk by my colleague Christopher Schuetze on the necessary components of an effective C-SCRM approach.

How to apply Zero Trust on Software Procurement

In essence, we can’t trust software, in whichever form we receive and use it, without validating its integrity and security first. Software security must be consistently enforced and verified, including measures such as:

- Secure design and coding practices: While concepts like “secure by design” have been discussed for many years, they are still not implemented everywhere. It is high time to start working on them. This also includes modern software testing principles.
- Software composition analysis: Any reused code, such as open-source libraries, must be constantly tracked. Where is it used? Which version is in use? Are there known vulnerabilities and patches for them? Which coding practices are in use by the suppliers of this code?
- Static and dynamic code analysis: There is a range of tools for static and dynamic code security that can be used to identify security issues in code. These practices should be extended to areas like API design as well.
- Organizational controls: All of this must become part of the ISMS (Information Security Management System), with defined and enforced controls.
- Security analytics: Last but not least, security analytics can help to identify anomalies in software. While operational analytics has been a part of DevOps for years, it should be expanded to cover security telemetry and forensics as well – such integrated platforms are already available.

A good summary of this is provided by my colleagues Matthias Reinwarth and Alexei Balaganski in this episode of the KuppingerCole Podcast.

How to check for Software Security

To quote Alexei Balaganski again, from what he wrote back in 2014 as a recommendation following the Heartbleed incident:

“One of those approaches, unfortunately neglected by many software vendors, is hardening applications using static code analysis. Surely, antimalware tools, firewalls and other network security tools are an important part of any security strategy, but one has to understand that all of them are inherently reactive. The only truly proactive approach to application security is making applications themselves more reliable.

Perhaps unsurprisingly, code analysis tools […] don’t get much hype in media, but there have been amazing advances in this area since the years when I’ve personally been involved in large software development projects.”

There are a lot of solutions available to improve code security and mitigate risks in the software supply chain. It is about using them the right way, integrated into a holistic C-SCRM approach. You’ll find further information in this episode of the KuppingerCole Podcast on post-SolarWinds software security strategies.

The extended Zero Trust approach: Don’t blindly trust software

Zero Trust must be extended to cover software security, for software in any form (embedded, COTS, as-a-service) and regardless of whether it’s home-grown or externally procured. “Don’t trust. Verify” – this is as essential for software as it is for identities, devices, or networks.


Okta

Tutorial: Chef and Account Automation with Okta

Tip: This tutorial is part of our series on integrating Okta with popular infrastructure-as-code solutions. If you’re not into Chef, check out our Ansible, Puppet, and Terraform tutorials.

Chef cookbooks are a great way to manage infrastructure at scale. However, like other configuration management tools, Chef works best when cookbooks don’t change often. This is easy to accomplish in typical server setup and configuration tasks like installing Nginx and tweaking its conf file. However, it can get tricky when managing server accounts and credentials, as people join or leave your ops team and as you need to rotate server keys.

To tackle identity in servers with automation, Ops teams usually take one of two approaches: push SSH keys for every user, or integrate servers with LDAP or AD.

Pushing SSH Keys for every admin on every server has significant security implications and requires a lot of effort when an admin leaves the company or when keys need to be rotated. And that can go bad as soon as someone accidentally shares a secret key.

Integrating a large server infrastructure with LDAP or AD – usually done using PAM modules – requires a lot of work. LDAP and AD are monolithic – built to run on the intranet – and not a good fit when you have servers in multiple places, requiring site-to-site VPNs, among other hoops. Do you want to run an HA middleware service blocking your ability to scale in the cloud? This is a pain point no Ops Engineer wants.

However, there’s a third approach: integrate Cloud SSO to secure servers. In this tutorial, we will do that by seamlessly injecting Okta into your Chef Infrastructure as Code to effectively Shift Identity Left:

Note: To follow this tutorial, you need to have an Advanced Server Access (ASA) team provisioned from your Okta Org. If you don’t have an existing ASA team, you can sign up for free here, which requires an Okta Administrator to configure.

Create a project and get an enrollment token in ASA

In Okta ASA, projects work as a collection of servers that share the same access and authorization controls. In a project, you define which users and groups from Okta can access your servers, when they can do so, and what they are allowed to do on the server (e.g. run only certain commands). Any changes in your project (users, group assignments, authorization) are periodically updated on your servers (providing idempotency for identity and access management).

Servers enroll in your project to apply the same security configuration using an ASA agent with a project enrollment token. The ASA agent periodically checks for updates in Okta to update the server configuration.

To get your servers running with Okta, let’s create a project and get an enrollment token:

1. Access Okta ASA as Administrator.
2. Click Projects and then select or create a new project.
3. Click Enrollment > Create Enrollment Token.
4. Enter a name (e.g. chef-token) and click Submit.
5. Copy the enrollment token:

Download and configure cookbook

In your terminal, navigate to your chef home and clone our sample cookbook:

cd $CHEF_REPO/cookbooks
git clone https://github.com/okta-server-asa/asa-chef-example.git
cd asa-chef-example

Optionally, review the cookbook contents:

- recipes/install-asa.rb is the default recipe of the cookbook. It:
  - installs the ASA server agent using the appropriate package for your distro (node['platform_family'])
  - defines the name of your server in ASA (canonical name)
  - enrolls your server into the ASA project using the enrollment token you got in the previous section
  - starts the ASA server agent
- recipes/uninstall-asa.rb uninstalls the ASA server agent from your machine.
- kitchen.yml provides specs for testing this sample cookbook in Chef’s Kitchen (with VirtualBox and Vagrant).
- test/integration/default/default_test.rb provides specs for validating the sample cookbook.
- attributes/default.rb stores the enrollment token for registration in ASA.
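To give a feel for the shape of that default recipe, here is a minimal, hypothetical Chef sketch of the enrollment flow (the package name and token path follow Okta ASA's public docs, and the sftd service name matches the kitchen verify output later in this tutorial, but defer to the actual cookbook for the real per-distro logic):

# Illustrative only – the real recipes/install-asa.rb handles
# per-distro repositories and packages via node['platform_family'].

directory '/var/lib/sftd' do
  recursive true
end

# The agent enrolls on first start using this token,
# which comes from attributes/default.rb.
file '/var/lib/sftd/enrollment.token' do
  content node['asa_enrollment_token']
  mode '0600'
end

# Install and start the ASA server agent.
package 'scaleft-server-tools'

service 'sftd' do
  action [:enable, :start]
end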

Edit the attributes/default.rb file, replacing ENROLLMENT_TOKEN with the token from ASA:

default['asa_enrollment_token'] = 'ENROLLMENT_TOKEN'

To confirm your configuration has no typos, enter cookstyle attributes/default.rb:

# cookstyle attributes/default.rb
Inspecting 1 file
1 file inspected, no offenses detected

Test cookbook

Now that we have the cookbook set, let’s see it in action. To do this, I’ll use Chef Test Kitchen as a harness environment, and apply the cookbook to multiple Virtual Machines.

Tip: The sample cookbook ships with a configuration file (kitchen.yml) that sets up our test environment automatically, using VirtualBox VMs managed with Vagrant.

Launch test kitchen

In your environment with Chef Workstation, install VirtualBox and HashiCorp Vagrant. (If you have a Mac with Homebrew, you can run these commands):

brew install virtualbox --cask
brew install vagrant --cask

Enter VBoxManage list bridgedifs | grep Name to get the name of the network interface on your computer that the test VMs will use (ideally, a network interface with internet access). In my case, it's en0: Wi-Fi (AirPort):

# VBoxManage list bridgedifs | grep Name
Name: en0: Wi-Fi (AirPort)
Name: bridge0
VBoxNetworkName: HostInterfaceNetworking-bridge0
Name: llw0
VBoxNetworkName: HostInterfaceNetworking-llw0
Name: en12: USB 10/100/1000 LAN
VBoxNetworkName: HostInterfaceNetworking-en12

Edit and save the kitchen.yml file:

- In lines 25-29, define the list of servers/VMs you want to test. You can remove a specific server or distro by adding a comment (# + space). Example: # - name: centos-7.
- In line 35, paste the network interface name that will be used to bridge network connections. For example: - ["public_network", { bridge: "en0: Wi-Fi (AirPort)" }]

To set up your kitchen, enter: kitchen create

Chef Test Kitchen will download and launch all VMs in VirtualBox. This may take some time.

To confirm your kitchen is running, enter: kitchen list

The list will show you the VMs with the last action Created:

# kitchen list
Instance                    Driver   Provisioner  Verifier  Transport  Last Action  Last Error
asa-chef-ubuntu-1804        Vagrant  ChefZero     Inspec    Ssh        Created      <None>
asa-chef-centos-7           Vagrant  ChefZero     Inspec    Ssh        Created      <None>
asa-chef-debian-107         Vagrant  ChefZero     Inspec    Ssh        Created      <None>
asa-chef-opensuse-leap-152  Vagrant  ChefZero     Inspec    Ssh        Created      <None>
asa-chef-oracle-82          Vagrant  ChefZero     Inspec    Ssh        Created      <None>

Run cookbook in the test kitchen

To install and enroll your servers on ASA, enter kitchen converge.

Chef will execute the cookbook and enroll all your servers in ASA.

In ASA, you will see your server enrolled in your project:

To confirm the recipe worked, enter kitchen verify:

-----> Verifying <asa-chef-oracle-82>...
Loaded tests from {:path=>".asa-chef-example.test.integration.default"}

Profile: tests from {:path=>"/asa-chef-example/test/integration/default"} (tests from {:path=>"asa-chef-example.test.integration.default"})
Version: (not specified)
Target: ssh://vagrant@127.0.0.1:2203

Service sftd
  ✔ is expected to be installed
  ✔ is expected to be enabled
  ✔ is expected to be running

Test Summary: 3 successful, 0 failures, 0 skipped
Finished verifying <asa-chef-oracle-82> (0m1.09s).

At this moment, your servers are enrolled in ASA. That means you can access your servers with users and groups from Okta associated with your project.

Test access to servers with Okta

Now that all servers are enrolled in Okta, let’s access the servers as a user:

Install the ASA agent on your workstation (required to access servers as a user):

brew install okta-advanced-server-access --cask

To set up the ASA agent, enter sft enroll and follow the instructions on the screen.

To see your servers, enter sft list-servers.

# sft list-servers
Waiting on browser...
Browser step completed successfully.
HOSTNAME                    OS_TYPE  PROJECT_NAME       ACCESS_ADDRESS
asa-chef-ubuntu-1804        linux    Frederico_Servers  192.168.0.101
asa-chef-centos-7           linux    Frederico_Servers  192.168.0.102
asa-chef-debian-107         linux    Frederico_Servers  192.168.0.103
asa-chef-opensuse-leap-152  linux    Frederico_Servers  192.168.0.104
asa-chef-oracle-82          linux    Frederico_Servers  192.168.0.105

To ssh into your server, enter sft ssh <name-of-your-server>:

# sft ssh asa-chef-ubuntu-1804
/home/frederico_hakamine #

Note: Wait… what does Okta ASA do for the login?

Okta ASA secures access to Linux and Windows servers over SSH and RDP connections using the server agent (the same one that enrolled the server in your project earlier). The ASA server agent, in addition to subscribing your server to a project, also works alongside native OS features such as sudo, users, and openssh to control access during runtime and to capture any login events for audit inspection. Because the agent is light and does not require firewalls or monolithic LDAP or privileged access servers, it can be easily distributed across any infrastructure (IaaS, VM, or physical) and embedded in your DevOps tools.

To grant users access to servers, ASA operates a programmable Certificate Authority service as part of its SaaS that issues ephemeral SSH client certificates for each authenticated and authorized request. The certificates are issued only after ensuring that both the user and their device comply with the organization’s security policies.

The use of ephemeral keys provides many benefits: it eliminates static keys and credentials for server access, ensures that both users and machines are audited before any new SSH connection, simplifies access revocation, eliminates the risk of “super account overuse”, and simplifies access audits.

What’s next?

After testing the cookbook in your Chef Kitchen, you can use it in your Chef Infra. To do so, tweak the cookbook (e.g. store the enrollment token in chef-vault or a data bag), upload it (knife upload asa-chef-example), and add the cookbook to specific roles and environments within your infrastructure:
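As an illustration of the data bag tweak mentioned above, a recipe could read the token from a data bag instead of a plain attribute. A minimal sketch, with hypothetical bag and item names:

# Assumes a data bag created beforehand, e.g.:
#   knife data bag create asa enrollment
# where the 'enrollment' item holds a 'token' key.
token = data_bag_item('asa', 'enrollment')['token']

# Override the attribute the recipe already reads.
node.default['asa_enrollment_token'] = token

This keeps the secret out of the cookbook itself, so the same cookbook can be promoted across environments unchanged.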

You can also turn on additional features in Okta ASA, such as sudo grants, time-based access, bastion hosts, and SSH session capture, to name just a few. All these features reduce the load on your cookbooks and provide consistent account configuration across multiple servers.

If you’d like to see more information like this, consider following us on Twitter, subscribing to our YouTube channel, or reading through some of our other DevOps articles!

- A Developer’s Guide to Docker
- Container Security: A Developer Guide
- Add Docker to Your Spring Boot Application
- Build a Simple .NET Core App on Docker
- Using Okta Advanced Server Access & Terraform to Automate Identity & Infrastructure as Code
- Tutorial: Ansible and Account Automation with Okta
- Tutorial: Puppet and Account Automation with Okta

Tuesday, 09. February 2021

Meeco

Technical Team Lead

Before we dive into explaining this amazing opportunity, we don’t want to waste your time. So, if you don’t have Australian citizenship or permanent residency, or if you are representing a recruitment agency or an offshore provider, then this is not for you. We understand you may be an awesome candidate, or have access to great candidates, but we will not consider such applications at this time.

Meeco is looking for a Technical Team Lead for our Australian team, where you can help shape the API-of-Me. What is the API-of-Me, you might ask? It is an emerging platform that enables the collection, protection and exchange of personal data. Our main objective is to provide a suite of developer tools and APIs to companies and organizations to allow them to build privacy-first solutions: a secure data enclave that allows people to keep their data safe and share information with the people and organisations they trust.

We are passionate about building a new economy where data is not refined, as is done with oil, for the power and profit of a few mega-corporations. Rather, we are committed to finding a new equilibrium where people and organizations enter a win-win relationship by building mutual trust in how data is collected, protected and shared. Read our manifesto to learn more.

Don’t hesitate to apply, no matter what your background is. We would like to build a diverse team, and we welcome everyone. Our team is spread across Australia and Belgium, and collectively we come from diverse cultural backgrounds, speaking a range of languages, each with a unique personal data story.

About the Role

We are a global team of 25 people based in Australia, Belgium and the UK. The platform team currently consists of 17 people, responsible for designing, implementing, supporting and documenting several services that, all together, form the backbone of our platform. Currently our technology stack consists mainly of several Ruby on Rails, Elixir and Java components for the backend services. For the frontend, we primarily use JavaScript and Dart for smartphone and decentralised applications, based upon blockchain and distributed ledger technology. We’re a company that embraces innovation, so it is likely that our technology stack will continue to evolve and change. We are not averse to adding other technologies if the need arises for specific services.

We take pride in building software that is easy to understand, clean, well-structured and properly tested. We have formed our own opinions on how to write good code, the result of programming for more than two decades. Sometimes, we have even found ourselves going against the popular vote on several topics:

- We favour the explicit over the implicit
- We treat frameworks as tools and not as an unquestionable authority
- We embrace the relational database and its possibilities, and as such treat it as a first-order citizen.
The following will help you understand what you’ll be doing on a day-to-day basis:

- Being a technical leader; mentoring, triaging problems, offering a guiding hand
- Coming up with a detailed plan on how to approach high-level requirements, including tasks and estimations
- Supporting the team to allocate and distribute the work according to individual capabilities
- Designing and implementing new services with business logic called from existing & new endpoints
- Investigating bug reports that inevitably occur, helping a colleague identify problems
- Reviewing and approving Pull Requests
- Implementing data integrations to enable end-users to access public and private data; this might include open banking, social, health, wearable, weather, travel or IoT data
- Working with the latest W3C & DIF standards for distributed ledger implementations, including Verifiable Credentials and Distributed Identity, including Self-Sovereign Identity
- Working on leading-edge payments solutions, including micro-payments and tokens
- Contributing to a more transparent, open and trust-based personal data ecosystem.

About You

We are looking for someone who has a deep understanding of how to write RESTful resources exposed as APIs, cares about clean code (a lot) and is ready to learn (a lot). It is a big plus if you have experience writing these web-based applications in Ruby on Rails and/or in building decentralized applications with blockchain technologies.

We build applications for the internet of tomorrow, a place where personal data is placed right at the center, protected start to finish. This means we often tackle questions that have no established technical solutions. Google won’t help you as much in this case, as it requires a combination of existing tech and emerging solutions. One class of technology that we are exploring and see great potential in is the distributed ledger space. A good understanding of, and professional experience building, decentralised applications or dealing with DLT is a big plus.

Meeco is an environment with more questions than answers. If you are drawn to original problems which require imaginative solutions, you’ll feel right at home. This does require you to be open to and comfortable with learning new things and developing proof-of-concepts on short timelines, identifying the tasks that matter the most.

In addition to established experience as a developer, we expect that you have at least 2+ years of experience in steering a team using an agile methodology, be it Scrum or a derivative. You are able to support our executive team (Technology, Product, Design and Commercial Officers) in writing up technical proposals, along with translating Meeco’s technical vision into practical goals that support the commercialisation of our products.

The following is our crazy wish-list. It is not necessary that you have all this experience, but the more proven competencies you have, the greater your chances of success in the role:

- You have taken the lead on directing and implementing solutions to complex, unbounded (technical) problems based on limited scope
- You have a track record of setting direction for teammates
- You know how to take steps in improving the quality of existing and future software
- You are comfortable communicating your ideas both verbally and in writing. We love to create diagrams to describe future systems.
- You have used more than one programming language to build software, which can of course include hobby and passion projects
- We have a soft spot for functional programming, so maybe you have too
- We build both backend and frontend applications, so any relevant frontend programming experience, especially with JavaScript, TypeScript and/or Angularjs, would be awesome
- You have used languages with built-in concurrency models like Go, Erlang, or Elixir. If you have a language you love, tell us about it!
- You have a good understanding of what goes on in an RDBMS, especially Postgresql
- You are familiar with devops; Microsoft Azure would be a bonus
- You know how to use Docker and Kubernetes.

We don’t expect you to know everything from the start, and we have a team to show you the ropes. A good set of solid programming skills, natural leadership and caring about our mission is all you need. You might have a computer science or related degree, or you might not. This is not what we are about. We care about what you know now and how you are improving yourself every day.

This is a job where we support remote work. However, we do have an office at Stone & Chalk at Docklands in Melbourne. Given the leadership and mentoring aspects of the role, we would like you to be in the office at least three days a week. At other times you are free to work where you work best. We have team members in Adelaide, Melbourne and Sydney in Australia, London (UK) and spread out in Belgium. An important reality is that this can often mean early mornings or long days. We have flexible hours, so you can design your day around being an early bird or night owl. For six months of the year the time difference with Europe can feel brutal, and then the other six months, when Europe is on summer time, it is much easier. However, being able to work across time zones, self-directed, is a critical aspect of this role. We value people who can work independently, are able to manage themselves, are supportive of each other and are not afraid to ask for help when needed. We love you asking for help!

What We Offer

We pay a competitive start-up package based on your experience and seniority, in line with the existing members of the team. As we are still building our foundation team, additional incentives such as options are available for the right person.

What We Have Achieved

We are early pioneers in the personal data economy and have been campaigning for digital rights, data portability and data control since 2012. You can read more about our achievements and announcements on our blog.

- Funding: our founder bootstrapped the company for the first two years before raising private equity. We have runway and revenues, so the focus for us now is building awesome products.
- Brand & Reputation: We launched our first product into the market in 2014 and have built a strong brand in the personal data community. Search for Meeco online and you will see our consistent focus on helping build a global personal data economy. If you would like to know more about our crazy, wonderful, roller-coaster start-up journey, check out this recent keynote at the MyData Conference in Helsinki, or recent podcasts such as this one on entrepreneurship or this one on identity featuring our founder, Katryna Dow.
- Awards: We have won eight international awards in Australia, the UK, the Netherlands and Germany for personal data management, identity, innovation and FinTech services.
- Market: In December 2019 we launched our secure data vault inside KBC’s (Belgium) banking app, and next month we are launching a media platform for kids called mIKs-it. These are just two great examples of how enterprises and other start-ups are using our technology to develop better Privacy and Security by Design personal data solutions.
- Innovation: Through 2020 we doubled our engineering team, along with some really interesting blockchain/distributed ledger projects which we are currently productising to launch throughout 2021.

How to Apply

By reading this you have reached the end, so if you feel you are the right person for the job, all you have to do is contact us at jointheteam@meeco.me. Please include what you like about the role, why you may want to join, and any questions, along with your resume and confirmation of your Australian citizenship or residency.

Last time we advertised we had hundreds of applications. Whilst we were delighted, it took us a few weeks to consider the long-list and juggle interviews between Australia and Europe. If you meet the criteria, we will be in touch with you as soon as possible. We’re all looking forward to meeting you soon!

The post Technical Team Lead appeared first on The Meeco Blog.


Global ID

The GiD Report#146— The Elon effect and the need for Trusted Communities

The GiD Report#146 — The Elon effect and the need for Trusted Communities

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

- Elon ❤ Bitcoin
- Podcast highlights
- This week in Big Tech
- The need for Trusted Communities
- Chart of the week (Big Tech profits)
- Stuff happens

1. The news everyone is talking about — Elon Musk goes big into Bitcoin (with Tesla).

From their SEC filing (via /j)

Thereafter, we invested an aggregate $1.50 billion in bitcoin under this policy and may acquire and hold digital assets from time to time or long-term. Moreover, we expect to begin accepting bitcoin as a form of payment for our products in the near future, subject to applicable laws and initially on a limited basis, which we may or may not liquidate upon receipt.
Photo: Pewdiepie/YouTube

So not only are corporations and institutions leaning into Bitcoin and crypto, you’ll be able to buy a Tesla with BTC.

It’s another giant leap toward mainstream adoption.

My Wall Street contact had some thoughts on how such moves might impact potential regulation:

I think the most bullish thing is to have at least one large public American company with significant holdings as the government undertakes writing regulations. It’s super bullish to have the government writing regs knowing that they don’t want to crush the market value of significant corporate holdings. It’s different if they think they’re just crushing drug dealers and money launderers.

Makes sense.

2. This week in Big Tech — podcast highlights edition.

Greg Kidd on breaking up Facebook, Lina Khan, and the best thing about America:

3. This week in Big Tech:

Facebook v. Apple:

Hoping to prevent users from being spooked by Apple’s upcoming privacy changes, Facebook is testing a notification that tells iPhone users how it uses their data to tailor ads to them, Axios’ Sara Fischer reports.
The big picture: The test is happening in light of upcoming changes to Apple’s privacy settings that will require iPhone owners to give apps — including those from Facebook — permission to collect data for ad targeting.

Relevant:

- Tech workers take tentative steps toward unions
- Unions in the COVID era have been revitalized after decades of losing power
- Via /mayak: Former Google CEO Fears ‘Chilling Effect’ of Antitrust Probes
- Klobuchar introduces sweeping antitrust reform bill
- Facebook’s booming business, sinking reputation
- Facebook warns advertisers on Apple privacy changes
- Via /gregkidd: Who is in Charge? Tech Elites or the Digital “Mob”?
- Josh Hawley proposes “preemptive” ban on Big Tech mergers
- Twitter Temporarily Blocked Accounts Critical Of The Indian Government
- Via /gregkidd: E19: Robinhood’s GameStop decision: Why did it happen and how can it be prevented in the future?

4. The need for Trusted Communities:

Wired thinks Facebook Groups are “destroying America” — in part due to low trust:

But as our research shows, those same features — privacy and community — are often exploited by bad actors, foreign and domestic, to spread false information and conspiracies. Dynamics in groups often mirror those of peer-to-peer messaging apps: People share, spread, and receive information directly to and from their closest contacts, whom they typically see as reliable sources. To make things easier for those looking to stoke political division, groups provide a menu of potential targets organized by issue and even location; bad actors can create fake profiles or personas tailored to the interests of the audiences they intend to infiltrate. This allows them to seed their own content in a group and also to repurpose its content for use on other platforms.

Relevant:

- Facebook Groups Are Destroying America
- Facebook Knew Calls for Violence Plagued ‘Groups,’ Now Plans Overhaul
- NYTimes: How to fix Facebook Groups

Discord has its own trust issues:

- Discord Scammers Lure Users to Fake Exchange With Promise of Free Bitcoin — CoinDesk

5. Chart of the week:

6. Stuff happens:

- Mitja Simčič provides secure digital identities with his startup in San Francisco
- Via /jvs: DIF is opening up: Join for free
- Zelle® Closes 2020 with Record $307 Billion Sent on 1.2 Billion Transactions
- Via /j: French DJ David Guetta says fair if festivals require vaccinations