Last Update 8:56 AM February 06, 2023 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Monday, 06. February 2023

MyDEX

Artificial Intelligence? No. Authentic folly on a grand scale

Image generated by AI using openai.com/dall-e-2

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper.

Do you have high hopes (or fears) of a future dominated by artificial intelligence? If so, the story of IBM’s venture into AI with Watson Health provides a cautionary tale.

Watson began 10 years ago, in 2012, when IBM decided it could use AI to make medical treatment recommendations that, it would soon be claimed, would put doctors out of their jobs.

Watson would “quickly find hidden insights … in massive volumes of data” to revolutionise healthcare, claimed IBM. It would transform medical imaging, transform our ability to diagnose rare diseases and extend treatments for scourges like cancer to where doctors are scarce (in places like India). It would understand each individual’s genetic profile to create a new era of personalised health care. And, because it could read and absorb far more medical research than any doctor could ever do in a lifetime, it would transform the ability to provide medical advice. Doctors would be increasingly redundant.

IBM’s PR machine went into overdrive, as did a sycophantic gullible media.

Ten years later, IBM quietly sold Watson Health for $1bn. Wow! you might say. That’s a lot of money! But it was nothing compared to what IBM had poured down the drain: over $5bn spent acquiring as many databases as it could, plus the costs of the 7000 people employed on the project. Watson Health — the ‘moonshot’ that was going to revolutionise health care globally — was a total failure.

Even while IBM was touting Watson as a saviour of mankind, doctors were telling the company it was useless. Either it told them stuff they already knew or made recommendations that were inappropriate for individual patients. Its advice didn’t travel either. The recommendations it gave to medics in India just didn’t fit Indian circumstances because the AI was trained by American doctors in American hospitals.

A fundamental fallacy

Many reasons have been given for Watson Health’s failure. Perhaps, if it had been trained by Indian doctors in Indian hospitals, it would have been more helpful in India, for example. (Perhaps, though at what cost?).

Many other criticisms of AI are also unfair: those about racist and sexist outcomes, for example, where the real fault lies with the people managing the process, not the technology itself. But what lies behind the Watson fiasco is a fundamental misunderstanding of the nature of artificial intelligence itself: a belief that it is somehow replicating or replacing human intelligence.

This fallacy is now so widespread that it’s almost accepted wisdom. For example, both the UK and the Scottish governments are building AI and digital strategies that define AI in terms of computers doing “tasks commonly (or normally) requiring human intelligence”.

Watson-fed hype about the supposed feats of AI led Yuval Noah Harari (the supposedly learned author of the best-selling book Sapiens) to declare that with AI, “Humanity will create something more powerful than itself. When you have something that understands you better than you understand yourself, then you’re useless. Anything you can do, this system can do better.”

The UK Chancellor of the Exchequer Jeremy Hunt expressed the same thought when, in a recent speech, he asked “Who needs politicians when you have AI?”. No doubt he was joking, but the very fact that he could make such a quip illustrates how far the hype has gone. Because that’s what it is: hype, born of deep misunderstanding.

So-called artificial intelligence is nothing more or less than a machine — a computer — crunching lots of numbers using mathematical and logical functions in a strictly mechanical fashion. Er … that’s it. The term ‘artificial intelligence’ would lose its glamour (and its enormous value to hand-waving snake oil salesmen) if it were described as ‘dumb routine calculation at massive speed and scale’. But that’s what it is — and here is the essential point: such an ability to calculate does not equal human intelligence. AI does not ‘understand’ anything. The two are like chalk and cheese.

AI is good at solving logical, computable problems with clear, definite boundaries using clearly defined, fixed rules to achieve clearly defined, fixed answers. Like a game of chess, or recognising a face, or processing the way people use language. That’s immensely valuable. We at Mydex are enthusiastic about tools that do such things. But they are not ‘intelligent’.

Human intelligence did not evolve to solve logic puzzles like chess. It evolved to help humans survive by adapting to their environments — in a world where the most important problems are open-ended, with unclear boundaries (where one problem is likely to blur into other problems), where there are no clearly defined best answers and where, if we do find some good rules and answers, chances are that things will change so we need to change them for tomorrow. Human intelligence is designed for a changing world full of other people, in other words.

Spouting nonsense

Human intelligence is biological: embedded in a living, feeling body that is the product of billions of years of evolution, that has its own individual history and lives in a society with its own history, culture, morals, values and norms.

AI has none of this. As a result, it has no common sense. No concept of causation. No ability to ask ‘what if?’ questions. No imagination. No creativity (unless programmed to combine random elements, which is a million miles away from purposeful, intuitive human creativity). No ability to use analogy or metaphor (e.g. noticing the ways in which X is ‘like’ Y in some ways, but not in others). No ability to ‘level jump’ — to question rules as well as operate by them. For example, it cannot stop what it is doing to ask “Hang on! Does this actually make sense?” (which is why it is sometimes very good at generating nonsense). No ability to switch attention (by definition, because it never actually pays attention to anything). No feeling. No empathy. No awareness of context. No ability to sense situations and judge appropriateness. And so on. And so on.

In fact, as well as displaying all the drawbacks and dangers of Big Data discussed in our last blog, AI is so dumb that it is a major feat of AI research — a testimony to researchers’ genius — to teach it, when presented with a picture of a cup of tea on a saucer with a spoon, to identify the boundaries of the tea cup; and an even bigger feat to teach it that the tea cup, saucer and teaspoon are separate objects. Something that every human child knows instantly without even thinking about it.

Fish or fowl?

AI’s failures on these fronts aren’t due to a lack of access to data or computing power. They stem from the fact that computation and human intelligence are qualitatively different things. Teaching an AI to think like a human is like teaching an eagle to breathe water and swim, or a tuna to breathe air and fly. It’s not going to happen, because neither the eagle nor the tuna is designed to do these things. No matter how much data, computing power or money you throw at computer-driven number crunching, it can never replicate a human being’s thinking, because they are designed in different ways to do different things.

Talking of design, you may have noticed I used the term ‘an AI’ in the last paragraph. That’s deliberate. There is no such thing as ‘artificial intelligence’ in general, aside from a human discipline: a collection of (very human) skills and knowledge about computing. In practice, there are only specific AI tools designed to do specific things, such as recognise faces or play chess.

AI tools designed to do these highly specific tasks cannot suddenly decide to do something else such as natural language processing, just as a mechanical tool such as a drill or kettle cannot suddenly decide to become an aeroplane. Both AI tools and mechanical tools are designed and built (by humans) to do specific things, and only those things.

So when the UK and Scottish Governments declare that AI is about computers doing things that normally require human intelligence, they are spouting nonsense. In fact, the exact opposite is true. The reason AI can be valuable is that it can do one thing that humans cannot do — crunch huge amounts of numbers in no time at all to identify correlations and patterns. That’s fantastic. We should use that ability for all it’s worth. But as soon as we ask, hope or expect AI to do anything more than that, all we are doing is wasting time, effort and money and risking disaster. As with Watson Health.

A deep-rooted misunderstanding

To be fair to the UK and Scottish Governments, this deep, fundamental misunderstanding of the nature of the beast has long, venerable roots. It goes back to a seminal paper published in 1950 by the father of AI, Alan Turing, in which he asked “Can machines think?”

In that paper, entitled ‘Computing Machinery and Intelligence’, he introduced the idea of ‘learning machines’ and laid out what would become an AI research agenda for the next three quarters of a century. Pure genius. But the paper did NOT answer ‘Yes’ to the question ‘Can machines think?’.

Instead, in the very first paragraph, Turing ducked his own challenge and asked a different question instead: whether machines can imitate human thinking. He then spent the next half of the paper narrowing his ‘imitation game’ down, further and further, setting aside all the ways in which machines cannot imitate human intelligence, to finally focus on one tiny area of overlap: making calculations using numbers.

Turing’s paper is therefore seminal in two ways: in the research agenda it created, and in the endless misunderstandings it spawned. Today, we are having to deal with both consequences — including folly on a grand scale resulting in:

Billions of £/$/€ and decades’ worth of effort being spent on wild goose chases such as Watson Health
Immense harm being done by multiple misapplications of AI, arising from people believing it can do things that it can’t
A failure to invest available resources in uses of data that could bring immense personal, social and economic benefits very quickly.

In this blog series we have shown how providing every citizen with their own personal data store can ease citizens’ access to services, cut the costs of providing these services, improve people’s experience of using them, and join them up to produce better outcomes — while also opening up new dimensions of innovation and accelerating and enriching the journey to net zero.

Yet, in an extreme case of selective attention — of counting bounces while ignoring gorillas — all these immediate, practical, definite benefits are being ignored in favour of expensive, futile pipedreams born of gullibility and ignorance. Just imagine the real, demonstrable benefits, as outlined above, that Mydex personal data stores could deliver if Mydex had $5 billion, 7,000 staff and a decade to invest in providing personal data logistics infrastructure to inform all service provision!

Conclusion

To stress once more: we at Mydex are all in favour of AI, deployed correctly, to do what it can do. We have high hopes of personal AI, which crunches individuals’ own personal data to help them make better decisions and manage their lives better — working for them, rather than doing things to them — while protecting their privacy.

Right now, as part of the Peoplehood project for example, we are collecting billions of data points from individuals’ smart meters to identify abnormal behaviour patterns for the old and frail — to generate alerts if they have a fall, for example.

This is all wonderful stuff. But it is not a panacea. Policy- and decision-makers’ fixation on Big Data and AI is one reason why the real, immediate, practical opportunities presented by personal data stores are being overlooked. But there is another reason too: mainstream economic theory is blinding people to the economic potential of personal data. That’s the subject of our next blog.

Artificial Intelligence? No. Authentic folly on a grand scale was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 03. February 2023

FindBiometrics

Illinois Supreme Court Sets Statute of Limitations for BIPA Claims

The Illinois Supreme Court has ruled that the state’s Biometric Information Privacy Act (BIPA) has a five-year statute of limitations. The decision is the result of a proposed class action lawsuit against […]

Shyft Network

Shyft DAO Ambassador Program Proposal is Now Live


Shyft DAO is taking one of its first major steps — proposing an Ambassador Program aiming to create a vibrant community. We invite you to comment, cast your vote and be part of this special moment! 🎉🎊

The DAO proposal for required funds to launch the Shyft Ambassador Program is open for comment until 17:00 UTC on 4/2/2023. It will be followed by a Snapshot vote, beginning at 17:00 UTC on 5/2/2023. The ask is for 300,000 SHFT tokens and $31,400.

The goal of the Shyft Ambassador program is to decentralize the protocol further, expand our reach and allow community members to contribute to Shyft’s future.

Shyft Ambassadors will promote the DAO and its initiatives, organize local events, mentor new joiners, create educational content, moderate Discord and Telegram channels, and more.

We encourage you to leave your feedback and vote on the proposal. Your vote will determine the future of the Shyft Ambassador Program and help guide our community’s growth.

We look forward to your active participation in this initiative 🙌🏼

Shyft DAO Ambassador Program Proposal is Now Live was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


FindBiometrics

Taiwan Cracks Down On Biometrics Talent Poaching: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: CCPA Gets New Consumer Protections The California […]

This week in identity

E21 - Saviynt Raise $205M / Radiant Logic to acquire Brainwave GRC / SiberX Toronto / Future of Cyber Manchester


This week Simon and David review some interesting moves in the identity governance and administration space.  First up Saviynt raised $205 million (along with founding CEO Sachin Nayyar returning as CEO after a stint at Securonix) to bolster their Enterprise Identity Cloud offering.  Next up they discuss Radiant Logic entering into a definitive agreement to acquire French IGA specialist Brainwave GRC.  What does this tell us about the global IAM and IGA space?  Where will they head to?  Will more funding and acquisitions happen in 2023?  They also review SiberX CISO Forum in Canada and the Future of Cyber conference held in the UK.


Extrimian

Exploring the Impact of AI and Blockchain in the Digital World


Artificial Intelligence (AI) is revolutionizing the blockchain ecosystem, enabling digital identity and computerized intelligence. As AI technology advances, it has the potential to transform the way we interact with blockchain networks and how we manage our digital assets.

In this blog post, we’ll explore how AI is driving innovation in the blockchain space and what implications it could have on our lives. We’ll also look at some of the use cases of AI in blockchain, such as machine learning and automated transactions. Finally, we’ll discuss some of the challenges that need to be addressed before this technology can reach its full potential.

 

Why is there so much talk about AI and Chat GPT lately? What are its benefits and risks?

Chat GPT (Generative Pre-trained Transformer) is a form of artificial intelligence that is becoming increasingly popular in the world of chatbots. It is a natural language processing system that uses machine learning to generate human-like responses to user queries. This technology has been developed to allow chatbots to understand and respond more accurately to the conversations they engage in.

This is a powerful tool and has the potential to revolutionize how businesses interact with customers. With its natural language processing capabilities, it can respond to customer queries in real-time, providing them with personalized responses and even recommending products or services based on their conversations.

Additionally, Chat GPT is able to understand the emotional context of conversations, allowing it to better respond to customers’ emotional needs. The use of artificial intelligence and Chat GPT is becoming increasingly popular, with many businesses turning to it to help improve customer service.

While the technology does have its benefits, it also has potential dangers. For example, Chat GPT can be used to manipulate users, as it can tailor its responses to certain information it has gathered from the user.

Introduction: What are Artificial Intelligence (AI) and Blockchain, and How Can They Be Used to Transform Businesses?

As we have already said, AI and blockchain are two of the most cutting-edge technologies with the potential to revolutionize businesses. Artificial Intelligence (AI) is a branch of computer science that enables machines to think and act like humans. Blockchain, on the other hand, is a distributed ledger technology that provides a secure way to record digital transactions.

When used together, AI and blockchain can give businesses a competitive edge. They can be used to automate mundane tasks, improve data security, enhance customer experience, and develop new applications quickly and efficiently, while offering advantages such as faster transaction processing times and cost savings.

The Benefits of Combining AI & Blockchain for Businesses

The combination of AI and blockchain is becoming a popular trend in the business world. Together they can help businesses improve their data security, reduce operational costs, and create innovative applications. As the two become more integrated, businesses will be able to leverage them for better customer experience, cost savings, and improved data security. With AI and blockchain working together, businesses can create secure applications that are faster and more efficient than ever before. Furthermore, these technologies can help companies protect their sensitive data from cyber-attacks while providing customers with a seamless user experience.

How AI & Blockchain are Transforming Industries like Finance, Healthcare & Retail

Artificial Intelligence (AI) and Blockchain technology are two of the most revolutionary technologies of our time. They are helping to transform a wide range of industries, from finance and healthcare to retail.

The banking industry has been one of the first to benefit from AI and blockchain technology. Banks can now use AI-based systems to detect fraud faster than ever before, while blockchain-based smart contracts automate financial transactions, reducing costs and time spent on manual processes.

In the healthcare industry, AI is being used for medical diagnostics and treatment recommendations, while blockchain technology is being used to securely store patient data. Meanwhile in the retail sector, AI-powered chatbots are providing customers with personalized shopping experiences, while blockchain is helping retailers track shipments more accurately.

These are just some of the ways that AI and blockchain are transforming industries like finance, healthcare and retail – a trend that will only continue in the years ahead.

Best Practices for Leveraging AI & Blockchain in Businesses

AI and blockchain technology are transforming the way businesses operate. AI can help automate and streamline processes, while blockchain technology can provide secure, transparent data storage and transactions. For businesses looking to take advantage of these technologies, it is important to understand the best practices for leveraging them effectively.

From understanding the different types of AI and blockchain applications that are available to developing strategies for implementation, there are many steps that businesses should take in order to maximize the benefits of these technologies. Additionally, organizations need to consider how they will protect their data from external threats while also ensuring compliance with applicable laws and regulations. By following best practices for leveraging AI and blockchain in business operations, companies can ensure they are taking full advantage of these powerful technologies.

10 Fresh AI Projects

What Is AI and What Are Its Use Cases?

This is a technology that enables machines to learn, think and act like humans. It is an advanced form of computing that uses algorithms to process large amounts of data and generate insights. AI has been used in various industries such as healthcare, finance, retail, and automotive.

AI can be used for a variety of tasks such as natural language processing (NLP), computer vision, predictive analytics, robotics, and more. This tech can help businesses automate mundane tasks and make decisions faster and with more accuracy. Examples of AI projects include self-driving cars, facial recognition systems, virtual assistants like Alexa or Siri, automated customer service chatbots, and more.

The use cases for AI are endless – from helping with medical diagnosis to providing personalized recommendations for customers. With the right strategy and implementation plan in place, businesses can leverage the power of AI to gain a competitive advantage in their industry.

AI Projects for Healthcare and Medicine

Artificial Intelligence is playing an increasingly important role in the healthcare and medicine sector. AI projects are being developed to help diagnose diseases, analyze medical images, and even assist in surgery. AI-powered robots are being used to perform complex medical procedures with greater accuracy and precision than ever before.

AI is also being used to develop medical imaging systems that can detect anomalies in patient scans quickly and accurately. This technology can be used to diagnose a wide range of conditions from cancer to heart disease more quickly and accurately than ever before. AI-powered systems are also being developed for drug discovery, which could revolutionize the way that new treatments are developed for diseases.

AI in Business & Finance

This technology is revolutionizing the way businesses and financial institutions are operating. AI-driven machine learning algorithms are being used to process large amounts of data, detect patterns and trends, and make predictions. This is leading to breakthroughs in financial analytics, fintech innovation, and smarter decision making.

AI has the potential to improve efficiency in many areas of finance including investment analysis, risk management, fraud detection, customer service automation, portfolio optimization, and more. As AI technology continues to evolve at a rapid pace it will become increasingly important for businesses and financial institutions to stay ahead of the curve by investing in AI-driven solutions.

AI in Education

AI (Artificial Intelligence) has the power to revolutionize education and make learning more accessible, interactive, and enjoyable than ever before. This technology is enabling students from all walks of life to access excellent educational resources regardless of their geographic location or economic status. AI also allows for personalized learning experiences based on an individual’s abilities, and provides solid data and insights into students’ performance that teachers can use to better meet each student’s needs. All in all, there are many possibilities with this rapidly evolving technology!

From voice recognition software such as Amazon Alexa and Google Home that provides quick answers to common questions, to virtual reality programs that create immersive classrooms, smart tutors offering support tailored to every learner’s progress, and game-based assessment tools designed to evaluate understanding in a fun way – there is no shortage of engaging ways to leverage cutting-edge artificial intelligence for educational purposes today. Whether your child attends public school or you have opted for homeschooling due to the COVID-19 pandemic, AI could be just what they need when it comes to taking those first steps towards developing meaningful relationships with computers while expanding their knowledge base through exciting new technologies.

AI-Based Solutions for Transportation & Logistics

With the help of artificial intelligence (AI) and machine learning (ML), organizations can now automate and optimize their processes to make transportation and logistics operations more efficient and cost-effective. AI-based solutions can help with route optimization, predictive maintenance, and more. 

Route optimization is one of the most popular AI-based solutions for transportation and logistics. AI-based route optimization software can analyze a wide range of factors, such as traffic, weather, driver availability, customer demand, and more, to determine the most efficient and cost-effective route for a delivery. This helps transportation and logistics companies save time and money, while also reducing their environmental impact. 

AI-based predictive maintenance is another popular use case for transportation and logistics. Predictive maintenance software can monitor the condition of vehicles and other equipment in real-time and alert operators when maintenance is needed. This helps to prevent breakdowns and reduce the cost of repairs, as well as increase safety and reduce accidents.

Autonomous vehicles are also being used to revolutionize the transportation and logistics industry. They are equipped with AI-based systems that enable them to navigate safely on their own.

Robotics & Autonomous Machines

When it comes to robotics and autonomous machines, AI is a powerful tool that can help unlock the potential of these machines. AI can be used to help robots and machines to think, learn, and make decisions – leading to more efficient and intelligent operations. AI can be used in a variety of ways in robotics and autonomous machines, from helping with logistics automation to powering autonomous vehicles and smart transportation systems. Here are a few of the best AI use cases for robotic and autonomous machines. 

Logistics Automation with AI: AI can be used to power logistics automation, helping to streamline the process of transportation and delivery. Through automated route planning, AI algorithms can help to optimize the delivery of goods by reducing cost and time. AI can also be used to detect and identify objects and obstacles, helping robots and autonomous machines navigate their environment safely and efficiently. 

Autonomous Vehicles: Autonomous vehicles are becoming increasingly common, and AI is an essential tool in helping to power these vehicles. AI algorithms can help autonomous vehicles identify and respond to their environment, as well as make decisions in real-time. This can help to increase the safety and efficiency of autonomous vehicles, while also reducing costs.

Computer Vision-based Projects Using AI

Computer vision-based projects using AI have opened up a variety of possibilities for businesses, allowing them to automate and optimize processes while reducing costs and gaining insights into customer behavior. With the advancements in AI and computer vision, businesses can now use computer vision to gain valuable insights and implement automated processes.

One of the most popular use cases for computer vision projects is logistics automation. AI-powered computer vision can be used to automate the tracking and monitoring of goods, including inventory and delivery tracking. AI-based systems can also be used to automate the sorting of items when they arrive at warehouses or distribution centers. This automation can save businesses time and money, as well as reduce errors in the process.

Another popular use case for computer vision projects is autonomous vehicles. AI-based computer vision can be used to develop autonomous vehicles that can navigate roads and traffic without human intervention. These vehicles can be deployed in a variety of scenarios, from delivery services to passenger transport. AI-powered computer vision can help detect objects in the environment, recognize traffic signs, and make decisions about which route to take.

Finally, AI-based computer vision can be used to create smart transportation systems. Computer vision can be used to detect and identify objects in

Internet of Things

What is the future of the relationship between artificial intelligence and blockchain technology?

As the technological landscape continues to evolve, we are seeing the emergence of a new generation of technologies that are transforming the way we interact with the world. Artificial intelligence (AI) and blockchain technology are two of the most impressive and disruptive technologies to emerge in recent years. Both are highly versatile and offer a wide range of potential applications. But what is the future of the relationship between AI and blockchain? How can these two technologies work together to drive innovation and create new opportunities?

The potential of AI in the field of blockchain technology is significant. AI can be used to automate complex processes, such as the execution of smart contracts. This automation could revolutionize the way that data is shared and stored in the blockchain. Additionally, AI can be used to create more efficient and secure authentication and authorization systems, which are necessary for blockchain networks to function properly.

AI also has the potential to revolutionize the logistics industry. Autonomous vehicles and smart transportation systems can be used to automate the delivery of goods and services. AI can be used to optimize the routing of these vehicles, resulting in the efficient and cost-effective delivery of packages. Additionally, AI can be used to streamline the process of tracking goods and ensuring that they are delivered on

The post Exploring the Impact of AI and Blockchain in the Digital World appeared first on Extrimian.


YeshID

Day 5: Securing your data - Managing and monitoring applications access


Audit all apps, determine which are trusted and then limit scopes for all other applications.

Why?

One of the largest surface areas for data to be unknowingly leaked is an unmanaged application connected through OAuth in Google Workspace. OAuth is a widely used protocol for granting access to resources. When a user grants an application access to their Google Workspace data through OAuth, they may not be fully aware of the scope of access they are giving to the application, and the application may have access to more data than the user intended. Additionally, if the application is not properly secured or is malicious, it may misuse the access it has been granted and potentially leak the user's data.

Guidance

You will first want to audit all connected apps, determine which ones are still needed, mark them as trusted, and prevent all other applications from accessing any high-risk/restricted scopes. Before restricting scopes, communicate with your employees about the change and provide a documented path for having new connections approved.

Reviewing Apps

Head to Security > Access and Data Control > API Controls and review your third-party API app access list. This will allow you to view every app in your organization in addition to what scopes have been granted.
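If you would rather pull this list programmatically (to drop it into a spreadsheet or ticketing system, say), the Admin SDK Directory API exposes the same per-user OAuth grants. Below is a minimal, hypothetical sketch in Python; the service account key file, admin address and domain-wide delegation setup are placeholders of ours, not something this guide prescribes.

# Hypothetical sketch: list OAuth grants per user via the Admin SDK Directory API.
# Assumes a service account with domain-wide delegation and read-only admin scopes.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = [
    "https://www.googleapis.com/auth/admin.directory.user.readonly",
    "https://www.googleapis.com/auth/admin.directory.user.security",
]
creds = service_account.Credentials.from_service_account_file(
    "sa-key.json", scopes=SCOPES          # placeholder key file
).with_subject("admin@example.com")       # placeholder admin to impersonate

directory = build("admin", "directory_v1", credentials=creds)

users = directory.users().list(customer="my_customer", maxResults=100).execute()
for user in users.get("users", []):
    email = user["primaryEmail"]
    tokens = directory.tokens().list(userKey=email).execute()
    for token in tokens.get("items", []):
        # displayText is the app name; scopes shows what data it can reach
        print(email, token.get("displayText"), token.get("scopes"))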

Set Apps As Trusted

Look to see if there is an internal list of applications that have been approved. If they are allowed to access data from Google, mark them as Trusted. This will prevent them from being blocked if and when you move to block access for untrusted apps.

Set Access Levels for APIs

Note: Setting access levels will start restricting access. This will prevent employees (including yourself!) from connecting apps to Google that request any scopes you restrict.

Under Security > Access and Data Control > API Controls > Google Services - View List, set the access levels that you want to use for each application. For example, if you store proprietary data in Google Drive, set Drive to be a Restricted App so that only trusted apps can authenticate with Drive scopes.

We recommend restricting the following:

Drive
Mail
Calendar
Google Workspace Admin
Vault
Groups

Once complete, add a custom message under Settings to give users a meaningful error and path forward - include a link to your IT helpdesk contact information.

On the main API Controls page you can optionally restrict all third-party API access, but this will also block sign-in scopes. If you would prefer to have awareness of all apps using Google to sign in, this may be the best path.

In Conclusion

We hope that these tips have helped to demystify where to start with securing your Google Workspace environment! Remember, Rome was not built in a day. Start with an initial audit, explore your environment, and do the appropriate amount of testing and planning to make these changes a success. While some hurdles may be a challenge, it is far easier to lay the groundwork for security best practices when you are a smaller company versus when you reach hundreds, or even thousands, of employees. There is no time like the present to gift yourself some peace of mind.

Here’s to a year of good health, good company, and an even better Google Workspace setup than the year before!


KYC Chain

10 Key KYC Considerations for Crypto Launchpads

Crypto Launchpads are an integral component of the growing crypto industry and are also vulnerable to the range of dangers facing the broader industry. By ensuring that all users have been properly vetted through robust KYC processes, Crypto Launchpads will be able to protect themselves from potential fraud, regulatory sanctions and fines further along the line.

Metadium

Metadium’s new partnership with Pangea Inc. promises further expansion in the metaverse


Dear Metadium community,

We are happy to announce our first partnership of 2023: Pangea Inc. Through this partnership, both companies will cooperate in developing new business opportunities in the Metaverse, including NFT and SBT projects.

Pangea Inc. has worked on multiple blockchain projects, including an artwork NFT platform, a K-content platform, and a P2E project. Pangea’s most recent project is SBT.PIZZA, a service that helps brands, institutions, and influencers to easily issue NFTs to consumers and fans and build strong communities. Via SBTs, the receivers can authenticate or commemorate experiences related to the issuers.

SBT.PIZZA is currently collaborating with Yonsei University to issue SBTs to students who satisfy qualifications like attending specific lectures, seminars, or activities.

With this partnership, Metadium will continue expanding its reach in the metaverse by adding more projects and uses of its technology.

NFTs and SBTs?

SBT stands for soulbound token, a publicly verifiable and non-transferable non-fungible token (NFT) that represents an individual’s credentials, affiliations, and commitments. SBTs cannot be bought or sold. They have to be earned.

Vitalik Buterin first introduced the concept in a paper titled “Decentralized Society: Finding Web3’s Soul,” where he defined SBTs as NFTs that are “publicly visible, non-transferable (but possibly revocable-by-the-issuer)” and the wallets that hold these SBTs as Souls.

The Metadium team will soon release an article where we go over SBTs in detail and how they compare with DIDs and VCs.

Metadium and Pangea to cooperate on new metaverse business development

Hello, this is the Metadium team.

We are delighted to announce Metadium’s first partnership of 2023. Through this partnership, Metadium and Pangea will cooperate to develop new metaverse businesses, including NFT and SBT projects.

Pangea has participated in multiple blockchain projects, including an art NFT platform, a K-content platform, and a P2E project. Pangea’s most recent project is SBT.PIZZA, a service that helps brands, institutions, and influencers easily issue NFTs to consumers and fans and build strong communities. Through SBTs, holders can authenticate or commemorate experiences related to the issuer.

SBT.PIZZA is currently working with Yonsei University to issue SBTs to students who meet qualifications such as attending specific lectures, seminars, or activities.

Building on this partnership, Metadium will continue to expand its presence in the metaverse by adding more projects and applications of its technology.

NFTs and SBTs?

SBT stands for Soulbound Token: a publicly verifiable, non-transferable NFT (Non-Fungible Token) that represents an individual’s credentials, affiliations, and achievements. SBTs cannot be transferred or traded; once earned, they are bound to the holder’s wallet.

The concept was first introduced in Vitalik Buterin’s paper “Decentralized Society: Finding Web3’s Soul,” in which he defined SBTs as NFTs that are “publicly visible, non-transferable (but possibly revocable-by-the-issuer)” and called the wallets that hold them Souls.

The Metadium team will soon publish an article covering SBTs in detail and how they compare with DIDs and VCs.

Metadium’s new partnership with Pangea Inc. promises further expansion in the metaverse was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


FindBiometrics

TECH5’s Biometric Tech Supports Remote Voting in Oman

TECH5’s biometric technology played an important role in municipal elections in Oman, thanks to the company’s strategic partnership with digital identity verification startup Uqudo. Based in Dubai, Uqudo offers MEA countries a […]

Thursday, 02. February 2023

FindBiometrics

You Have Five Years to File Your BIPA Lawsuit: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Illinois Supreme Court Sets Five-Year Filing Window […]

Entrust

Automating security for Linux servers and applications


Innovation, speed, and scale are the fundamental building blocks for success in tech today. Automation solutions help businesses implement these core principles and enable them to fulfill rapidly evolving customer needs across different ecosystems.

In a Linux ecosystem, bash scripts consisting of a series of shell commands help automate repeated tasks. Bash, a command language interpreter, is used in an array of applications across the Linux ecosystem. Combined with the cron job scheduler, these scripts are often used in DevOps to build business workflows and processes.

You can now utilize these bash scripts to also integrate PKI-based security into your workflows and protect your servers and applications. Digital security ecosystems have evolved to ensure DevOps can leverage their existing knowledge to collaborate with the new tools and make implementing security a smooth, scalable process that matches their speed.

Until now, PKI expertise was a necessary prerequisite for successfully securing applications and servers, making developers wary of integrating security measures in their DevOps builds/processes. Today, it is possible to leverage out-of-the-box tools developed by PKI experts to implement security end-to-end to protect your business.

Entrust CA Gateway (CAGW) Command Line Interface (CLI) tool is one such tool that can be used to automate security for Linux servers and applications. It uses CAGW’s REST APIs to automate certificate and key generation, issuance, and lifecycle management to secure your endpoints at the time of deployment itself. Using this CLI tool it is possible to automate various complex cryptographic operations and integrate security into existing workflows and processes. It is an interactive tool that can be modified based on your requirements.

The CAGW CLI tool can automate the execution of these tasks:

Generating the Certificate Signing Request (CSR) with subject using OpenSSL
Listing various certificate authorities (CAs) in the system
Listing all profiles for a CA
Requesting and enrolling new certificates with CSR
Certificate revocation by serial number
Bulk certificate issuance
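As an illustration of the first item above, here is a minimal sketch of generating a key pair and a CSR with a subject. The published scripts do this with OpenSSL on the command line; this sketch uses Python’s cryptography package as an equivalent, and the subject values and file names are placeholders of ours.

# Sketch: generate an RSA key and a CSR with a subject,
# roughly equivalent to an "openssl req -new" invocation.
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Org"),  # placeholder
        x509.NameAttribute(NameOID.COMMON_NAME, "app01.example.com"),  # placeholder
    ]))
    .sign(key, hashes.SHA256())
)

with open("app01.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("app01.csr", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.PEM))

The resulting CSR can then be submitted to CAGW for enrollment, which is the step the published scripts automate.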

The technical documentation and the bash scripts for the CAGW CLI tool are available on this Entrust GitHub page.

A suite of Entrust PKI products can be leveraged in the DevOps environment to simplify the automation of security essentials. A DevSecOps approach is necessary to ensure your network, servers, and applications are secure at the time of deployment itself. With the threats and attacks modern businesses face, security cannot be an afterthought. And it is now possible to implement security in DevOps without sacrificing the pace of development.

Watch this video to see how you can work with the Entrust CA Gateway CLI Tool and integrate it with your existing workflows. Learn how to automate certificate and key requisition, issuance, and lifecycle management to secure your Linux servers and applications the instant they are deployed.

The post Automating security for Linux servers and applications appeared first on Entrust Blog.


Using load balancers to automate security and mitigate the network impact


Modern businesses need to adopt a culture of agile collaboration to deliver results and exceed market expectations. DevOps enables organizations to innovate at speed, collaborate efficiently, and deliver results faster.

While helping organizations build better, DevOps often sidesteps implementing security in its need for speed. Traditionally, security has been the complex component that would hit the brakes in the DevOps race. But today, security solutions can match the speed and flexibility needs of DevOps and protect businesses at the same time.

Entrust automation solutions enable DevOps to implement security from day one with scalable solutions that are easy to integrate, iterate, and automate. Entrust Certificate Authority Gateway (CAGW) provides a RESTful interface to access multiple CAs like Entrust – public and private, Microsoft CA, and AWS for automating certificate provisioning, issuance, deployment, and lifecycle management.

Certificate and key issuance are the first steps in security. But to secure an application end-to-end you need complete visibility and control over the various keys, certificates, and identities in use. Entrust automation solutions like Certificate Hub and CA Gateway provide the speed and scalability with an automation layer enabling DevOps to leverage best security practices without slowing them down.

TLS/SSL certificates and cryptographic keys establish trust in identities and communications. But TLS/SSL encryption/decryption at high volumes can negatively impact a web server’s performance. TLS/SSL offloading is a process by which this encryption/decryption process is moved from a web server to a separate device earmarked for processing TLS/SSL.

F5 BIG-IP is a popular hardware- or software-based local traffic manager that offers a high-performance network load balancer with blazing-fast TLS/SSL offloading and TLS/SSL inspection to secure data end-to-end between clients and servers. To leverage F5’s network capabilities, organizations need to provision and manage certificates and cryptographic keys for the numerous clients and servers connected to the load balancer.

Entrust CA Gateway Ansible Module enables users to fully automate security implementation in the F5 ecosystem. This module helps to automatically implement the TLS/SSL offloading policies, from certificate issuance to certificate deployment at the desired endpoints. Together with other Entrust services like Certificate Hub, it provides users complete visibility, real-time monitoring, and control of the certificates and keys in their network, enabling them to mitigate business disruptions.

Watch this video to learn how you can configure and implement the Entrust CA Gateway Ansible module and integrate security successfully in your existing workflows and processes.

The post Using load balancers to automate security and mitigate the network impact appeared first on Entrust Blog.


Ocean Protocol

Capitalize with Ocean Protocol: A Predict ETH Tutorial

Our ongoing Predict ETH Challenge →http://bit.ly/3XrXzHu

“How do I predict future asset prices?” is the multi-billion dollar question on every trader’s mind. Artificial intelligence can help to accurately predict asset prices. Indeed, the most robust predictive trading algorithms use machine learning (ML) techniques. On the optimistic side, algorithmically trading assets with predictive ML models can yield enormous gains à la Renaissance Technologies… Yet algorithmic trading gone awry can yield enormous losses as in the latest FTX scandal.

If you code an accurate price prediction model, then the next logical step is to trade with it. Sounds easy enough, right? But putting your faith in a trading algorithm is no easy task. Not only does it take significant coding & financial prowess to avoid (massive) trading errors, but it takes quite the emotional leap of faith too.

Don’t have the stomach for risk? Maybe you want 0% downside. Lucky for you, there’s a solution for that. Enter, Ocean Protocol.

Ocean Protocol’s open-source tools enable you to sell your algorithms without risk of others stealing your trade secrets. How?

This is a tutorial! We show, don’t tell ;)

Here we’ll teach you from start to finish how to create an algorithm for predicting ETH crypto prices and then upload your algorithm privately to the Ocean Market.

By the way, this is a purely educational tutorial — we’re not financial advisors! We authors give no warranty about the performance of this algorithm, nor should anything in this article be considered financial advice.

Moving on to the fun stuff…

Setting up our environment

First, we’ll set up our environment with a Prophet machine learning model to forecast prices. Easy peasy.

Profit with ‘Prophet’

Prophet is a forecasting algorithm originally developed by Facebook that uses additive models. In other words, Prophet sums several different functions that each fit pieces of a historical dataset to create one complete model for predicting future values. This forecasting model can be used for any kind of time-series data, but in this example we will use it to predict ETH crypto prices.

‘Explain Prophet’s additive models to me like I’m five years old.’

Prophet sums: 1) a combination of trend functions (including non-linear ones) with 2) seasonal inputs (including yearly, monthly, weekly and daily seasonalities) and 3) holiday effects (customizable to particular regions!). The result is a practical, real-world model that’s much more sophisticated than simple linear extrapolation. Pretty cool, no?
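In symbols (this is the decomposition described in Prophet’s own documentation), the forecast at time t is the sum

y(t) = g(t) + s(t) + h(t) + ε(t)

where g(t) is the trend, s(t) the periodic (daily/weekly/yearly) seasonality, h(t) the holiday effects and ε(t) the leftover error.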

Prophet is implemented in Python, a widely used programming language for machine learning and artificial intelligence. It’s also open-source!

Find the full Prophet source code at https://facebook.github.io/prophet/

Installing Prophet

To install Prophet, you can use the pip command in your Terminal/Git Bash or an equivalent command based on your computer’s package-management system. We’ll install with pip here for ease of use with Python:

$ python -m pip install prophet

That’s it! Nothing complicated here.

If you have a fickle environment and encounter issues while installing, we’re here to help! Chat with the Ocean Protocol core team on our Discord with any questions!

Your Crystal Ball: Or how to make price predictions

All the code for creating the algorithm to predict ETH is here in a Jupyter notebook.

It’s time to use machine learning to forecast prices. We’ll be implementing a basic ML pipeline to forecast future prices for the Ethereum-Tether (ETH/USDT) cryptocurrency pair.

What is an ML pipeline? A machine learning pipeline usually consists of these steps:

Data gathering and processing
Machine learning modeling
Model evaluation
Model inference

Let’s collect and transform ETH price data from a reliable source into a format that we can use with our Prophet model.

ETH/USDT prices data can be downloaded from several places, but in this tutorial we’ll use Binance API for a simple solution to getting hourly ETH/USDT prices.

In your terminal, start the Python console.

$ python

Then, enter the following commands, each on a new line:

>>> import requests
>>> import pandas as pd
>>> url = "https://api.binance.com/api/v3/klines?symbol=ETHUSDT&interval=1h&limit=1000"
>>> r = requests.get(url)
>>> cex_x = r.json()

Note: The Binance API only works outside of the U.S. American users will need to either run a VPN or use a different data source.

In the snippet above, cex_x is a list of hourly data points, one for every hour, on the hour (the limit parameter in the URL caps it at 1,000). Each data point starts with these 6 values: (0) Timestamp (1) Open price (2) High price (3) Low price (4) Close price (5) Volume. We will use the hourly “Close price” to make our price predictions.

More information regarding the Binance API is available in their documentation.

Initially, the timestamp is a Unix timestamp in milliseconds. We need to convert that Unix timestamp into a Python datetime object. We’ll convert the Unix time interval from milliseconds to seconds, then use a loop to convert the Unix timestamp into a list of Python datetime objects:

>>> from datetime import datetime
>>> uts = [xi[0]/1000 for xi in cex_x]
>>> ts_obj = [datetime.utcfromtimestamp(s) for s in uts]

The variable ts_obj contains the list of Python datetime objects. Remember, these datetime objects correspond to the Close prices of ETH/USDT coin pair on hourly intervals, but are not the Close prices themselves… We need to create a Python dataframe containing both the hourly timestamps and their Close prices.

Let’s create the Python Pandas dataframe using the first two lines of code below. Then, we will split the dataframe into a training set and a test set using the third and fourth lines of code below.

What is a training set and a test set? A training set is the data fed into the ML algorithm to model the behavior of the data for generating a forecast. A test set is the data your ML algorithm uses to see if its forecast is accurate. If the forecast isn’t accurate, then the algorithm revises its model. Thus, the machine “learns” from the test data.

We’ll use most of the hourly Close price data as our training set, but we’ll save the last 12 hours of Close prices to test our model.

>>> cls_price = [float(xi[4]) for xi in cex_x]
>>> dataset = pd.DataFrame({"ds": ts_obj, "y": cls_price})
>>> train = dataset.iloc[0:-12, :]
>>> test = dataset.iloc[-12:, :]

‘Fit’ for a model

The first step to using Prophet is instantiating a Prophet object. We create a Prophet object called model in the first two steps below. Then, we just call the fit function on our training set to create our forecasting model.

>>> from prophet import Prophet
>>> model = Prophet()
>>> model.fit(train)

Et voilà, our Prophet model is created and we can start to predict hourly ETH/USDT crypto prices!

But how accurate is our model, really?

Cross Validation Testing

One way to significantly improve our ML model’s accuracy is by using cross validation. Cross validation will help us with two things: 1) selecting the right additive functions that make up the model and 2) making sure that the model doesn’t fit the training data (and its noise) too closely.

Additive functions individually model pieces of the data’s behavior and are summed to create one unified model for the data. Noise is the small, high-frequency fluctuation in ETH/USDT prices over time. If our Prophet model is fitted too closely to our data, including all that noise, its forecasted prices will likely deviate from the general trends in the data, because fitting the noise adds extra additive functions to the model.

What we need is a way to smooth out our Prophet model’s functions to eliminate some of the noise and predict more generally the direction of future ETH/USDT prices.
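One concrete way to do that smoothing (our own illustration, not a step spelled out in the original notebook) is to lower Prophet’s changepoint prior when instantiating the model, which makes the fitted trend stiffer and less prone to chasing hour-to-hour noise:

>>> # Sketch: a stiffer trend via a smaller changepoint prior (Prophet's default is 0.05)
>>> smooth_model = Prophet(changepoint_prior_scale=0.01)
>>> smooth_model.fit(train)

Cross validation, described next, is how you check whether a setting like this actually helps rather than hurts.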

How does cross validation work?

Cross validation takes the training set and splits it into chunks for re-training and testing the model. Starting from an initial time in the past, each chunk has an equivalent timespan called a Horizon to evenly cross validate. The very first horizon is used to train the model, then the second horizon is used to test your model. This pattern repeats in chunks until a final Cutoff time in the training set.

The ML algorithm tests its model one last time on the data after the cutoff. Then the error is computed and the ML model does its final tuning.

All that to say, we can cross-validate with just a few elegant lines of code:

>>> from prophet.diagnostics import cross_validation
>>> df_cv = cross_validation(model, initial='30 days', period='1 hour', horizon='12 hours')

Computing the Mean Squared Error (MSE) is just as easy:

>>> from prophet.diagnostics import performance_metrics
>>> df_p = performance_metrics(df_cv)

You can see in the chart below that the MSE of the prediction increases as we try to predict further in time, which makes sense because uncertainty increases over time.
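The chart itself isn’t reproduced here, but as a rough sketch (assuming matplotlib is installed; this snippet is not part of the original post), you could plot MSE against the forecast horizon from the performance metrics dataframe like this:

>>> import matplotlib.pyplot as plt
>>> plt.plot(df_p["horizon"].dt.total_seconds() / 3600, df_p["mse"])  # horizon is a timedelta column
>>> plt.xlabel("Forecast horizon (hours)")
>>> plt.ylabel("MSE")
>>> plt.show()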

Now that we have a working and cross validated ETH price prediction model, let’s see how closely it modeled the data overall using the NMSE metric.

How good are we at predicting future ETH prices?

We’ll calculate the Normalized Mean Squared Error (NMSE) to assess the accuracy of our cross validated Prophet model. Why? NMSE is the de facto measurement for comparing the accuracy between various ML models, so it will give us a good sense of how we stack up with the competition. Let’s compute the NMSE:

>>> import numpy as np
>>> forecast = model.predict(test)
>>> mse_xy = np.sum(np.square(np.asarray(test["y"]) - np.asarray(forecast["yhat"])))
>>> mse_x = np.sum(np.square(np.asarray(test["y"])))
>>> nmse = mse_xy / mse_x

In this code, the variable nmse measures the model’s error on the unseen test set. The following chart shows a 12 hour ETH price prediction. The dotted line divides the historical data used for training from the unseen testing data.
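That chart isn’t embedded here either, but a minimal, illustrative way to recreate something like it (again assuming matplotlib; not part of the original post) would be:

>>> import matplotlib.pyplot as plt
>>> plt.plot(train["ds"], train["y"], label="training data")
>>> plt.plot(test["ds"], test["y"], label="actual (test)")
>>> plt.plot(test["ds"], forecast["yhat"], label="forecast")
>>> plt.axvline(train["ds"].iloc[-1], linestyle="--")  # divides training data from unseen test data
>>> plt.legend()
>>> plt.show()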

And indeed, things are on the up and up 🙂 It’s time to monetize our trade secret ETH price prediction algorithm…

Ocean Protocol

Ocean Protocol builds open-source tools to monetize datasets and algorithms on the blockchain. Publishers of datasets and algorithms can use Ocean Protocol tools to mint their own data NFTs (including metadata) and grant access to these NFTs using data tokens.

You can learn more about Ocean Protocol in the Ocean Academy.

In this tutorial, we’ll create a data NFT to store our ETH price prediction algorithm and grant access to it for paying consumers.

Installing Ocean is just one command:

$ python -m pip install ocean-lib

If you encounter issues, ping us on Discord and the Ocean team will help you! Our official docs are also a reference.

Let’s set up our environment by creating an Ocean object on our blockchain of choice. We’ll use Polygon in this example for its low gas fees.

>>> from ocean_lib.ocean.ocean import Ocean
>>> from ocean_lib.web3_internal.utils import connect_to_network
>>> from ocean_lib.example_config import get_config_dict
>>> connect_to_network("polygon")
>>> ocean = Ocean(get_config_dict())

Next, we have to connect our wallet (Alice represents the owner of the wallet in the steps below).

>>> # Assuming the private key comes from env var
>>> import os
>>> from brownie.network import accounts
>>> alice_private_key = os.getenv('REMOTE_TEST_PRIVATE_KEY1')
>>> accounts.clear()
>>> alice_wallet = accounts.add(alice_private_key)

Now we’re ready for compute-to-data. Drumroll, please!

Compute-to-Data

Compute-to-data (C2D) is a functionality within the Ocean tech stack that allows data consumers to buy the results of computation on a dataset (i.e. AI model or other data outputs) rather than purchasing the dataset directly.

In this case, data publishers can upload an algorithm to the Ocean Marketplace by creating an asset through the Ocean Library. The output of this algorithm can then be sold to other users in the marketplace. In other words, data and algorithm owners can monetize this IP while preserving the privacy of the data contents.

First, the algorithm should be uploaded to a repository — Github for example. Then, the URL from the algorithm is used in the following way to create the asset in the marketplace:

>>> name = "ETH price prediction algorithm"  # any human-readable name for your asset
>>> ALGO_url = "https://raw.githubusercontent.com/<repository>/<file>.py"
>>> (ALGO_data_nft, ALGO_datatoken, ALGO_ddo) = ocean.assets.create_algo_asset(name, ALGO_url, alice_wallet, wait_for_aqua=True)
>>> print(f"ALGO_ddo did = '{ALGO_ddo.did}'")

The URL of the algorithm is only visible to Ocean Protocol’s technology; it is encrypted and hidden from the users who buy the asset. The DID printed above is the identifier for the asset created in the marketplace.

We’re done! The output DID is your algorithm asset’s digital identifier that proves it exists now on the Polygon blockchain. This asset can now be bought on the Ocean Market to compute on data and output those predictions to purchasers.

Congratulations!

Conclusion

We trained a forecasting ML model to make predictions about the future values of the ETH/USDT crypto pair. We explored how to transform algorithms into sellable assets using the Compute-to-Data (C2D) feature. The end result? Those who follow this tutorial can sell predictions for ETH/USDT for any point in time through the Ocean Market using C2D, allowing you to monetize your IP. This example is only one of the many possible applications of the Ocean Protocol. To find out more, visit Ocean Academy!

Capitalize with Ocean Protocol: A Predict ETH Tutorial was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identosphere Identity Highlights

Summary 119: Attacking SSI • Global Supply Chain with Knowledge Graphs • JOSE WG Reanimated

Who's Hiring? Who's Funding? What Who's Who thinks about What and Why! Upcoming events, recordings from previous events! 2 years going strong, the original SSI newsletter. Support us on Patreon!!!
Welcome to Identosphere • We Gather, You Read

Please contribute to our efforts via PayPal or Patreon! We’ll keep aggregating industry info. Be sure your blog has an RSS feed so we can better track and feature your content! Send us a message at newsletter@identosphere.net if you have an RSS feed you want us to track for this digest.

Upcoming

GS1 Global Forum 2/13-16

Heroes of Data Privacy – the practice-oriented data privacy conference in Vienna LionsGate 5/24 Vienna, Austria

DID:Day 3/1 around ETHDenver

APAC Digital Identity unConference 3/1-3 Bangkok, Thailand

Thoughtful Biometrics Workshop virtual unConference 3/13-17 [Registration]

Internet Identity Workshop 4/18-20, Mountain View, CA

Curious about ethical use of personal data? Meet MyData Awards 2023 MyData

MyData Awards 2023 call is now open to recognise and celebrate human-centric services that put the individual at the centre of digital solutions.

Hiring

Learn how to bring a responsible data approach to your work with our cohort-learning programme Engine Room

Attacking SSI

Decentralized Identity Attack Surface – Part 2 Shaked Reiner

Infominer thinks it's incredible that this is our first article detailing the attack surface of SSI. Been wishing for something like this... Need more!

This is the second part of our Decentralized Identity (DID) blog series. In case you’re not familiar with DID concepts, we highly encourage you to start with the first part. This time we will cover a different DID implementation — Sovrin. We will also see what a critical (CVSS 10) DID vulnerability looks like by reviewing the one we found in this popular implementation.

Enterprise Key Takeaways From CIAM Survey Ping Identity

Ping Identity strives to understand the challenges consumers face and the expectations they have for the brands they interact with. The Ping Identity CIAM Survey titled “The Balancing Act: Earning Trust Through Convenience and Security” has given us a lot of information about consumers’ relationship with their identity.

Supply Chain

See the Global Supply Chain with Knowledge Graphs and UN Web Semantics Transmute

This article was based on Transmute Solutions Architect Nis Jespersen’s ‘UN/CEFACT Linked Data’ presentation from December 2022. Leading the UN Web Vocabulary project, I presented at the December 2022 UN/CEFACT Forum. 

When E-Commerce and In-Store Collide: What You Need to Know to Stay Relevant Next Level Supply Chain Podcast with GS1

Your customers want more information – and that could include your product catalog. Join us as we chat with Mike Massey, CEO at Locally, a business that gives access to real-time inventory to nearby shoppers using e-commerce tactics

Trusted and extensible product master data hub based on the OriginTrail Decentralized Knowledge Graph Origin Trail

Trusted and extensible product master data hub based on the OriginTrail Decentralized Knowledge Graph and GS1 standards Product master data — the foundation for supply chain visibility

Public Sector

[youtube] Building on EBSI [live event]

Mobile Driving Licence (mDL): Exploring ISO 18013-5&7 Andrew Hughes, Northern Block

“No, I don’t trust you” - Implementing Zero-Trust Architecture in the world of Self-Sovereign Identity (SSI) Trust Over IP

In considering how the public sector should approach SSI, we examine the efforts of the US federal government to adopt zero trust architecture (ZTA) and its implications for using decentralized identity. As stated in federal guidance, “The foundational tenet of the Zero Trust Model is that no actor, system, network, or service operating outside or within the security perimeter is trusted.”

Explainer

The Basics of DIDs ep.1 VESSI Labs and Kouki Minamoto

Gear Up for the Future with Decentralized Identity KuppingerCole, by Warwick Ashford

3 Key Considerations in Your Passwordless Journey 1Kosmos

Going Passwordless – Separating Identity and Authentication Kuppinger Cole, by Alejandro Leal

Personal Data

Google and Facebook: Steam Engines of the Information Age MyDEX - Steam had its day. Then the world moved on.

What has data got to do with net zero? MyDEX - At first glance it may not seem that data, especially personal data, has much to do with climate change.

Data Privacy Day is the time to ramp up your Board-level cybersecurity expertise Entrust

Use Cases

Future State: Digital Credentials for Healthcare Spruce - Digital credentials and self-sovereign identity have the potential to revolutionize the healthcare industry by providing a more secure and efficient way to store and share important health information.

In God We Trust: Everything Else, we VERIFY #3TB - Are these cases for better crime detection? Or just the correct use cases for verifiable credentials?

Employee Verification Simple and Secure Indicio

A Digital Identity Stack to Improve Privacy in the IoT - Lockstep S. Wilson, N. Moustafa and E. Sitnikova,

Policy

Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead FIDO Alliance - 2023 brings a new year and a new Congress – but America is still struggling with many of the same old problems when it comes to digital identity and authentication.

Recap: 2023 Identity, Authentication and the Road Ahead #IDPolicyForum FIDO Alliance - tremendous transformation in 2023 as lawmakers and regulators alike struggle to help protect individual privacy and improve access to services 

“Always bet on good regulation.” IDnow - the era of black-market operators is coming to an end, and which gambling trends to bet on in 2023.

iGrant.io Your Data, Your Choice

The Swedish Data Protection Authority (IMY) decided a case involving Nordax Bank AB following a complaint. The context is that Nordax initially failed to comply with requests from a complainant for access and erasure under Art. 15 and 17 GDPR, on the grounds that the company does not process or store the complainant's personal data.

Standards

JFF (Jobs for the Future) Plugfest 2 ValidatedID

JFF (Jobs for the Future) and W3C aim to build an ecosystem for verifiable credentials wallets regarding learning and employment, and Verifiable Credential (VC) issuers and Decentralized Identifiers (DIDs).

[github] did:pkh Method Specification  Wayne Chang, Charles Lehner, Juan Caballero, Joel Thorstensson

did:pkh is similar in many ways to did:key, except that did:pkh is optimized for identifiers derived from hashes of public keys according to well-known algorithms (commonly referred to as "public key hashes", because, in most cases, they are a public key hashed according to a standard hash function).

Unifying Trust Registries and Trust Lists to Answer the Question of “Who Can You Trust?” Indicio

Whose verifiable credentials are trustworthy in any given use case? Two solutions to this challenge — Trust Registries and Trust Lists — have emerged from two different organizations.

Organization

WAO! We’re almost seven We Are Open

JSON Object Signing and Encryption (JOSE) Working Group Reanimated Mike Jones

Energy Web joins the OPENTUNITY consortium to open electricity ecosystems to decarbonize European grids Energy Web

Identity Insights - First Ever Hyperledger Aries Bifold Summit James Ebert

NADA Joins MOBI to Accelerate Zero Trust Innovations for Information Security and Business Automation MOBI

Hyperledger Mentorship Spotlight: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab Hyperledger Ursa

Company Stories

Get Your DID: Now You Can Pay with KILT and PayPal KILT

KILT Website Features Enterprise and Consumer Onramps to DIDs KILT

Credential Engine CEO Scott Cheney joins Velocity board Velocity Network

Five Verifiable Data Predictions for 2023 Indicio

Experience Secure and Private Communication With DaaS (DID-as-a-Service) InfinitySwap

Wider Team had a verifiable 2022 Wider Team

Our small band of strategy consultants gave back to our professional communities in 2022. Here's the recap of our digital identity, ethics, manufacturing, and supply chain talks, papers, standards work, and workshops.

Indicio Wins a New Government of British Columbia Code With Us Challenge Indicio

Thanks to winning one of the latest challenges, Indicio will upgrade the Hyperledger Indy SDK in all Hyperledger Indy and Aries software. This includes updating all BC Gov Aries Cloud Agent Python (ACA-Py) installations still using the Indy SDK to use Aries Askar

The YeshID kitchen: where security and usability meet YeshID

Our chefs are experienced. They have tasted good dishes and bad ones. They experiment and innovate mixing old ingredients and new, and delight when they create something delicious that no one has ever seen, tasted, or smelled before.

DWeb

Mastodon for Developers: Everything You Need to Know auth0 - Learn how to use Mastodon effectively as a developer.

The Rise of Decentralized Social Networks with Farcaster’s Dan Romero bankless

Activity Streams graphical model reb00ted

[tweet thread] Reimagining the Social Landscape | State of the DAOs BanklessWriters

In this thread, you will discover the difference between the web3 social landscape and that of web2 [...] The social landscape needs to change. This is where the web3 social landscape comes in. It gives us hope as two of the biggest opportunities in web3 are to change the way communities are formed and the platform we use to come together.

For Sale

[tweet] Domain name http://atprotocol.com is for sale! NameOnline

Web 3

What are LSDs? Defiant - Liquid Staking Derivatives! LSDs.

ConsenSys Launches MetaMask Learn — The Next Step in Democratizing Web3 ConsenSys

Web3 Is About Creating the Infrastructure for Digital Transformation Indicio

Converge22 Recap: Next Gen User Experiences for Web3 Circle

Announcing did:day - An Exploration of Decentralized Identity at ETHDenver BUIDLWeek SpruceID

An Exploration of Decentralized Identity at ETHDenver BUIDLWeek. Web3 has enabled countless users to take control of their financial assets across the web, and we aim to take this a step further - allowing users to control their identity and data.  

SSX Product Update - Optimization Updates, New Features, and More Spruce Systems

We launched SSX in November to provide developers with the easiest way to integrate Sign-In with Ethereum. We are continuously working on a positive developer experience, and additional features to enable builders to work with emerging decentralized identity paradigms.

TezID 💖 Altme

Our collaboration with Altme allows for the integration of SSI verifiable credentials on the Tezos blockchain through TezID. This feature enables Tezos applications to verify user information directly on-chain, improving the user experience, privacy and enabling more customized experiences and services.

Thanks for Reading

Read more \ Subscribe: newsletter.identosphere.net

Please support our efforts by Patreon or Paypal

Contact \ Submission: newsletter [at] identosphere [dot] net


Spruce Systems

Future State: Consumer Data Online

Self-sovereign identity gives users the ability to store and manage their own identity and data online. This new identity layer of the internet will fundamentally shift the relationship users have with the applications and services they interact with. When users access websites, they can provision access to their self-managed data, rather than relying on the company that operates said website to store and manage their personal information.

For example, consider Bob, an everyday person who uses an ever-increasing number of different online applications and services for various purposes. If you're a human in the modern age reading this, you are probably like Bob in many ways. In the current model for identity on the internet, Bob's online data is stored in siloes across all the different servers of the companies he has accounts with.

Bob does not own this data—if his favorite photo-sharing social media application decides to shut down his account, he loses all of his memories stored there and the personal connections he has built over time. In a proposed future state of the internet powered by self-sovereign identity, Bob would own and manage all of his own data, then provision access to each of the applications and services whenever he chooses.

This shift has several benefits for both users and enterprises. For users, self-sovereign identity gives them more control over their personal data. They can choose which data to share and with whom, and they can revoke access at any time. This gives users greater privacy and security, while also improving user experience by allowing for more personalized experiences and reducing the number of unique usernames and passwords that need to be managed.

For enterprises and companies, self-sovereign identity has clear benefits–more personalized experiences for users, lower compliance costs associated with storing data, and reduced risks of data breaches.

Better User Experiences

By allowing users to share the data they choose, enterprises can create more tailored experiences that are relevant and engaging for the user. Providing highly personalized experiences can increase customer conversion rates, while also improving customer satisfaction and retention overall. A recent report from McKinsey found that companies that excel at personalization see 40% higher revenue than their competitors. Measuring performance across US industries, the report estimates that shifting to top-quartile performance in personalization would generate over $1 trillion in value. Current data aggregation practices to inform personalization strategies are incredibly costly, with billion-dollar companies specializing in collecting and selling customer data for personalization and targeted advertising.

With self-sovereign identity, we can enable a system where users themselves can selectively disclose their information for a more customized experience, which can increase conversion and retention rates while decreasing the costs of personalization. This may be a tough pill to swallow for companies earning millions (and billions, in some cases) by tracking and aggregating consumer data. However, leveling the playing field to restore user control of personal information will ultimately create unique branding and memorable experiences in the digital world.

Currently, our online preferences and behaviors are stored in siloes managed by separate companies. There are incredible opportunities unlocked if that data becomes portable. Users can then share a combination of their preferences and other identity attributes within a new application to customize their initial experience immediately. The key to remember here, however, is that the choice to share that information should remain with the user.  

Reducing Compliance Costs

Decentralized identity can also help enterprises and companies reduce their costs of compliance. By storing less personal information of users on their servers, they can reduce the amount of data that needs to be stored and protected. This can lower costs associated with data management, in addition to reducing costs for maintaining compliance with various data protection regulations, like GDPR and CCPA.

GDPR (the General Data Protection Regulation) is a law passed by the EU, effective in May 2018, that enforces guidelines for any organization that targets or collects data related to people in the EU. PWC surveyed 200 C-Suite executives (CIOs, CISOs, CMOs, CPOs, and General Counsels) from US companies with more than 500 employees about their preparations for GDPR compliance in 2017 before the law became effective. The survey results showed that 68% of respondents had budgeted between $1 million and $10 million for GDPR compliance. Nine percent (9%) of the survey respondents expected to spend over $10 million to address GDPR obligations. The fines for violating the GDPR can be as high as the greater of €20 million or 4% of global revenues, plus damages owed to affected individuals seeking compensation.

Other jurisdictions, like the State of California, are following suit, passing their own versions of consumer data protection regulations in an effort to curtail the mass surveillance on citizens by Big Tech. The California Attorney General released an economic impact assessment, which estimated the total cost of initial compliance with the CCPA (California Consumer Privacy Act) would approach a staggering $55 billion. As regulators globally address public concerns about data privacy and processing, the costs for companies maintaining compliance will become even more significant, if they do not reframe their philosophy on the data collection practices at a fundamental level.

A new identity layer of the internet, which allows users to self-opt into disclosing their personal data and manage how their data are shared further, will significantly decrease the overhead and administrative costs associated with managing user data in company servers.

No Data to Breach

In addition to improved user experience and lower costs associated with data storage and compliance, self-sovereign identity can help prevent one of the biggest PR crises a large company can face–a data breach.

By reducing the amount of personally identifiable information (PII) on company servers, the honeypot of large internal databases storing millions of records of user data, like names, addresses, passwords, and credit card details, is eliminated. This means that hackers will be less likely to target companies’ servers, and if they do, there will be less sensitive information for them to steal. This protects companies from the PR nightmare of a data breach, and it also protects consumers by reducing the risk that their private information is compromised and potentially used for nefarious purposes.

The cost of data breaches can be significant for companies. According to a study by IBM, the average cost of a data breach for a company in the United States is $9.44 million, compared to the global average of $4.35 million. This total cost represents expenses such as legal fees, lost business, and damage to the company's reputation. For “mega-breaches” with between 50 to 60 million records affected, the average cost is $387 million.

In addition to the financial cost, data breaches can also lead to negative press and loss of consumer trust. McKinsey surveyed 1,000 consumers in North America and found that 71% of respondents would stop doing business with a company if it mishandled sensitive data. Data breaches can have a significant impact on a company's bottom line, as well as its reputation. This risk is mitigated in a new digital identity paradigm powered by verifiable credentials. If users are able to self-manage their data and provision access to websites only when needed, the data aren’t housed in centralized databases, which act as a honeypot for hackers or other bad actors looking to sell user credentials or data on the dark web.

Self-sovereign identity enabled by verifiable credentials can give users more control of their data online and afford them more customizable and personalized experiences as they interact with companies online, if they choose. Companies can see reduced compliance costs and a reduced risk of data breaches by reducing the amount of consumer data stored on their servers. By adopting self-sovereign identity solutions, companies can protect themselves and their customers from the significant costs and negative press associated with data breaches.

There is a wide range of industries that rely on data storage and record-keeping that can be supercharged with verifiable credentials and similar technologies. We previously wrote a blog post about how digital credentials can be applied within the healthcare industry. We will explore other industries in future blogs to come–so subscribe and follow along on this Verifiable Credentials Future State journey with us!

About Spruce: Spruce is building a future where users control their identity and data across all digital interactions.


Ocean Protocol

Data Farming DF22 Completed, DF23 Started, Reward Function Tuned

Stakers can claim DF22 rewards. DF23 runs Feb 2–9, 2023. The Reward Function has been tuned, including DCV rank.

1. Overview

The Ocean Data Farming program incentivizes the growth of data consume volume (DCV) in the Ocean ecosystem. It rewards OCEAN to stakers who allocate liquidity to curate data assets with high DCV.

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF webapp.

DF Round 22 (DF22) is part of DF Beta. DF22 counting started 12:01 am Feb 2, 2023 and ended 12:01 am Feb 9, 2023. 75K OCEAN worth of rewards were available. LPs can now claim rewards at the DF webapp Claim Portal.

DF23 is part of DF Beta. Counting started 12:01am Feb 2, 2023.

DF23 has one major change planned: the Reward Function (RF) has been updated in two specific ways:

1. DCV-bound rewards at asset level (rather than global)
2. Reward on DCV rank. Rather than have OCEAN rewards go to datasets pro-rata on DCV, rewards go according to DCV rank. This will spread rewards to more data assets.

The rest of this post describes how to claim rewards (section 2), DF23 overview (section 3), and DF23 details focusing on RF update (section 4). The Appendix shares the RF, in code.

2. How To Claim Rewards

As an LP (staker), here’s how to claim rewards:

1. Go to the DF webapp Claim Portal
2. Connect your wallet (rewards are distributed on Ethereum mainnet)
3. Click “Claim”, sign the tx, and collect your rewards

Rewards will accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.


3. DF23 Overview

DF23 is part of DF Beta. DF Beta’s aim is to test the effect of larger incentives, learn, and refine the technology. DF Beta may run 10–20 weeks. In any given week of DF Beta, the total budget may be as low as 10K $OCEAN or as high as 100K $OCEAN.

Some key numbers:

- Total budget is 75,000 $OCEAN. (This is an increase from 50,000 $OCEAN in DF18.)
- 50% of the budget goes to passive rewards (37,500 $OCEAN) — rewarding users who hold veOCEAN (locked OCEAN)
- 50% of the budget goes to active rewards (37,500 $OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).
- Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behaviour. Updates are always announced at the beginning of a round, if not sooner.

DF23’s reward function builds on the prior reward function used in DF9–DF22, which is described in the DF9 launch post. The next section describes the specific updates.

4. Update to Reward Function for DF23

4.1 DCV-bound rewards at asset level

DF9 introduced the constraint to “bound total rewards by total DCV * mult”. The multiplier keeps shrinking, such that by DF29 it’s no longer profitable to wash-consume.

We’re pleased with the outcome of this: it’s incentivized data farmers to drive DCV.

We can make a small change to tune it further. The idea is: rather than bounding total rewards by DCV (global), bound them at the asset level. (Just like RF bounds <= 125% APY at the asset level.)

Here’s the updated macro algorithm for rewards distribution:

1. As usual, distribute rewards pro-rata to each asset on its DCV.
2. Then, for each asset: (a) compute rewards, (b) further bound rewards to the asset by 125% APY and by its DCV * mult.
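To make the per-asset bounding concrete, here is a simplified, illustrative sketch in Python. This is not the production df-py code (that is shared in the Appendix below); the function name and the weekly-yield parameter are placeholders for illustration.

def bound_rewards_for_asset(prorata_share, stake, dcv_usd, ocean_avail, dcv_mult, target_wpy):
    # prorata_share: this asset's share of the reward pool (pro-rata or rank-based)
    # target_wpy: the weekly percentage yield implied by the 125% APY cap
    raw = prorata_share * ocean_avail      # step 1: allocate to the asset
    apy_bound = stake * target_wpy         # step 2a: bound by max APY
    dcv_bound = dcv_usd * dcv_mult         # step 2b: bound by the asset's own DCV
    return min(raw, apy_bound, dcv_bound)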

Benefits:

- People can’t rely on a single actor to do wash consume for everyone; instead they need to do it themselves. This catalyzes more engagement from more people.
- More robust RF, because it’s specific to each asset: generalizing on the previous point, it likely bounds other undesirable behaviors that we haven’t yet identified.
- Conceptually simpler implementation, side-by-side with 125% bounding.

4.2 Reward on DCV rank

DF9 modified the RF such that it first sent rewards to each dataset pro-rata to its DCV, then for a given dataset it sent rewards to stakers based on their stake (veOCEAN allocated to the dataset).

The challenge was: 1–3 datasets could swamp the DCV, and readily get >99% of active rewards. This is a lost opportunity to incentivize others to publish their datasets and get DCV on those.

We can address this issue as follows: rather than send OCEAN to datasets pro-rata on the DCV value itself, send it based on the dataset’s DCV rank. The dataset with the highest DCV (rank 1) gets the most; the second-highest (rank 2) gets the second-most; etc. This will spread rewards to more data assets.

This begs a question: what % of rewards should go to rank 1, to rank 2, etc.? There are many options here. We approached this question in the following fashion. First, we implemented a prototype. Second, we parameterized the problem according to two variables:

- max_n_rank_assets: how many assets get nonzero rewards. E.g. 20, 50, or 100.
- rank_scale_op: scale rank linearly (LIN), logarithmically (LOG), etc.

Then we performed a study to explore the effect of these variables.

The images below explore the effect of max_n_rank_assets, when LIN scaling. As an example, on the left, assets with rank > 100 do not get any rewards. (Assets with DCV = 0 never get rewards, of course.)

Effect of max_n_rank_assets = 100 (left), 50 (middle), and 20 (right). Scaling = LIN.

The images below explore the effect of scaling (LIN vs SQRT vs LOG) when max_n_rank_assets = 100. Going from LIN to SQRT, the dropoff after the top-ranked asset is sharper. Going to LOG, it’s sharper yet. In LIN, the top-ranked asset gets 2% of the OCEAN reward; in SQRT it’s about 2.7%; in LOG it’s about 4.8%.

Effect of scaling: LIN, SQRT, and LOG. max_n_rank_assets = 100

The full study is in these slides.

Using the study as a guide, we chose these final parameters: scaling = LOG, and max_n_rank_assets = 100.
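For intuition only, here is a hypothetical sketch of what a rank-based allocation with LOG scaling and a 100-asset cutoff could look like. It mirrors the _rankBasedAllocate call referenced in the Appendix below, but the body here is illustrative rather than the actual df-py implementation.

import numpy as np

def rank_based_allocate(V_USD, max_n_rank_assets=100):
    # V_USD: 1d array of DCV (in USD) per asset. Returns each asset's share of rewards (sums to 1).
    V = np.asarray(V_USD, dtype=float)
    perc = np.zeros(len(V), dtype=float)
    order = np.argsort(-V)  # index of the highest-DCV asset first (rank 1)
    for rank, j in enumerate(order, start=1):
        if rank > max_n_rank_assets or V[j] == 0.0:
            continue  # assets beyond the cutoff, or with zero DCV, get nothing
        perc[j] = np.log10(1.0 + max_n_rank_assets / rank)  # LOG scaling on rank
    if perc.sum() > 0.0:
        perc /= perc.sum()  # normalize so the shares sum to 1
    return perc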

Conclusion

DF22 has completed. To claim rewards, go to DF webapp Claim Portal.

DF23 begins Feb 2, 2023. It ends Feb 9, 2023 at 12:01am.

DF23 is part of DF Beta. Reward budget has increased to 75K $OCEAN, and active rewards calculations have new tunings: (1) DCV-bound rewards at asset level, (2) Reward on DCV rank.

Appendix: Reward Function, in Code

For thoroughness, we share the RF itself.

Previous blog posts have expressed it in mathematical equations. However, since our code implements it cleanly, we’ll share that here.

Thus, below is the RF, directly from calcrewards.py in the Ocean Protocol df-py repo. Look for “main formula”, then work backwards to the inputs (mostly numpy arrays).

def _calcRewardsUsd(
    S: np.ndarray,
    V_USD: np.ndarray,
    C: np.ndarray,
    DCV_multiplier: float,
    OCEAN_avail: float,
    do_rank: bool,
) -> np.ndarray:
    """
    @arguments
      S -- 2d array of [LP i, chain_nft j] -- stake for each {i,j}, in veOCEAN
      V_USD -- 1d array of [chain_nft j] -- nftvol for each {j}, in USD
      C -- 1d array of [chain_nft j] -- the LP i that created j. -1 if not LP
      DCV_multiplier -- via calcDcvMultiplier(DF_week). Is an arg to help test.
      OCEAN_avail -- amount of rewards available, in OCEAN
      do_rank -- allocate OCEAN to assets by DCV rank, vs pro-rata

    @return
      R -- 2d array of [LP i, chain_nft j] -- rewards denominated in OCEAN
    """
    N_i, N_j = S.shape

    # corner case
    if np.sum(V_USD) == 0.0:
        return np.zeros((N_i, N_j), dtype=float)

    # perc_per_j
    if do_rank:
        perc_per_j = _rankBasedAllocate(V_USD)
    else:
        perc_per_j = V_USD / np.sum(V_USD)

    # compute rewards
    R = np.zeros((N_i, N_j), dtype=float)
    for j in range(N_j):
        stake_j = sum(S[:, j])
        DCV_j = V_USD[j]
        if stake_j == 0.0 or DCV_j == 0.0:
            continue

        for i in range(N_i):
            perc_at_j = perc_per_j[j]

            stake_ij = S[i, j]
            perc_at_ij = stake_ij / stake_j

            # main formula!
            R[i, j] = min(
                perc_at_j * perc_at_ij * OCEAN_avail,
                stake_ij * TARGET_WPY,  # bound rewards by max APY
                DCV_j * DCV_multiplier,  # bound rewards by DCV
            )

    # filter negligible values
    R[R < 0.000001] = 0.0

    if np.sum(R) == 0.0:
        return np.zeros((N_i, N_j), dtype=float)

    # postcondition: nans
    assert not np.isnan(np.min(R)), R

    # postcondition: sum is ok. First check within a tol; shrink if needed
    sum1 = np.sum(R)
    tol = 1e-13
    assert sum1 <= OCEAN_avail * (1 + tol), (sum1, OCEAN_avail, R)
    if sum1 > OCEAN_avail:
        R /= 1 + tol
    sum2 = np.sum(R)
    assert sum2 <= OCEAN_avail * (1 + tol), (sum2, OCEAN_avail, R)

    return R

Further Reading

The Data Farming Series post collects key articles and related resources about DF.

Follow Ocean Protocol on Twitter, Telegram, or GitHub for project announcements. And chat directly with the Ocean community on Discord.

Data Farming DF22 Completed, DF23 Started, Reward Function Tuned was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

The First Ever Hyperledger Aries Bifold Summit

Hyperledger Aries Bifold is a project focused on building decentralized identity wallets for the purposes of receiving, holding, and presenting verifiable credentials. While there are other exciting use cases for the technology down the road, the immediate goal is to provide open-source tools and wallets for the community. 

By James Ebert

The Hyperledger Aries Bifold Users Group recently hosted its first ever summit to discuss technical priorities for the coming year. The most pressing issues were architectural — how to easily contribute and pull changes in from the project and how to better facilitate customization for individual solutions. 

Some of the other important issues raised over the two-day summit included: What does the Bifold community want out of the next Aries interop profile – essentially, what technologies does the community want to be standardized for interoperability? A related question was “where does overlay capture architecture (OCA) fit into the interoperability profile?” This conversation led to several good ideas and is being continued in the Bifold Working Group bi-weekly meetings.

Other questions became action items for additional research, specifically around certain tools and integrations that the group would like to adopt for the project. One such action item concerns Storybook, a tool for UI design and iteration; another concerns Expo, a tool used to simplify the development process.

The most important outcome from the summit is that the project will move toward a focus on a modular, component-based architecture. The goal is to provide building blocks for creating digital wallets, rather than a full application that people adapt to their purposes. This will provide more flexibility to end developers who are building with the software and who already have their own requirements and use cases to meet.

What’s next?

The group will also gradually experiment with implementing this new architecture: the consensus was that we don’t want a full refactor but rather a slower inclusion of more features over time. The next steps are getting some of the basics in place — for example, migrating to using Yarn instead of Node Package Manager (NPM) for package management, and setting up new standards for project configuration and repo management. 

There is certainly plenty still to be done, but I think you could comfortably call the first Hyperledger Aries Bifold Summit a success. There was a great turnout from the community, the important conversations for moving the technology forward happened, and now we have a clear development direction for the group and action items to tackle in 2023.

If you want to get involved with this project, we always welcome new contributors! You can find all the information you need here.

If you are interested in learning more about wallets, you can learn more about the Indicio solution Holdr+ here.

And if you have general questions about the technology or creating your own wallet for personal use you can get in touch with the team of experts at Indicio here.

The post The First Ever Hyperledger Aries Bifold Summit appeared first on Indicio.


FindBiometrics

[UPDATED] Feb 15 Virtual Summit Sessions Announced: Digital ID in Healthcare, Financial Services, Travel

On February 15, 2023, FindBiometrics is presenting “The Road Ahead for Biometrics and Identity” — a full day of panels and interview-style sessions with identity experts covering a broad spectrum of […]

YeshID

Day 4: Standardize on Chrome and Adjusting Settings


We recommend standardizing on Chrome as your internal browser.

Why?

By standardizing on one browser, IT administrators can more easily deploy and manage security policies and updates, and troubleshoot issues. This can save time and resources for IT staff. It can help to improve security by reducing the attack surface and enabling IT to focus on securing one browser instead of multiple. Also, IT staff can be more familiar with the security features, vulnerabilities, and best practices of that browser.

From the compliance perspective, some industries may have regulations or compliance standards that specify which browser must be used; standardizing on one browser can help organizations meet these requirements. And finally, standardizing on one browser can help reduce costs associated with purchasing and maintaining multiple browsers.

Guidance

It can be difficult to get employees to sign in to Chrome with their enterprise accounts in order to apply settings. We recommend first deploying an enrollment token to your managed devices, which will apply Chrome settings to them regardless of whether a user has signed in.

Once some form of browser management is in place, head into Devices > Chrome > Settings > Users & Browsers and set your security baselines. If you’re not sure where to begin, start by taking a look at the CIS Benchmarks for Chrome. Any major changes should first be applied to an OU of testers (such as the IT team) and then expanded out from there.


OWI - State of Identity

The Privacy-Enhancing Technologies We Need Today

On this week's State of Identity podcast, host Cameron D'Ambrosi is joined by ID5 CEO, Mathieu Roche to explain how identity solutions are a means to enforce data protection mechanisms rather than go against them. They present and explain what ID5 does in contrast to the surveillance advertising narrative.



1Kosmos BlockID

How To Use Biometrics with FIDO


Enabling biometrics with FIDO can help create more security for your logins and reduces the risk of login attacks from succeeding.

What is FIDO biometrics? FIDO biometrics are a way to authenticate a user via their face, fingers, or voice; this lends itself to a user having a passwordless experience with their login.

What Are FIDO Passwordless Standards?

One of the more significant cybersecurity threats today is the inherent weakness of passwords. In the earliest days of computing, a simple username and password or PIN combination proved suitable for security.

In modern cybersecurity and authentication, however, the “always on” nature of the Internet and the proliferation of cloud-based apps and mobile devices have shown us that password authentication is weak against several common attacks like phishing or other social engineering approaches.

There has been a massive shift in authentication from common password systems towards multi-factor authentication (MFA) and passwordless solutions. But, as with many innovations, this move introduced several new issues, namely the fragmentation of the implementation and maintenance of these standards.

The Fast IDentity Online (FIDO) Alliance was formed to address two key issues related to strong authentication and adoption:

- Passwordless Authentication: The stated mission of the FIDO Alliance is to reduce our reliance on passwords by shifting technology toward passwordless systems.
- Interoperability: While passwordless authentication is a step towards strong authentication, it won’t provide much benefit if it isn’t used. Therefore, FIDO promotes a shared and open industry standard for adopting passwordless authentication.

To provide the required interoperability sufficient for an industry standard, FIDO developed several specifications on their own and in conjunction with the World Wide Web Consortium (W3C):

- Universal Authentication Framework (UAF): This is the passwordless standard for FIDO. UAF architecture provides a framework through which users can authenticate themselves, without a password, through the exchange of asymmetric encrypted key pairs between a FIDO server and a client application or device with the user (most commonly a phone or tablet).
- Universal 2nd Factor (U2F): U2F is a narrower authentication implementation that allows developers to include a second (or more) factor for authentication. U2F allows MFA over UAF passwordless verification, password systems, or USB verification devices.
- FIDO2: The FIDO2 project results from the FIDO Alliance’s work with the W3C. It combines the Web Authentication API, or WebAuthn (a standardized, secure authentication interface for web applications using public-key cryptography), and the Client to Authenticator Protocol, or CTAP2 (a protocol that connects cryptographic authenticators like USB keys or smartphones with other devices).

While these technologies might overlap, they all serve the primary function of bringing secure, passwordless authentication to hardware devices and web applications.
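To ground this a little, here is a rough, illustrative sketch of the registration options a relying-party server hands to the browser’s navigator.credentials.create() call under FIDO2/WebAuthn. The field names follow the W3C WebAuthn specification; the values, and the choice to show them as a Python dict, are placeholders rather than any vendor’s actual API.

# Illustrative only: the shape of WebAuthn PublicKeyCredentialCreationOptions,
# expressed as a Python dict that a server might serialize to JSON.
creation_options = {
    "challenge": "<random-bytes-generated-by-the-server>",    # placeholder
    "rp": {"id": "example.com", "name": "Example Corp"},      # the relying party (your site)
    "user": {
        "id": "<opaque-user-handle>",                          # placeholder
        "name": "bob@example.com",
        "displayName": "Bob",
    },
    "pubKeyCredParams": [{"type": "public-key", "alg": -7}],  # -7 = ES256 (COSE algorithm identifier)
    "authenticatorSelection": {
        "authenticatorAttachment": "platform",                 # e.g. a built-in fingerprint or face sensor
        "userVerification": "required",                        # where a biometric check (or PIN) is enforced
    },
    "timeout": 60000,                                          # milliseconds
}

The userVerification field is where a FIDO-certified biometric (or a PIN) is enforced on the authenticator, which is the hook the next section builds on.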

What Are FIDO-Certified Biometrics?

Simply put, FIDO biometrics are just FIDO-certified biometric devices and technologies that can function on top of FIDO authentication. These biometrics devices are built to conform to FIDO (thus, open) standards to provide strong authentication security that can complement several kinds of identity verification and management requirements.

The benefit of FIDO, and thus of FIDO biometrics, is that they are open and certified by the Alliance, meaning that they will operate under common standards and share interoperability with other FIDO protocols.

Some of the FIDO-certified types of biometrics include:

- Fingerprints
- Voice
- Facial Recognition
- Iris Scans

Biometric Component Certification Program

To ensure the interoperability of the FIDO biometric component technology, the FIDO Alliance uses a certification process that’s independent of its other programs. This means that the creators of biometric components can apply for FIDO certification for that component without having a fully-functional, FIDO-certified authentication application or platform.

There are, however, rules for component integration–that is, if a biometric component is certified, there are still essential rules around how other authentication platforms can integrate that component if they seek FIDO certification.

Additionally, these certification requirements will depend on the type of FIDO certification. Authenticators seeking certification at FIDO levels 1 or 2 only have the option, not the requirement, to integrate biometrics. Levels 3 or higher, however, require certified biometrics.

In either case, FIDO provides a concrete process by which biometrics components are certified:

Application: Developers seeking certification must apply for a FIDO Alliance account to access the Biometric Dashboard. This Dashboard allows them to navigate the certification process, including initiating their application.

The FIDO Alliance biometric component certification secretariat will review this application, accept it, reject it, or ask for further documentation.

Biometric Testing: The developer submits the component to a FIDO-accredited laboratory for testing. The laboratory will combine online and offline testing with live subjects. The laboratory will also provide an Allowed Integration Document that outlines the changes necessary for authenticators to integrate the biometric component properly.

Laboratory Reports: Once the testing is complete, the laboratory will provide a laboratory report to the developer and to the certification secretariat, including the Allowed Integration Document.

Certification Requests: After the report is approved, the developer completes a certification request. This step also calls for the developer to provide metadata that describes the component. This metadata must include the Biometric Certification level, the Self-Attested False Accept Rate, and the Self-Attested False Reject Rate.

Certification Issuance: If the FIDO Alliance approves the certification, then the certificate is issued to the developer. The developer may submit their metadata to the FIDO Metadata Service. They will also pay their certification fee ($10,000 for members and $13,000 for non-members).

What Are the Benefits of Moving to FIDO Biometrics?

Standardized Passwordless Authentication: Passwordless authentication is a critical step in identity and access security. FIDO provides a robust and maintained way to implement passwordless authentication without having to lock into a single vendor or service provider.

Flexible Hardware-Based Authentication: A strength of FIDO standards is that it supports hardware-based token authentication, including using USB keys, keycard authentication, and hardware tokens.

With FIDO biometrics, you’ll also get the benefits of stronger security using physical traits (for example, including a fingerprint scan in a USB security key or leveraging FIDO-compliant facial recognition in a Windows laptop).

Regulatory Compliance: Biometrics, hardware-based authentication, and MFA are all components of most regulatory frameworks.

Furthermore, the National Institute of Standards and Technology, or NIST, governs almost every federal and defense cybersecurity standard and is a member of the FIDO Alliance. Working with FIDO-compliant technology will go a long way to aligning your infrastructure with security standards.

Interoperability: FIDO biometrics won’t require you to lock in with a single provider or vendor. Additionally, the FIDO Metadata Service provides an extensive directory of compliant technology so you can trust any component or solution you adopt.

Deploy FIDO-Compliant Authentication with 1Kosmos BlockID

1Kosmos BlockID is a certified FIDO technology that provides enterprise users with decentralized and passwordless authentication. Our solution is based on blockchain technology and uses simple UX and mobile devices to ensure that your employees can quickly and easily identify themselves across your company’s platforms and digital assets.

With 1Kosmos, you get the following benefits:

- SIM Binding: The BlockID application uses SMS verification, identity proofing, and SIM card authentication to create solid, robust, and secure device authentication from any employee’s phone.
- Identity-Based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.
- Cloud-Native Architecture: Flexible and scalable cloud architecture makes it simple to build applications using our standard API and SDK.
- Identity Proofing: BlockID verifies identity anywhere, anytime and on any device with over 99% accuracy.
- Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.
- Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, encrypts digital identities, and is only accessible by the user. The distributed properties ensure no databases to breach or honeypots for hackers to target.
- Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.

Try 1Kosmos biometric capabilities–easily demo our app experience in 3 steps.

The post How To Use Biometrics with FIDO appeared first on 1Kosmos.


Ontology

Meet the Team: Ontology’s Americas Ecosystem Lead, Erick Pinos

What’s your name and where are you from?

My name is Erick Pinos and I’m the Americas Ecosystem Lead at Ontology. I grew up in New Jersey. I’m half Costa Rican and half Ecuadorian.

Tell us a bit about yourself. What did you study? What are your hobbies?

I studied Computer Science and Business Management at MIT. At the moment, my hobbies are centred around virtual reality. I love to play with VR and try out new experiences involving games, 360º videos, VR-guided meditation, and more. I also love to cook, and I really enjoy experimenting with different ingredients and recipes to see how each affects the flavours and textures of what I’m making. It’s a fun application of chemistry and a good way to save money and eat healthier at the same time.

What kind of work do you do on a day-to-day basis?

On a day-to-day basis, I scout new leads for potential Ontology partners, whether it be a dApp for the Ontology EVM, or an integration partner for one of Ontology’s projects. There are a wide range of projects where we’re constantly seeking out new integration partners to strengthen Ontology’s offerings and use cases. These projects include ONTO Wallet, Ivy Market, Wing Finance, and Orange Protocol. I have regular calls across all time zones with various internal working groups and potential leads. So, I’m kept quite busy!

In your opinion, what makes Ontology stand out from other blockchains?

I think Ontology’s focus on decentralized identity and data makes it stand out. It was one of the first blockchains to have a working implementation of decentralized identity, complete with decentralized identifiers and verifiable credentials that fit the W3C standard. That, coupled with Ontology’s extremely fast turnaround time for product deployments and updates, makes it a strong contender in the blockchain space.

What is the most exciting part of the project you’re working on for Ontology?

The most exciting part is seeing the many new Web3 projects that are popping up all over the space tackling various different problems. I like that working in outreach and lead generation gives me the opportunity to talk to various different founders and start-ups and to constantly be learning about the cutting edge of Web3, Metaverse, DeFi, DeSci, DAOs, and more.

What has been the most important moment in your career so far?

The most important moment for my career so far was taking the time to really understand Ethereum in 2016. Despite being involved in Bitcoin since 2014, my ideas in the space were limited to scalability infrastructure or payment systems for Bitcoin. After truly understanding how smart contracts worked, however, my mind was opened to all the possibilities of financial and non-financial systems that could be fully run on a blockchain.

What are you most excited for in your future at Ontology?

I am most excited to see how the Web3 space will unfold, given that most agree that technological progress occurs during a bear market. I’ve been a huge fan of the metaverse and I would like to continue to have an active role in helping more adoption occur.

As we mark Ontology’s five-year anniversary, where do you see Ontology and Web3 going in the next five years?

I see further developments in the decentralized identity and data space, led mainly by DAOs that have a growing user base and a growing need to properly manage and safekeep the identity and data being generated by their users via various day-to-day activities. I can see Ontology being a leader in this space, given its longstanding focus on decentralized identity and data.

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Meet the Team: Ontology’s Americas Ecosystem Lead, Erick Pinos was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokeny’s 2023 Product Roadmap: Scaling RWA tokenization


Product Focus

Tokeny’s 2023 Product Roadmap: Scaling RWA tokenization

This content is taken from the monthly Product Focus newsletter in February 2023.

We hope you are well. For the first edition of our Product Focus newsletter in 2023, we would like to share our main product development priorities for this year. Our focus is on four key areas: scalability, connectivity, security, and composability.

Scalability empowers asset owners to efficiently tokenize and manage real world assets (RWAs) and securities, reducing costs and improving accessibility. Our self-service SaaS platform already allows our customers to start tokenization without any tech support. This year, we are taking it a step further by enabling issuers to manage advanced settings directly via the Servicing platform. We are also enhancing the UX and adding multiple new features, which will lower operational costs and increase investor access to digital securities.

For example, we are working on several features to enable large-scale operations for open-ended funds, such as REITs and UCITS. By simplifying blockchain tools for traditional investors and offering fractionalized assets at affordable prices, tokenized asset issuers can boost the accessibility of assets with automated workflows.

Connectivity ensures that tokenized assets are interoperable with the necessary service providers and market solutions to enrich data and facilitate liquidity. Our solutions use the token standard ERC3643. As it is now widely adopted, the number of ecosystem players is rising. Our role is to ensure that our customers can easily connect to any service provider, for example, valuation agents, centralized exchanges, decentralized exchanges, as well as DeFi protocols to not only improve liquidity but also to open up new revenue streams (e.g. DeFi services, Billboard ads, etc.).
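To make the connectivity point concrete, here is a hypothetical sketch (not Tokeny's API; the RPC endpoint, addresses, and the minimal ABI fragments below are assumptions, with function names taken from the published ERC3643/T-REX interface) of how a third-party service could check on-chain whether a wallet is a verified investor before interacting with a permissioned token:

import { JsonRpcProvider, Contract } from "ethers";

// Hypothetical endpoint and addresses, for illustration only.
const provider = new JsonRpcProvider("https://polygon-rpc.example");
const TOKEN_ADDRESS = "0x...";   // an ERC3643 permissioned token
const INVESTOR = "0x...";        // wallet to check

// Minimal ABI fragments assumed from the ERC3643 (T-REX) interface.
const tokenAbi = ["function identityRegistry() view returns (address)"];
const registryAbi = ["function isVerified(address _userAddress) view returns (bool)"];

async function isEligibleInvestor(): Promise<boolean> {
  const token = new Contract(TOKEN_ADDRESS, tokenAbi, provider);
  const registryAddress: string = await token.identityRegistry();
  const registry = new Contract(registryAddress, registryAbi, provider);
  // True only if the wallet holds the on-chain identity and claims the token requires.
  return await registry.isVerified(INVESTOR);
}

Because eligibility checks like this live on-chain, any exchange, valuation agent or DeFi protocol can run the same query, which is what makes such integrations composable.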

Security is a top priority to keep data and assets safe from potential vulnerabilities. To continue protecting our customers from technical risks and data breaches, we are taking a multi-pronged approach: conducting more thorough security tests and audits, obtaining certifications such as SOC 2, and strengthening additional measures such as two-factor authentication (2FA). In addition, our solutions will be compatible with the upgradable ERC3643 V4 to ensure that issuers always have full control over their assets, as it provides the capability to fix any vulnerabilities that may arise.

Composability guarantees that issuers’ tokenized assets will never be blocked on one single blockchain network, providing a future-proof tokenization approach. Thanks to the portability of ERC3643, tokenized assets can switch to any EVM blockchain. Currently, our solution is already multi-chain, supporting Ethereum and Polygon. This year, we plan to integrate with more EVM blockchains based on market demand. We’ll also provide services to bridge assets from private networks to EVM public blockchains as institutions transition from private networks to public ones.

Our outlook for the coming year is exciting, and we look forward to continuing to receive your support as we strive to improve our products in order to assist you in achieving your tokenization goals.

Subscribe Newsletter

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Other Product Focus Blogs

Tokeny’s 2023 Product Roadmap: Scaling RWA tokenization (2 February 2023)
Our product highlights from 2022 (30 December 2022)
Upgradable ERC3643 V4 for tokenized securities (28 November 2022)
DINO: the largest distribution network for tokenized securities (31 October 2022)
How we are building an interoperable ecosystem for private markets (26 August 2022)
ERC3643 Onchain Factory (22 July 2022)
The Sound of the ERC3643 Community (21 June 2022)
From Start-up to Scale-Up: How Tokeny Prepared for Large-Scale Tokenization (11 May 2022)
Introducing LEGO-like onboarding solutions for digital asset investors (6 April 2022)
ComplyDeFi – The Compliance SDK For DeFi Protocols (28 February 2022)

Tokenize securities with us

Our experts with decades of experience across capital markets will help you to digitize assets on the decentralized infrastructure. 

Contact us

The post Tokeny’s 2023 Product Roadmap: Scaling RWA tokenization appeared first on Tokeny.


IDnow

Regulations for online gaming companies in Europe – an overview.


As technology evolves, regulations in various industries also need to do the same. Although Europe is regarded as a “single state” in relation to its economic significance and is governed by the EU, individual countries are still responsible for passing their own laws. This is especially true within the gambling industry since no regulation or legislation standardizes betting throughout the continent. And the same goes for the United Kingdom. Though no longer part of the EU, the gambling market in the UK is still strongly connected to the rest of the EU gambling landscape.

The global gambling market is estimated to reach roughly $876 billion by 2026, growing at a CAGR of 3.6% over the assessment period. The European online gambling market is expected to grow by 9.20% by 2025, fueled mainly by software and hardware innovations and the rising popularity of gambling. As such, the gambling industry in Europe shows no sign of slowing down. Still, with scams, gambling addiction, money laundering cases, underage gambling, and other pressing issues, how is gambling regulated across the European Union? What do online gaming companies need to be aware of?

Regarding compliance and regulations, it is crucial to note that they vary throughout Europe. That said, KYC (Know Your Customer) requirements are standard regardless.

A well-regulated market.

The European Gaming and Betting Association (EGBA) was formed to keep the online gambling industry in check. Its main objective is to promote a sustainable iGaming sector in Europe, helping players enjoy a safe and fun experience inside a well-regulated market.

Based in Brussels, the association has worked to shape gambling regulations in the region. Online gaming companies associated with EGBA abide by a large set of industry standards designed to complement the numerous licensing guidelines they already adhere to in European nations.

Such European industry standards incorporate the Responsible Remote Gambling Measures introduced by the European Committee for Standardization and are designed to achieve the following objectives:

Ensuring the safety of vulnerable gamers
The prevention of underage gambling via age verification
Countering criminal and fraudulent behavior
Safeguarding customer privacy and the protection of private information
Accurate and prompt payments for customers
Responsible and truthful marketing through advertising regulations
Fair and responsible gaming
The prevention of money laundering
Ensuring a safe, secure, and reliable operating environment
Dedication to customer support and satisfaction

A growing number of European Union countries have established licensing systems allowing more than one operator to provide services on the market. Under EU law, no system is favored over another.

Gambling regulation in European Union countries involves diverse regulatory frameworks. In several judgments, the Court of Justice of the European Union has legitimized the compliance of national regulatory systems with European Union Law.

The Commission promotes the efforts of EU nations to upgrade their national online gambling legal systems, particularly in administrative cooperation among gambling regulatory authorities. It also offers support to protect gamblers and vulnerable individuals, including minors.

Regulations by EU country.

Each European country essentially shapes its gambling regulations. Below are the regulations in some of the most prominent countries in that region.

France

Numerous forms of gambling are legal in France, and three central bodies regulate them. The Française des Jeux handles betting games and lotteries, the Pari Mutuel Urbain deals with horse racing, and ARJEL is associated with online gambling.

ARJEL also issues licenses for horse racing betting and for forms of gambling that use decks of cards, such as poker. It also enforces online gambling regulations, which make it mandatory for licensees to verify players’ identity, address, age, bank information, and other data.

Technological solutions to verify players’ identities, such as AI applications like VideoIdent, are permitted within the requirements. The adoption of the PACTE law in April 2019 safeguards minors by fining online gambling companies that offer free gambling to minors.

United Kingdom

Though no longer part of the EU, the UK gambling market is still very much connected to the rest of Europe. The UK Gambling Commission regulates gambling activity in the region and enforces some of Europe’s most stringent gambling regulations. Gamblers in the UK spend 14.5 billion GBP annually, and this number is expected to grow. Based on such findings, the body added a limit to the amount gamblers can place on fixed bets in betting shops. Betting shops can typically be found in high streets up and down the country, and before these limits were introduced, players could place up to 100 GBP per bet; currently, they can only wager up to 2 GBP. This resulted in significant losses for betting shops, and William Hill alone had to close over 100 of its shops, showing how much this decision has changed the betting landscape.

Operators must also verify a customer’s age before allowing them to deposit funds into a gambling account. Regulators encourage operators to use AI-powered applications such as AutoIdent to identify gamblers. Non-compliant behavior, such as illegal advertisements or failure to verify age, can lead to severe financial consequences for online gaming companies.

The UKGC may also revoke gambling licenses if the Commission feels that the online gaming company is not abiding by the rules or mistreating customers.

Originally set for release in 2019, the UK Gambling Act white paper is a set of rules put forward by the UK government for the protection of online gamblers. Now delayed for the fourth time, the white paper also aims to put the UK Gambling Commission under scrutiny to ensure it has sufficient investigation and sanctioning powers.

Read on to learn more about Online Gambling regulations in the UK – an overview

Germany

The legal framework governing online gaming is the Interstate Treaty on Gambling (ISTG), or Glücksspielstaatsvertrag (GlüStV), introduced in 2012. Germany is known to have some of the most complex gambling regulations in Europe and has made numerous changes in the last few years. Recently, the country banned all online gaming platforms except those offering horse racing betting. EGBA attempted to dispute this decision, arguing that it did not adhere to EU rules.

In 2010, the European Court of Justice ruled that the country’s industry was monopolized and had to be liberalized. When the ISTG was introduced, it allowed private companies to offer gambling services.

The new Interstate Treaty on Gambling 2021 came into force in July 2021. It distinguished between online casino games and virtual slots. It offers each state the choice to impose a state monopoly or issue a limited number of licenses to private gambling companies.

Sports betting operators must demonstrate that they are willing to offer safe services in line with the AML measures. Operators are required to authorize all players and ensure all minors are excluded. Video verification lets online gaming companies enter the market in line with online gaming regulations.

Switzerland

Switzerland has two levels of gambling laws, i.e., federal and cantonal. The country offers casino licenses via the Swiss Federal Casino Commission (ESBK). The Swiss Lottery and Betting Board handles sports betting, lotteries, and fixed odds.

To urge operators to pay their fair share of taxes, the Money Gaming Act was passed in 2018 to block the IP addresses of offshore gambling sites. The law became effective in January 2019, and all unlicensed gambling was banned by the end of June.

Austria

Licenses for online gaming companies are issued by individual states in Austria, with no regulations at a federal level. Regulations only cover sports events in most states, but some jurisdictions also allow bets on other types of events. Live betting is prohibited in some states.

Technically speaking, state laws specifically regulate land-based betting. That said, licensed operators can also provide their services online.

The Austrian Ministry of Finance recently announced plans to reform the gambling regulatory framework. The proposal also suggests raising taxes on gambling-based revenues and establishing an authority tasked explicitly with enforcing regulations and providing licenses.

Learn more in our Online Gambling regulation in DACH – an overview

Malta

Malta was the first European Union Member State to regulate remote gaming and is now the most prominent remote gambling jurisdiction internationally. As a result, a broad range of online gaming services and facilities are available in the industry.

Founded in 2001, the Malta Gaming Authority was set up to regulate various forms of gambling in the nation. This includes land-based activities like casinos and their online counterparts, including B2B and B2C services.

Isle of Man

The Isle of Man Government thoroughly promotes the development of gambling on the island, whether land-based or online. The region is committed to offering a robust regulatory environment backed by a stable government and various attractive business advantages.

In terms of regulation, the island’s Gambling Supervision Commission (GSC) is an independent gambling operation regulator established in 1962. Aside from licensing and regulating land-based gambling operations, the Commission also regulates all iGambling activities.

Gibraltar

Gibraltar is among the world’s leading jurisdictions in the iGaming industry and an internationally recognized innovator in the field. Gibraltar offers all kinds of casino games, slots, poker, machine games, and most other number games without limits. That said, online operators require separate licenses for betting activity.

Two entities, the GRA (Gibraltar Regulatory Authority) and the GBGA (Gibraltar Betting and Gaming Association), work collaboratively to uphold the region’s gaming laws.

The GRA issues licenses, ensuring operators comply with established regulations and monitoring possible risks threatening Gibraltar’s iGaming industry. On the other hand, GBGA assumes the role of an advisory body, supporting the region’s licensed gaming companies while certifying they operate responsibly.

Spain

Gambling has been legal in Spain since 1276, though it has been banned and re-legalized numerous times since then. It has only been continuously legal since 1977, with the exception of the lottery, which has been continuously legal since 1763.

Currently, the definitive piece of gambling legislation is the Spanish Gaming Act 2011 (i.e., Ley 13/2011, de 27 de mayo, de regulación del Juego) and its amendments.

Gambling in Spain is regulated at the national level and by the 17 self-governing regions, resulting in a complex legal landscape. However, certain forms of gambling, such as American and French roulette, blackjack, and craps, are legal in all 17 regions.

Cyprus

Most forms of gambling in Cyprus were illegal until 2012, when the Republic made online and land-based sports betting legal. The NBA (National Betting Authority), established under the Betting Law 2012, examines applications and licenses, audits, and supervises betting shops and online gambling operators.

The Cyprus Gaming and Casino Supervision Commission, also known as the Cyprus Gaming Commission, was set up in 2015 under the Cyprus Casino Control Law.

Under the 2015 Cyprus Casino Control Law, a license could be offered to a sole operator to establish an Integrated Casino Resort along with four “satellite” casinos. The Integrated Resort is Cyprus’ most prominent tourism development.

Alderney

Part of the British Channel Islands, but a self-governing dependency of the British Crown, the States of Alderney have their own separate commission that oversees gambling on the island. The Alderney Gambling Control Commission (AGCC) is the official regulatory authority for gambling in Alderney. It reviews gaming licenses and certification applications and ensures all laws are observed.

Become compliant in the EU & UK with IDnow.

IDnow can assist online gaming companies with its EU-compliant AutoIdent solution, which streamlines onboarding to make it quicker and more convenient while helping gaming companies meet their regulatory obligations.

Companies in the gaming industry that currently trust and use our services include Etain, Lottoland and Admiral.

By

Roger Redfearn-Tyrzyk
Director Global Gambling & Sales UK at IDnow
Connect with Roger on LinkedIn

Online Gambling regulations – Europe Download our ebook to gain insight into becoming compliant with European regulations as a gaming platform and learn about the most important regulations for operators in the EU & UK. Get your free copy


Infocert

InfoCert named ‘technology leader in 2022’ by Quadrant Knowledge

The post InfoCert named ‘technology leader in 2022’ by Quadrant Knowledge appeared first on Infocert.digital.

FindBiometrics

Innovatrics’ OCR Tech Meets Thai Alphabet Challenge

Innovatrics is detailing some of the great lengths the company has gone to in order to facilitate document recognition through its identity verification platform. Calling itself a world leader in […]

Wednesday, 01. February 2023

FindBiometrics

Biometric Registration Spreads in Africa: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Nigeria Police Force Collects Officers’ Biometrics Administrators […]

Shyft Network

Shyft Dev Deep Dive Monthly — Jan. 2023

Shyft Dev Deep Dive Monthly — Jan. 2023

Welcome to Shyft Dev Deep Dive, our monthly developer update, where we share the new Shyft-wide features and technical changes.

Shyft Veriscope finished 2022 on a strong note with the introduction of Address Proofs that enable VASPs (read: crypto businesses) to prove ownership or control of an address before personal identification information (PII) is shared between VASPs.

Address Proofs set Shyft apart from its peers, as Shyft Veriscope now solves both address attribution (i.e., identifying who owns a hosted wallet address without having to ask end users) and secure, peer-to-peer data transmission between VASPs.
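Veriscope’s actual Address Proof flow is not documented here, but the underlying idea can be sketched in a few lines: a VASP proves control of a wallet by signing a counterparty-supplied challenge, and the counterparty recovers the signing address from the signature. A minimal, hypothetical illustration using ethers (not Veriscope code):

import { Wallet, verifyMessage } from "ethers";

// Hypothetical illustration only; the real Veriscope protocol differs in detail.
async function demoAddressProof() {
  const challenge = `address-proof-${Date.now()}`;        // nonce supplied by the requesting VASP
  const wallet = Wallet.createRandom();                   // the wallet whose ownership is being attested
  const signature = await wallet.signMessage(challenge);  // proof of control, produced by the owner
  const recovered = verifyMessage(challenge, signature);  // counterparty-side verification
  return recovered === wallet.address;                    // true => control of the address is proven
}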

In the last month, several additional enhancements have been added to Veriscope. These are detailed below. As always, if you have questions or would like a demo, please reach out to us at veriscope@shyft.network. We look forward to hearing from you!

Veriscope Releases (Jan 2023)

Node Synchronization Upgrade

The Veriscope Relay Node has now been upgraded to Nethermind v1.15.0, which has made data synchronization with the Shyft Blockchain much faster.

Event transactions, too, take considerably less time to complete, as the time it takes to load and save them to Veriscope has decreased significantly. Events refer to the events emitted from transactions on the Shyft blockchain.

Earlier, the above-mentioned processes used to take 12+ hours to complete, but the Node Synchronization Upgrade has decreased it to 4–6 hours. Click here for more details on Node Synchronization.

Veriscope UI Update

With the new Veriscope UI Update, Discovery Layer Key-value pairs are now sorted A-Z. A new settings page has also been introduced to provide easy access to user settings (email, password, 2FA), webhook URL, and API tokens.

IVMS Update

Keeping in line with the IVMS specification, we added additional IVMS fields to Shyft Veriscope. With this, IVMS values saved to Veriscope are now shown in the IVMS form input fields.

VASPs can also export their IVMS data as either Originating VASP or Beneficiary VASP in the required IVMS format. Click here for more details on it.

Blockchain Analytics Update

In January, we made two blockchain analytics updates. The API endpoint for Merkle Science has been corrected, and the API endpoints for each Blockchain Analytics service provider are now shown in the Veriscope user interface, next to where the VASPs can input their API keys.

Conclusion

Starting the year with significant improvements to Shyft Veriscope, the Shyft Product & Engineering team proved again that what matters the most is user experience. We believe user experience must remain frictionless even while complying with stringent regulations. And the Shyft team is committed to it.

Stay tuned for next month’s update, where we will share more exciting developments and updates.

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution on the blockchain while ensuring user data is protected.

Visit our website to read more: https://www.shyft.network, and follow us on Twitter, LinkedIn, Discord, Telegram, and Medium. Also, sign up for our newsletter to keep up-to-date.

Shyft Dev Deep Dive Monthly — Jan. 2023 was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


FindBiometrics

[Updated] FindBiometrics and Acuity Market Intelligence to Unveil Biometric Digital Identity Prism at Feb 15 Virtual Event

On February 15, 2023, FindBiometrics is hosting “The Road Ahead for Biometrics and Digital Identity” Virtual Summit — a full day of online panels and interview style sessions about the […]

YeshID

Day 3: Securing your Apps - Review and Restrict Settings in Google Workspace Apps


Restricting unnecessary data sharing is key to any information security strategy.

Why?

It helps to minimize the potential impact of a data breach or unauthorized access to sensitive information. The less data that is shared, the fewer opportunities there are for hackers or other malicious actors to access and exploit that data. When you restrict unnecessary data sharing, you can also reduce the risk of data breaches caused by human error, such as an employee accidentally sending sensitive information to the wrong person.

Additionally, sharing data with third-party companies and service providers can also pose a security risk, as you are entrusting the security of that data to another organization. Restricting unnecessary data sharing helps you to be more selective about which companies you share data with and to ensure that they have adequate security measures in place to protect that data.

Finally, restricting unnecessary data sharing is also important from a compliance perspective as it helps organizations to comply with regulations such as GDPR and HIPAA that require organizations to protect personal data of individuals and to limit the sharing of sensitive information.

Guidance

Before making any changes, we recommend using a tool like GAM or another reporting tool with deep hooks into Google Workspace to better understand how many people are using these features and what the impact of your changes will be.
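As a rough sketch of that kind of audit (hypothetical code, assuming a service account with domain-wide delegation, the gmail.settings.basic scope and the googleapis Node.js client; GAM can produce equivalent reports), you could check which mailboxes have Gmail auto-forwarding enabled before you disable the feature:

import { google } from "googleapis";

// Hypothetical audit: report which mailboxes have auto-forwarding turned on.
// Assumes a service-account key file with domain-wide delegation.
async function autoForwardingReport(userEmails: string[], keyFile: string) {
  const report: { user: string; enabled: boolean; forwardTo?: string }[] = [];
  for (const user of userEmails) {
    const auth = new google.auth.JWT({
      keyFile,
      scopes: ["https://www.googleapis.com/auth/gmail.settings.basic"],
      subject: user, // impersonate the mailbox being audited
    });
    const gmail = google.gmail({ version: "v1", auth });
    const { data } = await gmail.users.settings.getAutoForwarding({ userId: "me" });
    report.push({ user, enabled: Boolean(data.enabled), forwardTo: data.emailAddress ?? undefined });
  }
  return report;
}

The same principle applies to the Calendar and Drive recommendations below: measure current usage first, then tighten.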

Gmail

Disable settings such as Automatic Email Forwarding under End User Access to ensure that users cannot automatically send all emails outside of the organization or to unauthorized services. Review and adjust the Safety settings: we recommend setting all of them to ON, with the exception of “Protect against unauthenticated emails”, to avoid flagging legitimate emails from misconfigured domains as spam.

Calendar

Prevent users from sharing anything other than free/busy information externally for primary and secondary calendars. Depending on the organization, consider disabling detailed sharing internally as well, or apply a rule to disable it for a group/OU of users with privileged calendar information, such as executives or your People team.

Drive

With Drive, you will need to think through how your employees are utilizing Google Drive to interface with external parties, like customers. In some cases companies may be presenting a Google Drive file as a public link to publish content online. Remember that settings can be applied to groups or OUs to more granularly scope permissions. The settings below represent an ideal:

Disable the ability for users to make files and published web content visible to anyone with a link
Allow only users in your domain to share content outside of your org
Set General access to private to the owner by default

Spruce Systems

Tutorial: Build a Basic Token-Gated App with SSX, RainbowKit, and Alchemy


SSX provides developers with the easiest way to integrate Sign-In with Ethereum, enable DAO logins, resolve ENS names, and more. As we continue working on SSX, we want to show developers how they can quickly get started with easy-to-run example projects. Gating access to content based on a user's NFTs or tokens is a core building block for rich dapp experiences. For example:

If a DAO wanted to limit content based on users with a certain amount of token-based voting power.
If an NFT PFP project wanted to offer exclusive content to token holders.
If a project wanted to enable early access to an application based on holding an NFT.

The following example will show developers how to build and enable token-gated access in their dapp with SSX based on holding an ENS name. It will also show how to use SSX with RainbowKit and Alchemy.

You can also follow along in our documentation here.

Run our Completed Example

Want to jump in and see a token-gated example in action quickly? Use the following commands to download the SSX repository:  

git clone https://github.com/spruceid/ssx
cd ssx
Ensure SSX packages are installed from the ssx directory by running yarn install
Navigate to the example directory (cd examples/ssx-test-token-gated)
Add an Alchemy API key to .env
In your terminal, run:
yarn install
yarn start

Once run, you will be presented with a dapp using RainbowKit that prompts you to connect your wallet and Sign-In with Ethereum.

After signing in, if the connected wallet owns an ENS name, you will be presented with the gated content. If not, the dapp should display "No ENS name found".

The following guide will teach you how to create a token-gated dapp enabled by SSX from our create-ssx-dapp package.

Create the Dapp Yourself

The initial setup will be done using SSX's dapp creation tool. Type the following in your terminal to get started:

yarn create @spruceid/ssx-dapp token-gated-example

For this example, we will be using the following explicit options in the setup tool:

Typescript
Leave the other options empty when prompted

The create-ssx-dapp tool

Set up the Alchemy SDK and RainbowKit

The Alchemy SDK dependency can be installed with the following command in the directory of your dapp:

yarn add alchemy-sdk

RainbowKit is also used in this example. Add the required dependency via the following command:

yarn add @rainbow-me/rainbowkit wagmi

Additionally, you will also need to add the ssx-react dependency. To add it, use this command:

yarn add @spruceid/ssx-react

Head to src/index.tsx and add the following to plug things in initially:

/** src/index.tsx **/
import { RainbowKitProvider, getDefaultWallets } from '@rainbow-me/rainbowkit';
import { goerli, mainnet, configureChains, createClient, WagmiConfig } from 'wagmi';
import { alchemyProvider } from 'wagmi/providers/alchemy';
import { publicProvider } from 'wagmi/providers/public';
import { SSXProvider } from '@spruceid/ssx-react';

const { chains, provider } = configureChains(
  [mainnet, goerli],
  [
    alchemyProvider({
      // This is Alchemy's default API key.
      // You can get your own at https://dashboard.alchemyapi.io
      apiKey: `${process.env.REACT_APP_ALCHEMY_API_KEY}`,
    }),
    publicProvider(),
  ]
);

const { connectors } = getDefaultWallets({
  appName: 'SSX ENS Token Gated Example',
  chains,
});

const wagmiClient = createClient({
  autoConnect: true,
  connectors,
  provider,
});

Wrap your <App /> component with <WagmiConfig />, <RainbowKitProvider/> and <SSXProvider/> components, making it look like the following:

/* src/index.tsx */
root.render(
  <React.StrictMode>
    <WagmiConfig client={wagmiClient}>
      <RainbowKitProvider chains={chains}>
        <SSXProvider>
          <App />
        </SSXProvider>
      </RainbowKitProvider>
    </WagmiConfig>
  </React.StrictMode>
);

At src/App.tsx some changes are required to hook SSX and RainbowKit together.

Add the following imports and update the App component as the code below:

/* src/App.tsx */
/* Add useEffect to the existing useState bracket */
import { useEffect, useState } from 'react';
import '@rainbow-me/rainbowkit/styles.css';
import {
  useConnectModal,
  useAccountModal,
  ConnectButton,
} from '@rainbow-me/rainbowkit';
import { useSSX } from '@spruceid/ssx-react';
import { SSXClientSession } from '@spruceid/ssx';
import { useSigner } from 'wagmi';

/*....*/

function App() {
  /* SSX hook */
  const { ssx } = useSSX();
  /* RainbowKit ConnectModal hook */
  const { openConnectModal } = useConnectModal();
  /* RainbowKit Account modal hook */
  const { openAccountModal } = useAccountModal();
  /* Some control variables */
  const [session, setSession] = useState<SSXClientSession>();
  const [loading, setLoading] = useState<boolean>(false);
  const { data: provider } = useSigner();

  useEffect(() => {
    if (ssx && loading) {
      /* Sign-in with SSX whenever the button is pressed */
      ssx
        .signIn()
        .then(session => {
          console.log(session);
          setSession(session);
          setLoading(false);
        })
        .catch(err => {
          console.error(err);
          setSession(undefined);
          setLoading(false);
        });
    }
  }, [ssx, loading]);

  useEffect(() => {
    if (!provider) {
      setSession(undefined);
      setLoading(false);
    } else {
      setLoading(true);
    }
  }, [provider]);

  const handleClick = () => {
    /* Opens the RainbowKit modal if in the correct state */
    if (openConnectModal) {
      openConnectModal();
    }
    /* Triggers the Sign-in hook */
    setLoading(true);
  };

  return (
    <div className="App">
      <div className="App-header">
        <img src={logo} className="App-logo" alt="logo" />
        <span>SSX</span>
        {openAccountModal && provider ? <ConnectButton /> : <></>}
      </div>
      <div className="App-title">
        <h1>SSX Example Dapp</h1>
        <h2>Connect and sign in with your Ethereum account</h2>
      </div>
      <div className="App-content">
        {!openConnectModal && provider ? (
          <>
            <AccountInfo address={`${session?.address}`} />
          </>
        ) : (
          <button onClick={handleClick}>SIGN-IN WITH ETHEREUM</button>
        )}
      </div>
    </div>
  );
}

Finally, replace your CSS in src/App.css with the following:

/* src/App.css */
.App { text-align: center; display: flex; flex-direction: column; align-items: center; color: white; height: 100vh; }
.App-content button { border: none; width: 100%; padding: 16px 24px; color: white; background: linear-gradient(107.8deg, #4c49e4 11.23%, #26c2f3 78.25%); border-radius: 12px; cursor: pointer; font-weight: 500; font-size: 16px; transition: all 150ms ease 0s; margin: 16px 0px; }
.App button:disabled { pointer-events: none; opacity: 0.7; }
.App button:hover { transform: scale(1.01); }
.App-header { width: calc(100% - 128px); text-align: left; padding: 16px 64px; display: flex; align-items: center; background-color: #212121; }
.App-header span { font-weight: 600; font-size: 32px; margin-right: auto; }
.App-title { margin-top: auto; }
.App-title h2 { font-weight: 400; font-size: 16px; color: #667080; }
.App-logo { height: 40px; pointer-events: none; margin-right: 16px; }
.App-content { margin-bottom: auto; width: 450px; max-width: 100%; background-color: rgba(39, 39, 39, 0.7); backdrop-filter: blur(2px); border-radius: 12px; padding: 30px; }
.App-content h1 { font-size: 32px; line-height: 48px; }
.App-account-info { margin-top: 16px; padding: 16px 8px; border: 1px solid #555555; border-radius: 12px; text-align: left; }
.App-account-info b { color: #667080; }

The dapp has now been configured in a very basic state and will let you simply sign in using RainbowKit.

Adding Token Gating

To token gate, we need to add some additional code and configuration.

First, let's configure the alchemy-sdk. Add the following to src/App.tsx:

/* src/App.tsx */
import { Network, Alchemy } from 'alchemy-sdk';

/**....**/

const alchemyConfig = {
  /* This is the same you used previously for RainbowKit */
  apiKey: process.env.REACT_APP_ALCHEMY_API_KEY,
  /* Change this to the appropriate network for your usecase */
  network: Network.ETH_MAINNET,
};

const alchemy = new Alchemy(alchemyConfig);

/**....**/

With the SDK configured, we now need to add the logic to verify if the signed-in address owns a token. In this case, we'll be gating based on an ENS name.

To accomplish this, tokens owned by the address must be fetched and filtered for ENS names. ENS names are under the contract 0x57f1887a8bf19b14fc0df6fd9b2acc9af147ea85, so a variable with that value needs to be added.

Add the following after the import statements:

/* src/App.tsx */ const ENS_CONTRACT = '0x57f1887a8bf19b14fc0df6fd9b2acc9af147ea85';

Building on the modifications above to src/App.tsx, we can now add the additional logic needed to fetch tokens:

/*....*/
import { Network, Alchemy } from "alchemy-sdk";

const alchemyConfig = {
  /* This is the same you used previously for RainbowKit */
  apiKey: process.env.REACT_APP_ALCHEMY_API_KEY,
  /* Change this to the appropriate network for your usecase */
  network: Network.ETH_MAINNET,
};

const alchemy = new Alchemy(alchemyConfig);

const ENS_CONTRACT = '0x57f1887a8bf19b14fc0df6fd9b2acc9af147ea85';

/*....*/

function App() {
  /* SSX hook */
  const { ssx } = useSSX();
  /* RainbowKit ConnectModal hook */
  const { openConnectModal } = useConnectModal();
  /* RainbowKit Account modal hook */
  const { openAccountModal } = useAccountModal();
  /* Some control variables */
  const [session, setSession] = useState<SSXClientSession>();
  const [loading, setLoading] = useState<boolean>(false);
  const { data: provider } = useSigner();
  const [ownEnsName, setOwnEnsName] = useState(false);

  useEffect(() => {
    if (ssx && loading) {
      /* Sign-in with SSX whenever the button is pressed */
      ssx.signIn()
        .then((session) => {
          console.log(session);
          alchemy.nft.getNftsForOwner(`${ssx.address()}`)
            .then((nfts) => {
              const ownENS = nfts.ownedNfts
                .filter(({ contract }) => contract.address === ENS_CONTRACT)?.length > 0;
              setOwnEnsName(ownENS);
              setSession(session);
              setLoading(false);
            });
        })
        .catch((err) => {
          console.error(err);
          setOwnEnsName(false);
          setSession(undefined); /* clear any stale session on a failed sign-in */
          setLoading(false);
        });
    }
  }, [ssx, loading]);

  useEffect(() => {
    if (!provider) {
      setSession(undefined);
      setLoading(false);
    } else {
      setLoading(true);
    }
  }, [provider]);

  const handleClick = () => {
    /* Opens the RainbowKit modal if in the correct state */
    if (openConnectModal) {
      openConnectModal();
    }
    /* Triggers the Sign-in hook */
    setLoading(true);
  };

  return (
    <div className="App">
      <div className="App-header">
        <img src={logo} className="App-logo" alt="logo" />
        <span>SSX</span>
        {openAccountModal && ownEnsName && provider ? <ConnectButton /> : <></>}
      </div>
      <div className="App-title">
        <h1>SSX Example Dapp</h1>
        <h2>Connect and sign-in with your Ethereum account</h2>
      </div>
      <div className="App-content">
        {!openConnectModal && ownEnsName && provider ? (
          <>
            <AccountInfo address={`${session?.address}`} />
            <br></br>
            <>You own an ENS name.</>
          </>
        ) : (
          <>
            <button onClick={handleClick}>SIGN-IN WITH ETHEREUM</button>
            <br></br>
            {!openConnectModal && !ownEnsName && provider && !loading ? (
              <>
                <AccountInfo address={`${session?.address}`} />
                <br></br>
                No ENS name found.
              </>
            ) : (
              <></>
            )}
          </>
        )}
      </div>
    </div>
  );
}

export default App;

Now you can gate any content with ENS names just by checking this ownEnsName variable.
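If you want to reuse that check for collections other than ENS, the same alchemy.nft.getNftsForOwner call used above can be wrapped in a small helper (optional and illustrative; not part of the SSX API):

/* src/App.tsx (optional) - generalizes the ENS check to any NFT contract. */
async function ownsTokenFromContract(owner: string, contractAddress: string): Promise<boolean> {
  const nfts = await alchemy.nft.getNftsForOwner(owner);
  return nfts.ownedNfts.some(
    ({ contract }) => contract.address.toLowerCase() === contractAddress.toLowerCase()
  );
}

// Usage inside the sign-in flow shown above:
// const ownENS = await ownsTokenFromContract(`${ssx.address()}`, ENS_CONTRACT);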

What's Next?

Gating access to content isn't only limited to tokens. You can even gate access based on different forms of on-chain activity like:

The number of trades a user has made on Uniswap.
How much lending activity a user has on Aave.
The number of times a user voted on-chain in a DAO's governance process.

It doesn't stop there - this can even be extended to gating based on off-chain credentials, and more! We'll continue highlighting these examples and show how much is possible by using SSX.

Happy building!

Spruce lets users control their data across the web. Spruce provides an ecosystem of open-source tools and products for developers that let users collect their data in one place they control, and show their cards however they want.

If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord.


Ocean Protocol

Introducing the first Ocean Protocol privacy-preserving data challenge: Air Quality in Catalunya


Ocean Protocol is announcing a new data challenge focused on tackling one of the most pressing environmental issues of our time: air pollution.

This challenge is using Compute-to-Data, Ocean’s privacy-preserving computation, to effectively allow participants to run compute jobs on the data provided and get useful compute results.

With a total prize pool of $6,000 payable in OCEAN, we’re calling all data scientists, environmental experts, and anyone who is passionate about making a positive impact on the environment to explore the data provided, and join us in finding innovative solutions.

Air pollution is a serious health concern that affects millions of people worldwide, causing a range of health problems including respiratory issues, heart disease, and stroke, and contributing to 7 million premature deaths each year. The Air Pollution challenge is a great opportunity for individuals and teams to use their skills and creativity to make a real-world impact. Participants are encouraged to use Ocean Protocol’s technology to decentralize access to their algorithms and share their solutions with privacy-preserving technology.

Through this challenge, we aim to analyze the evolution of air pollutants in Catalunya (i.e. Catalonia) over the past three decades and develop algorithms to predict air pollutant concentrations.

You are asked to perform three tasks:

Make a global analysis of the air quality in Catalunya.
Build algorithms to predict the concentration of pollutants in the air using Ocean Protocol’s Compute-to-Data technology.
Write a report presenting your approach, results, conclusions and recommendations.

Participants can access the dataset provided for this challenge from the Ocean Market by clicking the following link and downloading the published data asset. The same dataset can also be used through Compute-to-Data by the algorithm of your choice at the following address.

The submission deadline is February 14 at 23:59 UTC.

For information on the challenge and how to participate, read more here: http://bit.ly/3Dp6B0s

This challenge is hosted on DesightsAI.

All valid non-winning entries submitted will receive an award of 500 OCEAN tokens.

The evaluation panel will determine if entries submitted are valid, submitted in good faith, and meet minimum requirements as specified in the challenge description.

Need support?

Chat with us in our Discord, via our dedicated #data-challenges channel.

About Ocean Protocol

Ocean Protocol is a decentralized data-sharing ecosystem spearheading the movement to unlock a New Data Economy, break down data silos, and open access to quality data. Ocean’s intuitive marketplace technology allows data to be published, discovered, and consumed in a secure, privacy-preserving manner. By giving power back to data owners, Ocean resolves the tradeoff between using private data and the risks of exposing it.

Introducing the first Ocean Protocol privacy-preserving data challenge: Air Quality in Catalunya was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Liquid Avatar Technologies

Liquid Avatar Technologies powers Aftermath Islands’ Proof of Humanity engaging users to collect 1 billion+ platform resources creating 1.2 million+ NFTs for its Metaverse


Company’s successful 100+ day early-stage launch creates new commercial opportunities for expansion of verifiable credentials, digital goods and services and the Metaverse.

February 1, 2023 – Toronto, Canada / Bridgetown, Barbados – Liquid Avatar Technologies Inc. (CSE:LQID / OTC:LQAVF / FRA:4T51) (“Liquid Avatar” or the “Company”) is pleased to announce the continued expansion of use of the Liquid Avatar Mobile App and Meta Park Pass, which provide Proof of Humanity: users remain anonymous yet can verify that they are real people through blockchain-based verifiable credentials, helping to eliminate fraudulent activities, duplicate and fake accounts, and bots. Deployed to its controlled subsidiary’s Aftermath Islands Metaverse play-to-earn game, Lost Kingdom of T’Sara, the Meta Park Pass has been successfully utilized to let users sign in with their biometrics, eliminating usernames and passwords, proving that all users are genuine people with only one account each, and supporting the collection of over 1 billion gaming resources, resulting in the creation of over 1.2 million Resource Pack NFTs for use in the Aftermath Islands Metaverse.

Users are engaging for upwards of 70 minutes per day on the platform and have collected a variety of in-game resources, which they have then converted to Resource Pack NFTs to be used, crafted, traded, and consumed in the Aftermath Islands Metaverse as part of platform activities, allowing the NFTs and the underlying blockchain technology to support in-game inventory management. Platform activities also include the ability to work, play, learn, earn, entertain, create, and socialize within the Aftermath Islands Metaverse.

Aftermath Islands Metaverse Limited has generated over $2 million in retail sales since late 2021 and continues to generate sales on a monthly basis. As the Company works to launch its full Metaverse platform, these revenues are recorded, under IFRS guidelines, as deferred revenue until the Aftermath Islands Metaverse meets further launch requirements. However, the Company has had access to the proceeds for working capital.

In late Q4 2022, the Company also successfully launched its first Aftermath Islands Metaverse servers that provide users with a high-fidelity graphics experience, similar to realistic console gaming, with no download, providing an easy-to-use experience similar to streaming services, on almost any mobile, tablet or computing device.  The open platform is available for early engagement at play.aftermathislands.com.

The Aftermath Islands Metaverse also seamlessly incorporates the ability for organizations and enterprises to leverage their current websites and eCommerce activities: users can click on an item in the Aftermath Islands Metaverse, opening a new browser tab that displays the organization’s website, eCommerce programs and social media. Aftermath Islands is one of the first to completely integrate Web2 and Web3 activities in a single click.

As one of the first of its kind, the Liquid Avatar Mobile App can also be used to create a host of other credentials that have already been tested for government-level access for travel and age verification. The Company is also designing credentials for uses that include online access, validation and marketing in the Metaverse and throughout the Web3 ecosystem. While the Company is ready to deploy to convenience stores and other age-restricted programs, current market and economic priorities have delayed the rollout of Ontario’s and other jurisdictions’ Digital Identity programs. Liquid Avatar is currently expanding its programs to other online and commercial opportunities that are eager to use verifiable credentials to ensure that their users are real, unique individuals with only a single account.

David Lucatch, CEO of Liquid Avatar Technologies Inc. and Managing Director of Aftermath Islands Metaverse Limited, commented, “Despite the various market challenges, we have launched and continue to accelerate verifiable credentials. Use of these credentials by Aftermath Islands and its programs continues to reinforce our focus, not only on building a Metaverse with wide appeal, but on promoting the positive sentiment that we are seeing from users who have embraced the Liquid Avatar Mobile App and the Meta Park Pass as participants in an ecosystem that ensures users are real and that they don’t have to worry about fake or multiple account holders, phishing attempts, bots and other nefarious actors. Our core efforts remain squarely on digital identity, avatars, digital goods and services and the Metaverse, and we believe that by providing seamless, no-download experiences, wide availability on a host of devices and the integration of Web2 with Web3, we will continue to see revenues, engagement, and growth.”

The Company is also continuing to work on its LQID Metaverse Rewards payment card program and is awaiting final network and payment processor approvals. The Liquid Shopz program is currently part of the LQID Rewards program, and the Company is reviewing standalone vendor and affiliate programs for its Metaverse activities and user programs. The Company will provide further updates as they become available.

The Company would also like to express its appreciation to outgoing board member Mr. Jeff Mesina for his contribution to the Company’s growth and for the consideration and support he has provided over the last several years. Mr. Mesina has resigned to pursue other endeavours, and the Company wishes him every success.

The Company would also like to provide an update on its August 3, 2022, announcement regarding a commitment letter Aftermath Islands Metaverse had received from global investment group LDA Capital.  Due to the prolonged decrease in the prices and reduction in market capitalization that several sectors have been experiencing over the last several months, and in conjunction with the evolving regulatory conditions, Aftermath Islands Metaverse has chosen not to actively pursue this opportunity at this time. 

To review a demo of the Liquid Avatar Mobile App and the Meta Park Pass please visit https://www.youtube.com/watch?v=gQdt9IqA37o&ab_channel=LiquidAvatar

For more information on Liquid Avatar Technologies Inc., please sign up for our newsletter and email list at: https://hello.liquidavatar.com/liquid-avatar-updates

About Liquid Avatar Technologies Inc. – www.liquidavatartechnologies.com

Liquid Avatar Technologies Inc. focuses on the verification, management and monetization of Self Sovereign Identity, empowering users to control and benefit from the use of their online identity.

The Liquid Avatar Mobile App, available in the Apple App Store and Google Play, is a verified Self Sovereign Identity platform that empowers users to create high quality digital icons representing their online personas. These icons will allow users to manage and control their digital identity and Verifiable Access and Identity Credentials, and to use Liquid Avatars to share public and permission based private data when they want and with whom they want.

The Liquid Avatar Verifiable Credentials Ecosystem (LAVCE) has been developed to support all participants in a digital credential ecosystem, including the Holder, Issuer and Verifier, using state-of-the-art blockchain and open standards technologies initially as a node on the Indicio Network. The Company is a voting and steering committee member of the Trust over IP Foundation, founding and steering committee member of Cardea, a Linux Foundation Public Health project, member of the Good Health Pass collaborative, DIACC and the Covid Credentials Initiative (“CCI”).

The Meta Park Pass is a W3C verifiable credential designed for interoperability. The Meta Park Pass contains a verified phone number with country code and an AI-reviewed, self-attested age, and can be expanded to include other unique credentials. Users will be able to log in to multiple supported Metaverses with little friction, and the platform knows they are a real and unique user. Additional credentials can grant access as each Metaverse matures and provides additional services and experiences.

The Company has a suite of early-stage programs that support the Liquid Avatar Mobile App program, including Liquid Shopz, a cash back and reward program that has over 600 leading online merchants, which is in the pre-launch phase, and is working to release its own branded network payment card in the United States, the LQID Card, with the world’s first Metaverse Rewards program which remains in development.

The Company’s subsidiary, Oasis Digital Studios, is a creative and development agency that is focused on providing digital goods and services expertise to its clients.

Liquid Avatar Technologies Inc. is publicly listed on the Canadian Securities Exchange (CSE) under the symbol “LQID” (CSE:LQID).

The Company also trades in the United States, on the OTCQB under the symbol “LQAVF” and in Frankfurt under the symbol “4T51”.

If you have not already joined our mailing list and would like to receive updates on Liquid Avatar Technologies Inc., please visit https://hello.liquidavatar.com/liquid-avatar-updates.

For more information, please visit www.liquidavatartechnologies.com

About Aftermath Islands Metaverse Limited – www.aftermathislands.com

Aftermath Islands Metaverse Limited is a Barbados corporation, which is 50% owned and is controlled by Oasis Digital Studios Limited, a wholly owned subsidiary of Liquid Avatar Technologies Inc (CSE:LQID / OTC:LQAVF). 

Aftermath Islands Metaverse is an open-world, realistic graphic virtual world where users can buy, develop, trade, and sell Virtual Land (VL), property and items, like buildings, crafted items, transport, and other items, all through in-game collectible NFTs, non-fungible tokens that represent the ownership of virtual and other items. Built on cutting-edge blockchain technologies and using GPU cloud servers, pixel streaming, and the high-fidelity graphics of Unreal Engine 5, Aftermath Islands delivers rich, no-download, browser-based experiences for users on desktops, mobile devices and tablets.

From play-to-earn games, including the Lost Kingdom of T’Sara, to online experiences, collaboration, immersive entertainment, and more, Aftermath Islands brings live streaming, high-definition graphics, exemplary interactivity, real-world mechanics, and countless new services and experiences to players all around the world.  The platform is built on the philosophy of decentralization and economic inclusivity and promises to provide captivating experiences that allow people around the world to earn their way into virtual land ownership.

For more information about Aftermath Islands, please visit www.aftermathislands.com

Contact:

ir@liquidavatar.com

David Lucatch

Chief Executive Officer

Forward-Looking Information and Statements

This press release contains certain “forward-looking information” within the meaning of applicable Canadian securities legislation and may also contain statements that may constitute “forward-looking statements” within the meaning of the safe harbor provisions of the United States Private Securities Litigation Reform Act of 1995. Such forward-looking information and forward-looking statements are not representative of historical facts or information or current condition, but instead represent only the Company’s beliefs regarding future events, plans or objectives, many of which, by their nature, are inherently uncertain and outside of the Company’s control. Generally, such forward-looking information or forward-looking statements can be identified by the use of forward-looking terminology such as “plans”, “expects” or “does not expect”, “is expected”, “budget”, “scheduled”, “estimates”, “forecasts”, “intends”, “anticipates” or “does not anticipate”, or “believes”, or variations of such words and phrases or may contain statements that certain actions, events or results “may”, “could”, “would”, “might” or “will be taken”, “will continue”, “will occur” or “will be achieved”.

The forward-looking information and forward-looking statements contained herein include, but are not limited to, statements regarding the adoption of the metaverse, the Lost Kingdom of T’Sara and the extent of future reach of services across countries, statements regarding the timing of or the success of the launch of a full Metaverse platform, statements with respect to adding AI to the Aftermath Islands Metaverse, and statements as to whether the Proof of Humanity will contribute any economic benefit for the Company. The assumption made by the Company in making these statements is that the Metaverse will be a viable and growing opportunity that the Company can capitalize on through the deployment of its products to drive an economic benefit.

By identifying such information and statements in this manner, the Company is alerting the reader that such information and statements are subject to known and unknown risks, uncertainties and other factors that may cause the actual results, level of activity, performance, or achievements of the Company to be materially different from those expressed or implied by such information and statements. In particular, if Liquid Avatar Technologies Inc. or Aftermath Islands fails to fund its operations or execute on its business plan, new credentials that are developed or the opportunities with the metaverse will not have any benefit for the Company.

Although the Company believes that the assumptions and factors used in preparing, and the expectations contained in, the forward-looking information and statements are reasonable, undue reliance should not be placed on such information and statements, and no assurance or guarantee can be given that such forward-looking information and statements will prove to be accurate, as actual results and future events could differ materially from those anticipated in such information and statements. The forward-looking information and forward-looking statements contained in this press release are made as of the date of this press release, and the Company does not undertake to update any forward-looking information and/or forward-looking statements that are contained or referenced herein, except in accordance with applicable securities laws.


Forgerock Blog

Our Orchestration Journey

A 13-year quest to save our customers money and improve their security.

"Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years." This quote, often attributed to Bill Gates, came to mind as I looked back on the last 13 years at ForgeRock. It continues to amaze me how our customer focus first led us to redefine the IAM space, and then to introduce many innovations that have since become universally acknowledged as critical capabilities.

For ForgeRock's birthday, I wanted to take a trip down memory lane with a focus on a specific area that is near and dear to my heart: our orchestration journey (or is it our Journeys journey?).

It all started when some of our earliest customers wanted to provide different login experiences for different types of users. We took inspiration from the Linux Pluggable Authentication Modules (PAMs) and introduced the ability to add Java code to the login flow. We called this capability Chains, and our customers loved it because it allowed them to change login behavior as they desired. However, it required a lot of custom Java coding.

To reduce the customer development time and make orchestration more flexible, we took it to a whole new level with a drag-and-drop designer UI and the ability to branch out in multiple directions based on different conditions. Our engineers called this orchestration engine Trees (branching…Trees, get it?). Even though we named the product Intelligent Authentication and built in many capabilities around context and device data collection, dynamic decisioning, and continuous authentication, many customers still call it Trees. We also started shipping a number of pre-built actions called Nodes that make it easy for customers to build orchestration journeys for different needs without coding.
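
To make the branching idea concrete, here is a minimal, hypothetical sketch of how a login journey could be modelled as a graph of nodes whose outcomes determine the next step. The type and node names are illustrative only and are not ForgeRock's actual configuration format.

// Hypothetical model of a branching authentication journey ("tree").
// Each node performs a step and maps its outcomes to the next node.
type Outcome = string;

interface JourneyNode {
  id: string;
  type: "UsernamePassword" | "DeviceProfile" | "RiskDecision" | "Otp" | "AllowAccess" | "DenyAccess";
  next: Record<Outcome, string>; // outcome -> next node id
}

const loginJourney: JourneyNode[] = [
  { id: "collectCredentials", type: "UsernamePassword", next: { success: "deviceCheck", failure: "deny" } },
  { id: "deviceCheck", type: "DeviceProfile", next: { known: "allow", unknown: "riskCheck" } },
  { id: "riskCheck", type: "RiskDecision", next: { low: "allow", high: "otp" } },
  { id: "otp", type: "Otp", next: { success: "allow", failure: "deny" } },
  { id: "allow", type: "AllowAccess", next: {} },
  { id: "deny", type: "DenyAccess", next: {} },
];

An orchestration engine walks a structure like this, executing each node and following the edge that matches its outcome, which is roughly what a drag-and-drop journey designer produces behind the scenes.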

But ForgeRock customers didn't stop there. They decided to use our orchestration engine to not only integrate with their home-grown systems but also with third-party vendors for multi-factor authentication (MFA), user and entity behavior analytics (UEBA), and identity verification, just to name a few. When we found out that many customers were repeatedly building the same set of integrations, we decided to save them some time and money by delivering integration capabilities to everyone. That's how our Trust Network started more than five years ago. Back then, the notion of having a Marketplace was still new in the IAM space. We did it anyway and today we have the largest ecosystem of integration partners with 200+ integrations available from the ForgeRock Marketplace. This is the reason a leading bank threw away their homegrown orchestration and decided to start leveraging ForgeRock Orchestration.

When the IAM market evolved and customers started asking for a single platform to provide full identity fabric capabilities along with governance, we delivered ForgeRock Intelligent Access. Intelligent Access extended our orchestration engine to simplify user self-service capabilities, such as registration, forgotten password, password reset, and so on. [Have you ever created an account on a site, and it immediately turns around and asks you to enter the just-created login and password again? That's because the site's identity management system that created the account and the access management system that secures the access are not talking to each other properly. I don't like that experience at all. With an integrated platform like ForgeRock, your users don't have to experience that friction.]

A couple of years after we launched our cloud service, our customers asked us to add a UEBA engine to it. Since we accumulate many signals as part of the cloud service during authentication, the addition of UEBA made a lot of sense, so we launched our Autonomous Access service. Even there we made it easy to consume the capabilities delivered by a powerful AI engine with easy-to-use nodes that can be integrated into any orchestration journey with a simple drag-and-drop UI.

Over the years, we have also added many orchestration capabilities based on customer feedback — along with our own security and product best practices — resulting in multiple patents in this area. What I love most about these investments is the sheer number of nodes we now ship with our service, the volume of pre-built journeys that incorporate security and usability best practices, and our improvements to the architecture that have helped many customers save countless hours of development and integration time. We hear from new customers again and again that they chose ForgeRock because of our strong orchestration capabilities. 

But we are not stopping there. Our vision is an orchestration engine that drives all the users' IAM journeys, whether for authentication or for access request approvals and beyond. We're building a future where every action within an IAM platform, from configuration management to application onboarding, will be driven by orchestration. By sticking to our founding principles, built on a persistent focus on our customers' needs, we'll continue to innovate on their behalf. 

Want to learn how our years of investment and expertise can help you save money and improve security?  Read about the latest ForgeRock Orchestration capabilities here.


Ontology

Ontology Monthly Report — January 2023


The Ontology community has reached 10,000 followers on CoinMarketCap. Let’s move forward to the next 10k!

Developments/Corporate Updates

Development Progress

We are 100% done with the Rollup VM design. The White Paper will be published soon.
We are 98% done with the Rollup L1<->L2 cross-layer communication.
We are 98% done with the Rollup L1<->L2 Token Bridge.
We are 99% done with the L1 data synchronization server.
We are 99% done with the L2 Rollup Node.
We are 92% done with the L2 blockchain browser.
We are 15% done with the EVM bloom bit index optimization.
We are 15% done with the high ledger memory usage optimization.

Product Development

ONTO App v4.4.6 integrated KCC and Canto Chains, added asset swap on Cronos and Aurora Chains, and enabled Vision Chain assets in Red Packet.
ONTO has published the December monthly report, summarizing a series of functional optimizations, including integration of the Ripple Ledger, Litecoin, Celo and ENULS Chains, and the addition of asset swap on Harmony and Boba Network.
ONTO has integrated the ENULS testnet; users can interact with ENULS ecosystem dApps directly from the ONTO wallet and share a portion of the ENULS ecosystem airdrops.
ONTO partnered with Baby Wealthy Club and listed Baby Rich Coin; global users can now use ONTO Wallet to manage their $BRC assets.
ONTO partnered with IoTeX to host a Telegram quiz campaign with $100 USDC in rewards. Follow the @ONTO Wallet Official Announcement on Telegram for more details.

On-Chain Activity

160 total dApps on MainNet as of January 30th, 2023.
7,293,677 total dApp-related transactions on MainNet, an increase of 50,851 from last month.
18,263,822 total transactions on MainNet, an increase of 98,749 from last month.

Community Growth & Bounties

This month, several Ontology Community Calls and discussions were held on Discord and Telegram, focusing on topics such as “Blockchain and Decentralization”, “The Review of 2022”, “The Dimensions of Reputation” and “CEX vs DEX”. Community members shared their views actively, and participants also got the chance to win Loyal Member NFTs.
We held our Monthly Quiz led by Ontology Harbinger Benny; community members actively raised questions and shared 100 ONG in rewards.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates. To join Ontology’s Telegram group and keep up to date, click here.

Recruitment

At Ontology, we are always looking to expand our team. We currently have a list of open roles and are looking to hire ambitious and hardworking individuals (see below). Check out our website for full details.

Technical Director
Senior Front-end Engineer
Golang Developer

Out & About — Event Spotlight

It was all hands on deck this month with a string of news reports and developments:

Ontology EVM was supported by Nabox, the multi-chain DID gateway to Web3. Global users can now securely manage Ontology EVM assets, dApps and NFTs with Nabox Wallet.

Ontology has published the latest Tech Viewpoint “Let’s make an NFT”, starting with the two roles of painter and architect to help you join the metaverse!

As part of our OWN Insights series, Ontology published the latest OWN101 with the topic of GAS. On the Ontology chain, ONG is used to pay for transacting on chain.

Ontology published the “Meet the Team” series and interviewed Ontology’s Head of Community Humpty Caldero. He has shared his thoughts about Web3 in the next five years: “What we’re going to see is more people owning their identity and data whilst creating monetization opportunities. Both privacy and self-sovereignty are important to people”.

日本語 한국어 Española Français Slovenčina Tiếng Việt Hindi русский Tagalog සිංහල Türk Italiano বাংলা

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Monthly Report — January 2023 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyDEX

Are Insights Always Good?

Image generated by AI using openai.com/dall-e-2

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper.

There’s a famous psychology experiment where subjects are asked to watch a video of people playing basketball and count the number of times they pass the ball. You can see the video here. If you haven’t already done it, try it. It’s fun. It only takes a minute.

The astonishing thing is that a very high proportion of those counting the passes simply don’t notice a gorilla that walks through the game and stands there, right in front of them. Plain as plain can be. They just don’t see it. When asked, many of them even deny it: they are so intent on looking for something else they can’t believe that a gorilla was standing there, right in front of their eyes. This is the phenomenon of selective attention, and it’s extremely powerful.

In this blog series, we’ve explored many aspects of the economics of personal data, showing how and why personal data stores offer huge opportunities for positive economic transformation. But if this opportunity is so big, why aren’t organisations and governments clamouring for it?

A narrative of our times

Selective attention is a key reason. One particular narrative has come to dominate the conversation to such a degree that many people simply assume it accounts for the whole picture; counting passes and not seeing the gorilla in front of them. This is the narrative of Big Data and it goes something like this.

“Data is needed to create the insights that drive improved and innovative services. The more data there is to analyse, the better the insights will be, the greater the benefits. Therefore, everything should be done to help organisations amass as much data as they can.”

There is a small element of truth to this narrative (every effective lie includes a small element of truth to make it seem credible). Like the telescope and microscope before it, data is helping us see the world in ways we never could before. Tackled the right way, this is indeed an opportunity. But most Big Data today falls far short of this. It’s either flawed, unnecessary, irrelevant or downright dangerous — a bandwagon fueled by hype.

Big Data, we are being told, is a ‘game-changing’ development, revolutionising industries as diverse as marketing and health care and helping to ‘solve humanitarian issues around poverty, health, human rights, education and the environment’. It’s now ’a key factor in how nations, not just companies, compete and prosper’, a ‘foundation for disruptive business models’, ‘transforming processes and altering corporate ecosystems’. It can even predict the future (apparently).

Little wonder it dominates Government policies such as the UK Government’s National Data Strategy and the EU’s Horizon Europe research programme. But what is the reality behind this hype?

Are Big Data insights really transformational?

The Big Data bandwagon revolves around two words: ‘insights’ and ‘analytics’, which are widely referred to with close to mystical awe, as if they had magic powers.

Genuine insights, used for the right purposes, are wonderful. But Big Data has never generated an insight and never will. Why? Because only humans have insights. This is flaw Number One.

Big Data analytics are done by a computer crunching lots of numbers to surface correlations and patterns. But, in themselves, these correlations tell us nothing, because statistical correlation can be meaningless and is not the same as causation. While Big Data analytics can alert us to something we hadn’t seen before, only humans can understand the why: what it all means.

Confusing the two is one of the reasons why there is currently a crisis in medical research. Around 50% of medical research findings cannot be repeated if the experiment is conducted again. Why? Among the many reasons are that doctors (like most data scientists) are not trained in statistics and frequently confuse correlation with causation.

The trouble with Big Data is that the more data you crunch, the more noise you get: the more meaningless correlations pop up. Without clear ways of distinguishing between noise and signal, chances are you’ll end up on multiple wild goose chases.
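
A toy simulation makes the point: generate purely random variables, test every pair, and 'significant'-looking correlations appear anyway. This is an illustrative sketch in TypeScript, not an analysis of any real dataset.

// Toy illustration: spurious correlations appear in pure noise.
// With 50 random variables and ~1,225 pairwise tests, a |r| > 0.3
// threshold on 50 samples will be crossed by chance alone.

function pearson(x: number[], y: number[]): number {
  const n = x.length;
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(x), my = mean(y);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

const samples = 50, variables = 50;
const data = Array.from({ length: variables }, () =>
  Array.from({ length: samples }, () => Math.random()));

let spurious = 0;
for (let i = 0; i < variables; i++) {
  for (let j = i + 1; j < variables; j++) {
    if (Math.abs(pearson(data[i], data[j])) > 0.3) spurious++;
  }
}
console.log(`"Significant" correlations found in pure noise: ${spurious}`);

The count varies from run to run, but it is rarely zero, and the more variables you add, the more such 'findings' accumulate.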

This assumes that the data used by Big Data is reliable in the first place. But often it isn’t. One of the best kept secrets about most organisations’ large databases is that the quality of their data is often very poor: incomplete, out of date or simply wrong. And most data sets, no matter how large, are biased in some way. No matter how big they might be, they are still a sample.

Building genuinely representative samples isn’t something that happens easily or automatically. In fact, one of the biggest barriers to Big Data research is the cost of accessing reliable data. According to the National Audit Office 60–80% of researchers’ time is spent cleaning and merging data. In a world where each individual was able to aggregate information about themselves into their own personal data store, these costs could be slashed — assuming, that is, that the researchers can make a compelling case for the value and ethics of their research.

But as it is, much of the data used for Big Data analytics is of questionable quality. GIGO is an ancient IT catchphrase. It stands for Garbage In, Garbage Out. With Big Data the risks of Big Garbage are high, such as Big Data-generated algorithms that build racist and sexist assumptions into how they work.

Can the insight be applied?

Even when a good signal is identified, it doesn’t mean it’s automatically useful. To be useful, it has to be applied. But Big Data doesn’t help here, because it only deals with statistical data — e.g. probabilities — not specifics.

If you toss a fair coin many times, you know that around 50% of the tosses will come up heads and 50% tails. But complete, certain knowledge of the probabilities doesn’t translate into complete knowledge of the real world — of what is going to happen next. What’s it going to be? Heads or tails? For the real world, we need ways to deal with different, specific outcomes, not just generalised observations.

This can wreak havoc when it comes to applying Big Data insights in the real world. Say for example that Big Data analysis tells us that if you do X the probability of Y happening is doubled. Great! But if the probability of Y happening has doubled from 2% to 4%, it still means that 96% of the time it won’t happen. How useful is that when it comes to applying the ‘insight’ in the real world?

OK. What about a 99.9% probability rather than just 4%? You might think you can rely on a 99.9% probability in a real world application. The German Ministry of the Interior thought so when it installed facial recognition programmes outside big railway stations to identify known terrorists.

The 99.9% figure meant that 0.1% of observations generated false positives: observations that declared an individual to be a terrorist when they weren’t. With 12 million people passing through big railway stations every day, the German police were presented with 12,000 people a day wrongly identified by the system as terrorists. Acting on this information would have resulted in a large-scale invasion of civil liberties (hundreds of thousands of people being treated as if they were terrorists when they were not), while wasting all available policing resources hounding them. The programme was quickly (and quietly) dropped.
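
The arithmetic behind this is worth spelling out, because it applies to any screening programme. The sketch below assumes, purely for illustration, that a handful of genuine targets pass through the stations each day; the exact number doesn't change the conclusion.

// Base-rate arithmetic for a 99.9%-accurate screening system.
const dailyPassengers = 12_000_000;
const falsePositiveRate = 0.001;      // 0.1% of innocent people flagged
const genuineTargets = 5;             // illustrative assumption
const truePositiveRate = 0.999;       // assume the system catches almost all of them

const falseAlarms = dailyPassengers * falsePositiveRate;      // 12,000 per day
const trueAlarms = genuineTargets * truePositiveRate;         // ~5 per day
const precision = trueAlarms / (trueAlarms + falseAlarms);    // share of alerts that are real

console.log(`False alarms per day: ${Math.round(falseAlarms)}`);
console.log(`Chance a flagged person is actually a target: ${(precision * 100).toFixed(3)}%`);
// Roughly 0.04%: more than 99.9% of alerts point at innocent people.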

Behind this lies another flawed assumption that keeps Big Data hype afloat: confusion between statistical data about populations — data that deals in probabilities — and the actual data about real, individual people that is needed to actually do something useful. The one does NOT automatically translate into the other.

Once Big Data is applied to an identifiable individual, it stops being Big Data and becomes deeply personal instead. It requires access to and use of personal data, and a matching, testing and application process to see if the generalised statistical ‘insight’ is relevant to this particular case. For that, you need to access exactly the right data, at the right time, for the use case in hand: you need the personal data logistics capabilities provided by personal data stores.

Without such a translation process from statistical probabilities to personal circumstances, Big Data insights can only be applied in a blanket manner — in a way that is often irrelevant, if not harmful.

Do you need Big Data to get big insights?

All this assumes that Big Data is the only or main way to generate useful insights. But this isn’t so. Most of the insights used to inform decisions and resulting actions today don’t come from Big Data. Other simpler, easier, cheaper processes such as traditional research or even mundane activities such as filling in forms provide the surprises — the new information needed to keep decisions in line with a changing world — that most people and organisations need to get stuff done.

So Big Data isn’t the be all and end all of ‘insight’ that it’s made out to be. Far from it. Viewed across the economy as a whole its contribution is actually marginal.

Beware hidden agendas

When you listen to Big Data hype however, it’s as though Big Data is a magic bullet that is going to save the world: it comes with a touching but naive faith that, by definition, if something is an ‘insight’ it must be ‘good’ — it will be used to help people rather than harm them or take advantage of them.

Unfortunately this is not true. Much of the hype about ‘insights’ derives from the activities of Silicon Valley advertising giants like Google and Facebook. But their insights weren’t generated to help people live better, richer lives. They were generated for the purposes of manipulation and control, to get people to do what advertisers wanted them to do. Cambridge Analytica was a business driven by Big Data insights. Its agenda was profit via manipulation and control — for which it needed as many ‘insights’ as it could get.

The sad reality behind Big Data hype is that it is widely used as camouflage to extend surveillance capitalism into new areas, to justify a privacy-invading corporate data landgrab; an excuse for even further concentrations of data power and rewards in the hands of a tiny number of organisations (along with increasing pressure to bypass citizens’ data protection rights).

Are these corporations suddenly really that interested in data analytics for ‘data for good’? Or is this sudden interest a cover for a much more cynical hidden agenda?

Squandered resources, missed opportunities

The biggest damage wreaked by Big Data hype lies in a different direction, however. In a classic case of selective attention, countless politicians and policy-makers around the world have drunk the Big Data Kool Aid to sanction and promote vast Big Data programmes that promise the world and cost almost as much — billions of £/$/€.

In doing so, they fail to notice the gorilla in their midst: the immense opportunities for productivity improvements, service quality, outcome improvements and innovation that lie in the opposite direction of empowering citizens with their own data. The result is missed opportunities on a vast scale, coupled with an equally vast misallocation of available resources — a misallocation that deepens the imbalances of power and reward that already blight today’s data-driven economy.

Conclusion

Don’t get us wrong. We have nothing against Big Data per se. We are all for genuine analytics and insights, if done properly and ethically. But we are also all for seeing gorillas as well as counting passes — for seeing opportunities that are NOT reliant on big data, such as empowering individuals with their own data.

It’s not one versus the other. To really work well, Big Data needs to advance hand in hand with the personal data logistics capabilities offered by personal data stores. Focusing only on Big Data while ignoring personal data logistics — and associated citizen empowerment — is like trying to run with just one leg: extremely hard work that doesn’t get you very far.

That, unfortunately, is what we’ve got right now. As long as policy-makers look determinedly in just one direction — only counting the passes — they won’t see the gorilla standing there, right in front of them. It’s time for them to look up.

Other blogs in this series are:

The Great Data Delusion: 19th century doctors thought bloodletting would cure most diseases. Today’s prevailing theories of personal data are little better.
Why is personal data so valuable? Because of two fundamentals: reliability and surprise.
Is it a bird? Is it a plane? No! It’s Super…! With personal data, what sort of a problem are we dealing with?
The vital issue people don’t want to talk about: Productivity is key to prosperity. But nobody wants to talk about it. Why?
When organisations become waste factories: The single design flaw at the heart of our economic system, and what happens if we can fix it.
Why are joined-up services so difficult to deliver? Because the organisation-centric database is designed NOT to share data.
People: the dark matter of the economy. Individuals and households are all but invisible to economics. They shouldn’t be.
An engine of economic possibilities: How personal data stores open up new vistas of innovation and growth.
What has data got to do with net zero? A lot more than you might think.
Google and Facebook: Steam Engines of the Information Age. They hardly touch the fundamental economics of personal data.

Are Insights Always Good? was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 31. January 2023

FindBiometrics

Jumio Brings Talent Boost to Sales Team

Jumio is looking to supercharge its sales efforts, announcing two senior appointments to that wing as it seeks to further boost the momentum gained during a very successful 2022. The […]

More Money On the Table for Expert Spoofers: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: FaceTec Triples Spoof Bounty Reward FaceTec has tripled […]

Holochain

Holochain 0.1.0 Is Here

Dev Pulse 132

Last week was a happy week for us at Holochain. It has been a long time coming — years of almost-but-not-quite-ready, of helping the bravest devs succeed with immature tech, of telling more risk-averse projects to wait just a little longer. And now it’s here — Holochain is finally in beta.

The general mood among colleagues and friends has varied from “quiet satisfaction”, in one person’s words, to open celebration. This is a big milestone for us and you, folks!

Of course, there is still lots of work to be done — the core devs are already working hard on the 0.2 beta series, which will bring the ‘immune system’ features that are meant to automate communities’ ability to protect their networks and data from attacks. And of course there are still planned features and the regular boring engineering work of performance tuning and bug discovery. And most of the developer tools haven’t been updated yet, although we’ll see that in the next week or two.

But this is the first point at which we can say, with confidence, that Holochain is ready for most projects (not just bleeding-edge ones) to start building serious apps on. The APIs will not get any breaking changes for the entirety of the 0.1.x release series, and we’ll keep our bootstrap and proxy servers running for six months.

In the coming weeks we’ll be publishing a series of blog posts, videos, and updates to the Developer Portal to help you get started building your first hApp.

Does this mean that Holochain is now ready for everyone? Not quite. You will be able to build your hApp without fear of refactoring, you’ll be able to get it into your users’ hands, and they’ll be able to use it long-term without fearing data loss on every upgrade. But it’s important to note that, before the immune system and security audits are completed, your hApp should be considered beta-quality and shouldn’t be used in hostile environments.

Holochain 0.1.0 release notes

Release date: 26 January 2023
HDI compatibility: 0.2.0-beta-rc.1 to 0.2.0
HDK compatibility: 0.1.0-beta-rc.1 to 0.1.0
Breaking changes: app API

This is largely a ceremonial release — you’ll note that we’ve dropped the beta and rc.x from the end of all of our packages. There are only two changes:

Removed (HDI): The unused validation_package callback is no longer supported, and all types related to it have been removed. Given that no hApps in the wild used this callback (to my knowledge), you shouldn’t need to update your integrity zome code. (#1739)
Breaking (app API): FetchQueue has been renamed to FetchPool. This shows up in one of the fields of the NetworkInfo API endpoint’s return value. If you don’t know whether you need to know about this, you probably don’t — NetworkInfo is generally only used by conductor orchestrators like the Launcher. (#1793)

Dev tools release notes

JavaScript client 0.11.16: Internal refactor

Release date: 25 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.4

This release refactors the way the agent’s public key is retrieved; now it gets it from AppInfo.

JavaScript client 0.12.0: Version bump

Release date: 25 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.4

This release simply marks a shift to Holochain Beta. Note: A breaking change to the NetworkInfo app API endpoint, introduced in Holochain 0.1.0, is not yet supported, but will be supported in the next release. (#172) Most hApps don't use this, though, so it should be safe to use this release with Holochain 0.1.0.

Tryorama 0.11.0: Version bump

Release date: 27 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.4

This release bumps the included JavaScript client to 0.12.0.

Rust client 0.3.0: Initial support for Holochain 0.1.x

Release date: 23 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.3

This release brings the supported Holochain version up to 0.1.0-beta-rc.3, which means the following breaking changes:

All zome calls must be signed.
install_app_bundle is renamed to install_app.
archive_clone_cell is renamed to disable_clone_cell.
restore_archived_clone_cell is renamed to enable_clone_cell.
enable_clone_cell is moved to the app API.
delete_clone_cell can only delete a single disabled clone cell.
app_info returns all cells and DNA modifiers.
request_agent_info is renamed to agent_info.

There are two non-breaking changes:

The admin API call get_dna_definition has been added.
There’s now a utility crate for authorising credentials and signing zome calls.

Read the changelog.

Scaffolding 0.1.3: Support for Holochain 0.1.0

Release date: 31 Jan 2023
Holochain compatibility: 0.1.0

This is the first time I’ve included release notes for the scaffolding tool. We’ve shared about it already, and I will be sharing more about it soon, but for now, just know that it lets both newcomers and experienced hApp devs generate a lot of boilerplate DNA and UI code in a short time. Check out the release log if you’ve already been using it.

Second security audit completed

Last week Least Authority published the results of their second security audit of Holochain. The first, which analyzed the Lair keystore, identified only one issue and three further suggestions, all of which were resolved.

This second one analyzed the Holochain Deterministic Integrity (HDI) crate, which is the hApp developer’s interface between their integrity zomes and Holochain itself. The HDI allows the dev to write validation callbacks that check data for correctness. These callbacks, combined with the plumbing of Holochain’s DHT implementation, are what create the ‘immune system’ that keeps honest participants safe.

Because the HDI is so heavily dependent on the foundation of Holochain’s inner workings, they found that their analysis had to go further than just this one library. This gave them an opportunity to assess the correctness of Holochain’s model for arriving at consistency and take a look at how the model is implemented.

The results of this audit were similar to the first, with one issue related to insufficient formal documentation, which they identified as a potential risk for correctly implementing Holochain’s consistency model. We’ve resolved that by writing a formal specification and rewriting the Holochain whitepaper, which we plan to publish soon.

Reminder: Holochain Dev Training in March

Now that Holochain has its first beta and the dev tools are almost ready, Holochain is pretty much ready for prototyping your hApp and getting it into the hands of beta testers. If you want to skill up yourself or your dev team quickly and effectively, the online Dev Training course in March will get you there. Read the details and apply on our website.


Indicio

Five Verifiable Data Predictions for 2023

Digital transformation across all aspects of business makes verifiable data the currency for seamless and secure processes in 2023

By Trevor Butterworth and Heather Dahl

Across every sector, digital transformation is the agenda. It means doing more and doing better, often with the economic constraint of having less. It means reorienting to changing employee demographics and labor shortages while meeting customer expectations for seamless experiences and greater efficiencies — often managed from a mobile device. It means adopting new platforms for product design, simulation, and lifecycle management, integrating data from sensor-equipped devices, and in-sourcing supplies.

It means adapting to continuous brand or product engagement by creating and integrating new data streams while complying with data privacy law. And it means seeing digital relationships in terms of authenticity: they aren’t just about modified data flows; they are about creating new kinds of genuine, peer-to-peer interaction based on consent.

Successful digital transformation will deliver better products, experiences, and services, for both B2B and B2C companies, while reducing costs, friction, and risk.

Trusted data is the currency of digital transformation
Trusted data is data that is verifiable. The quicker it can be verified, the more seamless the process that depends on the authenticity and integrity of the data. That may sound simple and obvious, but the inability to trust the authenticity and integrity of data has throttled digital interaction with punishing costs in fraud and friction. There is no reliable verification layer for people, entities, or the data they wish to share; the internet was built without one.

Verifiable credential technology (often referred to as decentralized identity or self-sovereign identity) changes all this. It means data can be immediately acted upon while reducing the risk of fraud. In practical terms, there’s no need to check in with the source of the data or rely on a third party to store data in order to cross check and manage data verification; the data is issued in a credential from a trusted issuer and is not written to a database or a blockchain. It can be shared by consent and verified through metadata across direct, uniquely encrypted peer-to-peer communications channels. All this removes friction and adds consent, privacy, and security. The result is trusted data, seamlessly verified.
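
As a rough sketch of that flow, the snippet below models the issue / hold / present / verify pattern behind verifiable credentials. The function and type names are illustrative only; real deployments would use a standard such as W3C Verifiable Credentials, with libraries handling DIDs, signatures, and DIDComm messaging.

// Conceptual issue / hold / present / verify flow for a credential.
// Uses Node's built-in Ed25519 signing purely for illustration.
import { generateKeyPairSync, sign, verify } from "node:crypto";

interface Credential {
  issuer: string;                       // who attests to the claims
  subject: string;                      // who the claims are about
  claims: Record<string, string>;       // e.g. { kycLevel: "verified" }
  signature: string;                    // issuer's signature over the payload
}

// Issuer: a trusted party signs the claims once.
const issuerKeys = generateKeyPairSync("ed25519");

function issueCredential(subject: string, claims: Record<string, string>): Credential {
  const payload = Buffer.from(JSON.stringify({ issuer: "did:example:bank", subject, claims }));
  const signature = sign(null, payload, issuerKeys.privateKey).toString("base64");
  return { issuer: "did:example:bank", subject, claims, signature };
}

// Verifier: checks the issuer's signature, with no call back to the issuer
// and no central database of personal data.
function verifyCredential(credential: Credential): boolean {
  const { signature, ...rest } = credential;
  const payload = Buffer.from(JSON.stringify(rest));
  return verify(null, payload, issuerKeys.publicKey, Buffer.from(signature, "base64"));
}

// Holder: stores the credential in a wallet and presents it by consent.
const credential = issueCredential("did:example:alice", { kycLevel: "verified" });
console.log("Credential accepted:", verifyCredential(credential));

In a real system the verifier would resolve the issuer's public key from a decentralized identifier rather than holding it in memory, but the trust model is the same: verify the signature and metadata, act on the data, and never copy it into a central store.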

Trusted data not only accelerates seamless processes, it enables new kinds of digital processes and services by removing barriers to digital interaction and enabling interaction to scale.

So how do we think trusted data will be used in 2023 to drive digital transformation? Here are our predictions.

1. Digital wallets will take a backseat to “super” credentials
Perhaps because digital wallets are apps, and apps are what make mobile devices powerful, digital wallet apps have come to be seen as “really important” to implementing verifiable credential technology. But — just as a fancy wallet doesn’t do much in the real world if there’s nothing of value in it — a killer digital wallet is only as useful as what it actually stores, which is to say, digital credentials. 

Create interoperable “super” credentials for seamless KYC, payments, and travel, and a digital wallet becomes super useful. There isn’t a critical shortage of wallet apps; Hyperledger Aries has an open-source white label wallet — Bifold — that anyone can adapt, build on, and offer with their credentials. This is why we see the hype of a killer wallet being replaced in 2023 by the practicality of a “super” credential.

2. DeGov will gain commercial and public sector traction
One of our earliest insights into consumer needs was that governance needed to be simple to implement and yet capable of handling complex information flows, rapid rule change, jurisdictional hierarchies, and, crucially, offline environments. Governance authorities need to be confident that they are in charge of the governance decisions within their jurisdictions, and Decentralized Ecosystem Governance (DeGov) enables that accountability. DeGov began with machine-readable files that could be cached in the software for each party in an ecosystem, and it has now expanded to a specification, currently under discussion at the Decentralized Identity Foundation (DIF), with efforts to standardize and interoperate with other governance implementations at ToIP.

The net result is that we have a coherent, powerful, and easy-to-use way for establishing and managing trust in an ecosystem, one that results in a smooth user experience with the capacity for verification in offline situations. DeGov will catalyze commercial and public sector use cases of increasing complexity.

3. KYC credentials pave the way for payment disruption
Identity assurance is expensive to do, time consuming, and has to be done over and over again, causing frustration and friction for both the customer and the financial institution. The crypto and banking sectors both experienced a doubling of identity fraud in 2022 from 2021, and payment fraud increased by 40 percent. At the same time, financial losses from identity theft rose to almost $6.1 billion in 2022. A know-your-customer (KYC) credential (with biometric binding to a device) means both one-time identity assurance and proof-of-card ownership without the need for storing and sharing any personally identifiable information.

By simplifying KYC, verifiable credentials will save time, effort, friction, and, significantly, cost. This also means that the barrier to performing KYC will be lowered, encouraging identity assurance to be more widely performed.

The combination of reduced friction and increased trust will change ecommerce, invoicing, mortgage brokering, and payments — and DeGov will make this easy to coordinate and manage on a global scale.

4. Passwords will become a relic of the old times — like rotary phones and punch cards
No one loves passwords and usernames. According to a recent poll, 68% of consumers would be willing to use non-password login options on their mobile apps. There is no digital future where they turn into vinyl and the kids will want to use them because they’re cool. Our digital lives will be immeasurably better once they are gone. The way to replace a username and password is to create a trusted digital relationship with a verifiable credential. The authentication problem that passwords and user logins are meant to solve vanishes, because they are no longer needed as a substitute for a trusted digital relationship.

Digital-forward companies that use verifiable credentials for login will be able to build stronger relationships with their customers, develop more effective product lifecycle management, and collate product data and consumer feedback. These digital relationships will use DIDComm technology for direct, peer-to-peer secure communication between entities, and they will be data privacy compliant, as data can be shared by explicit consent and in privacy-preserving ways.

5. There will be new models for monetizing credentials
First, verifiable credentials will save you money, then they’ll make you money. The immediate value proposition for verifiable credential technology is in fixing inefficient and insecure verification processes, especially around payments. Then they can be deployed to tackle problems that are, presently, filed under “the cost of doing business,” such as chargeback fraud. But as trusted data scales, it also presents opportunities for both better digital relationships and products and services that deliver customer value (think seamless travel). If the cost of KYC is minimal, the cost and risk of entering a market is correspondingly reduced. Anything that provides a valued customer with more value and efficiency has the potential to become a new business model.

Much of the talk around “super apps” reflects the behavior and expectations of a maturing digital-first demographic that wants to be able to manage everything as simply as possible through their phone. But super apps aren’t going to work without trusted data that’s interoperable across ecosystems. They aren’t going to be super if they can’t deliver privacy, consent, and security.

Verifiable credential technology doesn’t require everyone moving to a new system or companies and organizations trashing their existing systems. It’s a technology layer that transforms what you already have. And because it is open source, there’s no vendor lock in. It’s a tool and a toolbox for innovation and interoperability. Those that have it will be able to accelerate digital transformation. Those that don’t will face equally rapid obsolescence.

To learn more about how you can deploy open-source verifiable credentials now, contact us.

Photo (modified) by Tech Daily on Unsplash

The post Five Verifiable Data Predictions for 2023 appeared first on Indicio.


YeshID

Day 2: Securing Your Login - Review and Update Authentication Settings


Today our focus is on authentication settings. If you are using Google Workspace as your IDP, reviewing your authentication settings to ensure they align with internal policies and are appropriate for any use of Google Cloud Platform is paramount.

We recommend two important settings:

1. Turn on two-factor authentication.

Why? Two-factor authentication (2FA) adds an extra layer of security to your online accounts by requiring a second form of verification in addition to your password. This can be in the form of a code sent to your phone, a biometric scan, or a physical token. By using 2FA, you make it much harder for someone to gain unauthorized access to your accounts, even if they have your password. This is because a hacker would also need to have access to your second form of authentication in order to log in. Additionally, 2FA helps in cases where the password is compromised, and it's a best practice for organizations to secure their online access.
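
For the code-based variant of the second factor, a time-based one-time password (TOTP, RFC 6238) is the most common choice. The sketch below uses the otplib npm package, which is an assumption about tooling; any RFC 6238 implementation behaves the same way.

// Minimal TOTP enrolment and verification sketch (RFC 6238), using otplib.
import { authenticator } from "otplib";

// Enrolment: generate a shared secret and show it to the user,
// typically as an otpauth:// URI rendered as a QR code.
const secret = authenticator.generateSecret();
const otpauthUri = authenticator.keyuri("alice@example.com", "ExampleApp", secret);
console.log("Scan this in your authenticator app:", otpauthUri);

// Login: the user submits the 6-digit code from their device.
function verifySecondFactor(submittedCode: string, storedSecret: string): boolean {
  // check() tolerates small clock drift between server and device.
  return authenticator.check(submittedCode, storedSecret);
}

console.log("Code valid:", verifySecondFactor(authenticator.generate(secret), secret));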

2. Enforce strong passwords

Why? Weak passwords can be easily guessed or cracked by hackers using automated tools, making it easy for them to gain access to your accounts. A strong password typically contains a combination of uppercase and lowercase letters, numbers, and special characters, and is at least 12 characters long. The longer and more complex the password, the harder it is to crack. Using a unique password for each account is also important because if a hacker gains access to one password, they will not be able to use it to access your other accounts.
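
As a simple illustration of enforcing that baseline in your own applications (Google Workspace enforces its policy from the Admin console, so this is only a generic sketch), a length-and-variety check might look like this:

// Generic strong-password check: 12+ characters with upper, lower,
// digit, and special characters. Illustrative only; prefer your
// identity provider's built-in policy where one exists.
function isStrongPassword(password: string): boolean {
  const longEnough = password.length >= 12;
  const hasLower = /[a-z]/.test(password);
  const hasUpper = /[A-Z]/.test(password);
  const hasDigit = /[0-9]/.test(password);
  const hasSpecial = /[^A-Za-z0-9]/.test(password);
  return longEnough && hasLower && hasUpper && hasDigit && hasSpecial;
}

console.log(isStrongPassword("correct-Horse7battery"));  // true
console.log(isStrongPassword("password123"));            // false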

If your org is behind a third-party IDP, super admins will still use username/password and Google’s 2FA, so you will still want to ensure that settings are set properly.

To review these key settings, follow these steps:

1. Sign in to the Google Admin Console.
2. Go to Security > Authentication.
3. Review Password Management to ensure that password policies are properly set. Consider requiring strong passwords to automatically set a secure default.
4. Go to 2-Step Verification and check enforcement settings. If 2FA is not enforced, kick off a project to require it. If it’s on and all methods are allowed, look into whether it is feasible to disable verification codes over text and phone calls, as those are more susceptible to attack.
5. If your org uses GCP, check Google Cloud session controls to ensure that reauthentication is required. This way users cannot maintain a persistent connection to GCP resources without reauthentication.

If you would rather check 2-Step Verification enrollment programmatically, a sketch follows below.
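
The sketch below uses the Google Admin SDK Directory API via the googleapis Node client to list users and flag anyone not enrolled in 2-Step Verification. Treat the authentication setup and field names as assumptions to confirm against the current API documentation.

// Report Google Workspace users who have not enrolled in 2-Step Verification.
// Assumes a service account with domain-wide delegation and the
// admin.directory.user.readonly scope; adjust to your environment.
import { google } from "googleapis";

async function listUsersWithout2sv(adminEmail: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/admin.directory.user.readonly"],
    clientOptions: { subject: adminEmail },   // impersonate a super admin
  });

  const directory = google.admin({ version: "directory_v1", auth });
  // First page only; paginate with res.data.nextPageToken for large domains.
  const res = await directory.users.list({ customer: "my_customer", maxResults: 500 });

  for (const user of res.data.users ?? []) {
    if (!user.isEnrolledIn2Sv) {
      console.log(`${user.primaryEmail} has not enrolled in 2-Step Verification`);
    }
  }
}

listUsersWithout2sv("admin@example.com").catch(console.error);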

auth0

Using TikTok as a Custom Social Connection in Auth0

Auth0 allows you to create custom social connections. In this post you'll learn how to create and set up Auth0 with TikTok

1Kosmos BlockID

Digital Transformation of Personnel Onboarding


Cyber threats are becoming increasingly sophisticated with the number of bad actors involved increasing at an alarming rate. New and sophisticated techniques are being used that have evolved over the years to leverage modern computational hardware. These developments have aided and abetted the guessing of passwords and hacking of digital credentials. While these threats expose the inherent weaknesses of password-based authentication schemes, simplistic identity-detached passwordless authentication is not the answer either.

Personnel Onboarding Risks and Requirements

Organizational risk tolerance for new hire onboarding is very low because of the Day 1 access granted to services and applications. Granting access comes with risk, and therefore companies rely on the usual means of identity proofing (background checks on government-issued documents, work history, address history, etc.) to establish trust. But there are limits to the background checks a company normally performs, as these might not extend exhaustively to domestic and international subcontractors. Therefore, establishing trust via identity-linked proofing becomes mandatory.

Passwordless for employees does not by itself inoculate the enterprise against credential compromise. Ensuring the employee is who they claim to be is the gap in most passwordless solutions today because while passwordless only eliminates friction in the employee’s journey, it does not make the journey more secure.

What is needed is a means to mitigate identity theft and identity spoofing to ensure the individual is exactly who they are claiming to be. For the new hire onboarding journey to be secure the employee’s identity must be established irrefutably as well. Companies ask for government-issued identity documents to prove an individual is who they are claiming to be, but that does not prove that the individual logging in to applications at any given time is indeed the same person.

Authorization is a further requirement when onboarding an existing employee to new applications and services. Risk is minimized when continuous authentication and transactional authorization are in place, with the company checking the identity and authentication assurance level of the individual in real time before granting access to critical service and application assets.

Privacy, PII and the Onboarding Process

A CIO is interested in simplifying and securing the Information Architecture of the Enterprise and implementing guardrails for preserving user privacy. Data privacy has recently attained critical focus in the international community with violations costing companies billions in penalties and significant reputational damage to boot. The organization has a fiduciary responsibility to protect the Personally Identifiable Information (PII) of their employees and customers.

In essence, a few key factors drive the need for a digital transformation of the new hire onboarding process: verifying the identity of the individual; securing PII data so that it cannot be compromised; securing, or better still eliminating, login credentials (i.e., passwords); implementing continuous verification of identity and continuous transactional authorization before granting access; and, finally, ensuring the continuing privacy of the individual by imposing protections and restrictions on any PII released to third parties.

Simplified Access

Simplified access to legacy systems using passwordless sign-on improves productivity, no doubt about that. Even better, enabling quick password resets to legacy apps after the individual has completed a strong identity-based authentication roundtrip using live biometrics enables faster and frictionless access to legacy applications. Another strategic benefit of deploying an identity-based passwordless solution is to reclaim 2FA spend on migrating legacy systems to passwordless. Why not re-use the identity from a secure digital identity wallet for onboarding new hires and existing employees to modern and legacy services?

Reduce Costs

A CIO’s broad mandate is to reduce IT costs, including personnel onboarding administration costs by implementing projects that automate the self-service workflows for new hire onboarding, identity proofing, and identity verification. Another key requirement is to reduce help desk costs, which can be implemented by automating password resets for legacy applications as described previously. This identity-based authentication strategy also eliminates help desk calls for those same password resets and reduces 24/7 helpdesk overhead and support costs with fewer complaints and ‘stalled’ or stuck employees.

Improve Security

A quick word is in order for a CISO’s mandate as well. They are looking to improve the overall security posture and prevent – or the more likely scenario – manage losses from data breaches (they are inevitable!). CISOs want to restrict the amount of PII stored in a central user repository to only what is needed to do business. A good information security program also attempts to eliminate vulnerabilities resulting from weak non-identity based authentication techniques.

In summary, CXOs want to simplify the information and information security architecture of the organization, minimize the reputational damage of information breach, and reduce their insurance liability at the same time. Why not go with an identity based authentication and proofing platform that makes mass credential compromise impossible?

It is clear that live biometrics-based aka identity-based authentication that uses pre-proofed identity is much stronger for use in a continuous authentication and continuous authorization paradigm than any other form of identity-detached passwordless scheme. It is the only way to verify that the individual is who they are claiming to be at the time of an access request.

1Kosmos Identity-Based Authentication

All these goals can only be achieved if an organization starts with identity and not just authentication. The traditional approach to passwordless authentication is to focus on MFA or passwordless login. While 1Kosmos is passwordless, we bring identity-based features to an authentication scheme used by the organization. This flexibility may be used to enhance the identity verification process using strong biometrics-based identity and verification of user credentials via industry standards.

What’s unique about 1Kosmos is that we start with Identity, instead of starting with authentication, as the basis for strong authentication and this enables us to solve many of the same challenges for both employees and customers. Our biometrics engine allows for continuous identity verification of the individual at login-time, and continuous transactional authorization at access-time, while remaining aligned with the company’s risk policies. It is no longer acceptable to identity-proof an employee or a customer once – for example during onboarding or new hire – and let them use services indefinitely.

By combining authentication with true Identity (NIST 800-63-3a principles and modified versions for corporate applications), you have a much higher assurance to know who is truly at the end of a digital connection every time they authenticate.

These principles apply to customers especially in the banking industry where strong KYC is needed. 1Kosmos is a member of the FIDO alliance, DIACC, The DIF foundation, Linux Foundation (for Trust over IP), W3C, and the COVID Credentials group. This ensures that you will have a partner with a product that is not only open, preventing vendor lock-in, but is on top of the latest trends in Identity.

If you are interested in learning more about smoothing and securing your remote onboarding processes, I invite you to register for our upcoming webinar with OneSpan.

The post Digital Transformation of Personnel Onboarding appeared first on 1Kosmos.


KYC Chain

Top 20 Crypto Market Makers: A Comprehensive Guide

By providing liquidity to crypto markets, market makers play a critical role in the crypto ecosystem, allowing new exchanges and token-based projects to gain traction and develop new offerings for the community. In this article, we take a look at 20 of the most exciting crypto market makers out there, including some established names in the game and some new, innovative entrants to the scene.

Monday, 30. January 2023

Shyft Network

Veriscope Regulatory Recap — 22nd January to 28th January


Back with another Veriscope Regulatory Recap! This week in focus — the note from the Biden administration officials urging Congress to intensify crypto regulatory efforts, Indonesia signing a new financial law to regulate crypto beyond trading, and a US Democrat introducing a bill for legalizing crypto payments in New York state. Let’s dive straight into it.

US Official Note Urges Congress to Step up Crypto Regulations

The Biden Administration presented in its official blog, from January 27th, 2023, a roadmap for addressing potential crypto risks. It aimed to ramp up enforcement and asked the U.S. Congress to step up its regulatory efforts.

(Image Source)

The post cited some significant failures in the crypto industry in recent times, including the implosion of Terra protocol’s algorithmic stablecoin and the collapse of FTX, the third-largest crypto exchange in the world.

The lack of applicable regulations, the prevalence of misleading statements, inadequate disclosures, and weak cybersecurity measures are among some of the risk factors that the blog has identified.

The administration also asked that mainstream financial institutions like pension funds not be allowed to enter crypto markets.

Indonesia Wants to Treat Crypto as a Security

With Indonesian President Joko Widodo signing a new law on January 12th, crypto regulatory powers will now shift from the Financial Services Authority (OJK) to CoFTRA, a commodities watchdog.

(Image Source)

ABI, the blockchain trade association of Indonesia, has welcomed the move saying that “this shift has shown a good understanding from the regulator that crypto assets are broader than just trading.”

According to the Indonesian Government, the regulatory changeover would take two years.

The country also has plans to set up a national crypto exchange similar to the NYSE, which the government believes will make market monitoring easier for regulators.

IMF Issues 5-Point Crypto Regulatory Recommendations

The International Monetary Fund (IMF) recommendations ask regulators to license, register, and authorize crypto asset providers. They call for prohibiting crypto entities from conducting multiple functions under one business, as this creates conflicts of interest.

(Image Source)

The IMF has also asked for regulators to apply robust regulations, as applicable to banks, to the issuers of stablecoins. It has suggested imposing requirements on traditional financial institutions for crypto exposure or engagement.

The final recommendation called for a consistent global approach to crypto regulation and oversight.

US Democrat Introduces a New Bill for Crypto Payments

A member of the Democratic Party and the representative of the 33rd district of New York, Clyde Vanel, has introduced a crypto bill to the New York State Assembly. It is asking state agencies to legalize and accept cryptocurrency as a means of payment for fines, taxes, fees, civil penalties, and other state-related dues by forming partnerships with entities that will enable crypto payment settlements.

(Image Source)

The bill is now in the hands of the New York State Assembly Committee on Government Operations for further study and possible amendments.

The bill must be passed by the Assembly and the Senate and signed by the Governor of the State before it can become law.

Shyft Network Tokenomics Update

Last week (January 26th, 2023), we published the Shyft Network token distribution and economic update. At the time of writing, we were at block 8,884,061, with 595,473,894.1127162 SHFT in circulation.

The total SHFT unlocked at the time of publication (January 26th, 2023) was 951,904,687. As yield wrap and AMM solutions have not been launched yet, unlocked tokens accumulate and are not distributed. So, even though they are unlocked, they have not been touched and have not reached the market or users; they may be introduced upon the launch of yield wrap programs. The figure also includes unlocked Shyft treasury tokens.

More details: https://www.shyft.network/newsroom/shyft-network-token-distribution-and-economics-update

Interesting Reads

Norway leading the way toward CBDC

MiCA at the Door: How European Crypto Firms Are Getting Ready for Sweeping Legislation

UK Minister Commits to Greater Crypto Industry Engagement as New Regulation Looms

______________________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on it yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more: https://www.shyft.network/veriscope, and contact our team for a discussion: https://www.shyft.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations. Also, sign up for our newsletter to keep up-to-date on all things crypto regulations.

Veriscope Regulatory Recap — 22nd January to 28th January was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


UbiSecure

What is Step-up Authentication?

Step-up authentication is an authentication method available for eService providers to allow users to access certain resources with minimal credentials, while requiring... The post What is Step-up Authentication? appeared first on Ubisecure Customer Identity Management.

Step-up authentication is an authentication method available for eService providers to allow users to access certain resources with minimal credentials, while requiring additional verification when accessing sensitive information. Matching authentication levels and authentication methods with the sensitivity (or value) of the resource provides a good balance between security and ease of use.

This blog explores step-up authentication: what it is, how it interacts with SSO, and its benefits.

What is step-up authentication?

Step-up authentication is the use of two separate authentication methods at different points during login and information access. A lower level of assurance method is used to access basic information, and a higher level of assurance method is requested only if the user wants to access more sensitive information.

Step-up authentication is technically an upgrade to an existing authentication level. The basic concept is that initially, users authenticate themselves using a lower-level authentication method, for example the username and password created during account registration or one of the social media methods like Apple, Google, or Facebook. This login creates a session, typically stored in a web browser cookie, which allows the users to access certain types of services and information.

If the users want to access more sensitive information or, for example, handle money-related transactions, then they must upgrade the authentication level of the existing session to a higher degree. This is where the step-up authentication takes place. From the users’ point of view, it is just another authentication workflow where they use a higher level of assurance authentication method, such as Bank ID, Time-based One-Time Password (TOTP) codes, mobile PKI, ID cards or similar. This adaptive approach ensures that only authorised users can access the sensitive information, thereby managing the risk level without negatively impacting the usability.
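To make the flow concrete, here is a minimal sketch of the idea in Python with Flask, assuming an assurance level stored in the session (1 for password or social login, 2 for BankID, TOTP or similar). The route names and level values are illustrative only, not Ubisecure's implementation.

```python
from functools import wraps
from flask import Flask, session, redirect, url_for

app = Flask(__name__)
app.secret_key = "demo-only-secret"

def requires_level(level):
    """Allow the view only if the session's assurance level is high enough."""
    def decorator(view):
        @wraps(view)
        def wrapped(*args, **kwargs):
            if session.get("assurance_level", 0) < level:
                # Step up: keep the existing session, just ask for a stronger method.
                return redirect(url_for("step_up"))
            return view(*args, **kwargs)
        return wrapped
    return decorator

@app.route("/account")
@requires_level(1)   # username/password or social login is enough
def account():
    return "Basic account details"

@app.route("/payments")
@requires_level(2)   # requires BankID, TOTP, mobile PKI or similar
def payments():
    return "Money-related transactions"

@app.route("/step-up")
def step_up():
    # A real deployment would run the higher-assurance method here
    # and only then raise the session's level.
    session["assurance_level"] = 2
    return redirect(url_for("payments"))
```

The point of the sketch is that the session survives the step-up: the user is not logged out and re-authenticated from scratch, only asked for the stronger method when the resource demands it.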

Step-up authentication and SSO

One of the basic features of a modern CIAM system is single sign-on (SSO). It is a great feature that allows users to move between different services of the same service provider without the need to re-authenticate in between. SSO between different applications requires the user session to be authenticated to a specific level, or re-authentication will be requested. The different MFA authentication workflows utilise the so-called ‘weakest method first’ principle, where the lower level of assurance methods are executed first, followed by the higher assurance methods. The same principle is usually applied in step-up authentication. If users have not yet completed the higher level of assurance step, they may be blocked from using SSO with applications that require the higher level of authentication. In these cases, the users must complete a new authentication which is not part of the step-up authentication.

Why choose Step-up authentication?

There are several so-called multiple authentication request schemes to choose from. These include multifactor authentication (MFA), step-up authentication and risk-based authentication (RBA). Although there are some similarities between these (the authentication methods are the same, and both step-up authentication and RBA are based on MFA), there are also some fundamental differences, like the conditions of when and where those extra authentication steps are requested. Step-up authentication sits between MFA and RBA, making it an ideal authentication method when your system requires a higher level of authentication than basic MFA can provide but does not require the intricacies of RBA.

In comparison to MFA, which requests two authentication methods during the initial login process, step-up authentication only requests the additional higher-level authentication when the user wants to access sensitive information, or perform higher risk, higher value transactions. This not only makes the system more user friendly but also saves costs in comparison to traditional MFA due to only requesting secondary authentication as and when required.

Step-up authentication and Risk-Based Authentication (RBA) differ in that step-up authentication allows access to different parts of a service using both single-factor and multi-factor authentications, while RBA only utilises multi-factor authentication as a means of added security when suspicious activity is detected, without providing any additional access to the service.

By raising the assurance level of an authentication method, we make services more secure. However, at the same time, the access procedure becomes more complex for the end user, thus potentially increasing frustration, which we want to avoid. In addition, the added security of MFA usually increases the cost for the service providers in the form of transaction-based pricing per authentication action. Step-up authentication and RBA provide optional approaches to solve these issues.

Conclusion

Nowadays, people understand the role of cyber security in eServices and are becoming more familiar with higher level of assurance authentication procedures. However, usability is a key factor for a successful eService. Providing easy access without compromising security can be a differentiator between you and your competition.

Step-up authentication provides the best of both worlds, easy usability when accessing protected resources and a high-security level when accessing more sensitive information. Even if criminal hackers managed to get users’ credentials to access their services, sensitive information or money-related transactions would not be available without using the higher level of assurance method required to access these resources. This, combined with the cost savings that service providers can achieve with step-up authentication, makes it a very attractive option when evaluating different authentication method schemes for their eServices.

As a Customer Identity and Access Management (CIAM) vendor, Ubisecure’s Identity Platform offers various authentication method options for eService providers. These methods can be used individually, in a single-factor manner, or as a combination of two or more, offering various schemes for authentication.

If you’d like to find out more about step-up authentication, contact us and speak to one of our technical experts.

The post What is Step-up Authentication? appeared first on Ubisecure Customer Identity Management.


MyDEX

Google and Facebook: Steam Engines of the Information Age

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper. Image generated by AI using openai.com/dall-e-2 As a technology, steam engines are awesome. Literally. They inspire awe in people. Their raw power. The heat, the huffing and the puffing. The choo-choos! What they do is visceral. Breath-tak

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper.

Image generated by AI using openai.com/dall-e-2

As a technology, steam engines are awesome. Literally. They inspire awe in people. Their raw power. The heat, the huffing and the puffing. The choo-choos! What they do is visceral. Breath-taking.

Steam engines transformed our world and came to symbolise an entire era. They powered the first factories. Steam-powered trains transfigured our society, economy and geography bringing distant, remote places closer. The stupefying power of The Flying Scotsman, the steam engine that slashed the time it took to travel from London to Edinburgh, was celebrated, admired — and emulated — across the world.

And yet.

Transformational and awesome as it was, the steam engine was an evolutionary dead-end. It may have been the harbinger of a new industrial age. But in the end it wasn’t the foundation and driver of this new age. Instead, it was displaced by something far better: electricity. Steam had its day. Then the world moved on.

Steam engines of the information age

Could modern juggernauts like Google and Facebook (sorry, Alphabet and Meta) be the Flying Scotsmen of today’s information age? Awesome but doomed symbols of an era? Yes, they could.

Steam faded from its glory because, as a means of providing energy, it had severe limitations. A separate new steam engine had to be built for every use-case and occasion. Steam engines were inefficient. Cumbersome to use. Expensive.

Compared to an electricity national grid there was no contest. The grid provided universal, safe, instant, easy, cheap access to something that everybody needs: energy for the purposes of heat, light and motion. Steam may have inspired awe. But ultimately it did not represent the future.

Today, we have a similar situation. Personal data is something everybody needs to access safely, instantly, easily and cheaply to manage their affairs better: to make better decisions and to plan, organise and coordinate their implementation. So we need the equivalent of a national grid to enable this, via better data sharing.

But today’s data systems are specifically designed NOT to do this. They revolve around the organisation-centric database — a separate data silo where each organisation keeps the data they collect under lock and key, for its use only. Like steam, a separate database needs to be built for every use case and occasion.

Yes, back in its day, the organisation-centric database transformed our world by enabling the collection and use of data, to make an increasing range of activities data driven, just as steam made an increasing range of activities steam driven.

But like the steam engine, the organisation-centric database is also inefficient, cumbersome and expensive, restricting access to and use of personal data. Steam demonstrated the power of energy. The organisation-centric database demonstrated the power and potential of data. It opened a door to a different future. But like steam, it does not represent this future.

A flawed model

This includes Google and Facebook, icons of our age, just as the Flying Scotsman was an icon of its era. As data juggernauts, they have transformed our society, inspiring awe, admiration (and emulation) the world over as they have done so. As steam-powered railways did.

But they are also severely limited.

Despite their size and ambition, they are limited in the data they can collect (how many people would share their health or financial data with Google or Facebook?). Economically speaking, they are just skimming the surface of what data can do. Focused as they are on marketing, advertising and media, they hardly touch the core: industries such as health, education, banking, finance and insurance, public administration in central and local government; any and all services dealing with identifiable individuals. As such, they cannot touch anything more than a tiny fraction of all economic activity.

Meanwhile, their business models generate irreconcilable conflicts that constrain them: to keep on growing their profits, they have to hoover up ever more data. But the more they try to do so, the less trustworthy they become and the more they are hounded by regulators and competition authorities. They had an amazing run, but they are already plateauing. They are not the harbingers of the future. They represent an evolutionary dead end. They do NOT represent any sort of sustainable model for the future.

From electrification to datafication

In this blog series we’ve talked about the transformational power of personal data and personal data stores, their ability to:

- unleash breakthrough productivity improvements for both households and service providers
- solve the challenges of seamless joined-up services
- transform the way people and organisations go to market
- create entirely new person-centric services and industries
- accelerate and enrich the journey to net zero.

We’ve talked in some detail about the how and why of each of these opportunities, but we haven’t explained the common economic principles that unite them.

One such principle is illustrated by the national electricity grid. By making a universally needed enabler — energy — safe, easy to access and affordable, it lifted previous barriers to its use and opened up an explosion of possibilities. Providing every individual with their own personal data store would do the same for another universally needed enabler — the personal data that lies at the heart of every service that deals with an identifiable individual, across all sectors of the economy (public, private and third).

The national electricity grid enabled the electrification of the economy. The personal data store enables its datafication, transcending the barriers and limitations of the steam-engine of our age, the organisation-centric database.

Applying tried and tested economic principles

The national grid analogy is useful in two ways. It captures some key principles that need to be adopted, and it illustrates the nature and scale of the opportunity.

It also illustrates the importance of key divisions of labour, between electricity generation and distribution. With personal data, today’s organisations act as ‘power stations’ that generate data and personal data stores enable its distribution.

But what the analogy misses is the ‘how’. To see the underlying economic principles at work here, we need to look at another transformational development of the early 20th century: mass production, the brainchild of Henry Ford.

Think of any physical product you like. It’s made up of a specific combination of different, specified parts which (if put together in the right way) contribute their own little bit to making the whole product work. Different products require different parts, combined in different ways.

Henry Ford’s genius was to invent an extremely efficient, generalisable way to bring standardised parts together to create a whole product, using a moving assembly line. It was this that cut the cost of making a motor car by over 90%.

Personal data stores apply exactly the same economic logic to assembling the highly specific combinations of personal data that are needed to deliver specific services. If you think about it, what service providers do with personal data is exactly the same as what manufacturers do with parts to make products: assemble unique configurations of data/parts to produce any and all services/products imaginable.

The only way to make this work, efficiently and effectively, is to use standardised parts. Previously in manufacturing, when each part was made by hand, it was also unique. Different. Which meant it couldn’t just ‘snap’ to fit to other parts. Each one had to be reworked, even if just a little bit, to make the whole product. And this rework took an enormous amount of time.

To make the whole thing work, the components that went into the making of products had to be standardised. By using standardised parts and a moving assembly line (which slashed the time, energy and effort needed to get each part to where it needed to be) Ford reduced the costs of making a motor car by over 90%.

In the world of data, verified attributes (or credentials) are the standardised parts of service manufacturing.

A verified attribute is a piece of information about an individual that has been generated or confirmed by a responsible organisation and turned into a cryptographically secure token — so that it can be shared and used quickly and easily, thereby eliminating the huge amount of re-work that currently takes place.

A verified attribute could be about anything: the fact that you have a valid driving licence, that you are over 18, what your blood type is, or your credit score, or inside leg measurement. It doesn’t matter. But once it has been verified it can be turned into a token (a standardised part) and used to assemble any service that needs it (thereby eliminating the need to fill in forms, for example).
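To make the ‘standardised part’ idea concrete, here is a minimal sketch of issuing and checking such a token in Python, using Ed25519 signatures from the cryptography library. The claim format, identifiers and key handling are illustrative assumptions, not Mydex’s actual scheme.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The issuing organisation (e.g. a licensing authority) holds a signing key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# A verified attribute: a small, standardised claim about an individual.
claim = json.dumps({
    "subject": "person-123",                       # illustrative identifier
    "attribute": "holds_valid_driving_licence",
    "value": True,
    "issuer": "example-licensing-authority",
}, sort_keys=True).encode()

signature = issuer_key.sign(claim)                 # the token: claim plus signature

# Any service assembling a personalised service can check the token
# without re-verifying the underlying fact from scratch.
try:
    issuer_public_key.verify(signature, claim)
    print("Attribute accepted, no re-work needed")
except InvalidSignature:
    print("Attribute rejected")
```

The verifying service only needs the issuer's public key; it never has to repeat the original check, which is where the elimination of re-work comes from.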

By enabling the gathering and sharing of such verified attributes (as well as additional information provided by the individual that only the individual knows), personal data stores apply the logic that revolutionised the manufacture of physical products to the manufacture of personalised services. They achieve what the national electricity grid achieved for energy.

Conclusion

Breakthrough innovations like the national electricity grid and the mass production moving assembly line, or personal data stores, happen at unique moments in history. They bring together combinations of things which are relatively new but nevertheless tried and tested. In Ford’s case for example, the oil and electricity industries were relatively new. Standardised parts were already used in arms manufacture. Moving (dis)assembly lines by meat packers. Ford put them together to transform car making.

In our day, the Internet, cloud storage, APIs for data sharing and technologies to make cryptographically secure tokens are all relatively new but tried and tested. Personal data stores bring these developments together to transform the collection and use of personal data.

Such transformations are seismic in their effects. Together, the national electricity grid and the mass production assembly line transformed 20th century economies, making steam engines history. Current organisation-centric approaches to the collection and use of data, including those of the Flying Scotsmen of the information age, Google and Facebook, are its steam engines.

Yes, they paved the way, highlighted the opportunities, and built new capabilities and infrastructure. But they do not represent the future. Personal data stores on the other hand are the 21st century information age equivalents of the electricity grid and the assembly line. They provide the data logistics infrastructure a modern economy needs, making access to and use of personal data universally accessible and available. They do so by applying principles that were tried and tested with the handling of things to the handling of information.

This raises a knock-on question however: If the resulting opportunity is so big and so beneficial, why haven’t people been falling over themselves to seize it? We turn to this in our next blog.

Other blogs in this series are:

- The Great Data Delusion. 19th century doctors thought bloodletting would cure most diseases. Today’s prevailing theories of personal data are little better.
- Why is personal data so valuable? Because of two things nobody is talking about: reliability and surprise.
- Is it a bird? Is it a plane? No! It’s Super…! With personal data, what sort of a problem are we dealing with?
- The vital issue people don’t want to talk about. Productivity is key to prosperity. But nobody wants to talk about it. Why?
- When organisations become waste factories. The single design flaw at the heart of our economic system, and what happens if we can fix it.
- Why are joined-up services so difficult to deliver? Because the organisation-centric database is designed NOT to share data.
- People: the dark matter of the economy. The elemental force that economics never really talks about.
- An engine of economic possibilities. How personal data stores open up new vistas of innovation and growth.
- What has data got to do with net zero? More than you might think.

Google and Facebook: Steam Engines of the Information Age was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Key Takeaways From CIAM Survey | Ping Identity

As a leader in the identity and access management industry, Ping Identity strives to understand the challenges consumers face and the expectations they have for the brands they interact with. The Ping Identity CIAM Survey titled “The Balancing Act: Earning Trust Through Convenience and Security” has given us a lot of information about consumers’ relationship with their identity and why they contin

As a leader in the identity and access management industry, Ping Identity strives to understand the challenges consumers face and the expectations they have for the brands they interact with. The Ping Identity CIAM Survey titled “The Balancing Act: Earning Trust Through Convenience and Security” has given us a lot of information about consumers’ relationship with their identity and why they continue to choose digital channels that offer the right balance of convenience and security.


YeshID

Start the New Year on the Right Foot: A 5-Day Challenge for Securing Your Google Workspace

Introduction from YeshID At YeshID, we're big fans of Google Workspace and have adopted it like many other startups. However, we were not...

Introduction from YeshID

At YeshID, we're big fans of Google Workspace and have adopted it like many other startups. However, we were not fully aware of all the security features it offers.

That's why we reached out to our design partner, Ylan Muller, an expert in Google administration, for guidance on best practices for a Google administrator. Ylan is an IT professional with nine years of experience in managing SaaS applications, including Google Workspace, across industries from financial services to software. She currently runs IT at FireHydrant.

Ylan provided us with valuable and easy-to-implement advice, even without a dedicated security or IT team. With Ylan's permission, we've turned her advice into a 5-day New Year's challenge to improve our workplace security. Oh and bonus - you don’t have to purchase any new software to put these good practices in place!

We're excited to share Ylan's insights with you.

Introduction to Google Security Basics

Inheriting a Google Workspace environment can be an intimidating experience. While you can address some problems through written policy (don’t send sensitive data outside of the org!), Google Workspace offers ample technical control for locking down the data that matters most to your business.

I’ve homed in on the settings I find most important to review over several years of managing Google Workspace environments for companies ranging from 50 to 1050 employees and am here to share them with you! Below you will find a list of settings to review and ideas to implement as you look to lock down your Google organization.

Welcome to Day 1!

Day 1: Establishing a Secure Foundation - Review and/or Create Your Organizational Unit and Group Structure

Many settings in Google Workspace can be scoped to either a user group or OU. Ideally, groups and/or OUs should be able to represent your organization structure and any other groupings that would be appropriate for slicing and dicing access and settings. For example, you may want to prevent your finance team from sharing files outside of the organization, but your customer success team may need to. Having your groups and OUs set up properly to start will allow you to more easily scope settings.

Groups are also used in Google Cloud Platform - if you or your engineering team plans on utilizing GCP in the future, think about how you would want to handle access. Setting up department and team groups in Google Workspace in advance may save you time when you go to configure IAM.

We also recommend:

Avoiding nesting OUs too deeply. Having many layers of nested OUs will get unmanageable quickly since a user can only be in one OU, but they can be in many groups. For example, the nested OU structure below will result in a lot of overhead if you ever need to change a setting for every user who is allowed to share externally:

With the newly introduced abilities to apply settings to groups, smaller organizations should consider keeping everyone within a general “Employees” OU while applying settings purely through groups.

Creating a group for IT as a first stop for new setting assignments for testing.

Creating a cross-functional group of people in the company, representing all parts of the organization, that can serve as beta testers outside of IT. There is no world in which IT can vet every setting for every use case. Work with this group to collect their feedback about new features or settings you deploy.

If you are on an Enterprise or Education plan you can also construct dynamic groups based on profile attributes to automate memberships.
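If you prefer to script this setup, group creation can also be automated with the Admin SDK Directory API. Below is a hedged Python sketch; the service-account file, admin address and group details are placeholders, and a service account with domain-wide delegation is assumed.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.group"]

# Placeholder credentials: a service account impersonating a Workspace admin.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES, subject="admin@example.com")
directory = build("admin", "directory_v1", credentials=creds)

# Create a department group that settings (and, later, GCP IAM roles) can be scoped to.
directory.groups().insert(body={
    "email": "finance-team@example.com",
    "name": "Finance Team",
    "description": "Scope external-sharing restrictions and future GCP access here",
}).execute()
```

Setting up department and team groups this way also keeps the structure reproducible if you ever need to rebuild or audit it.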


IDnow

All bets are on: World Cup 2022 breaks all records!

Popular sporting tournament surpasses previous World Cup viewing figures, betting activity, and visits to black-market sites. As predicted, the unusual combination of winter scheduling and enforced breaks of domestic football leagues resulted in a World Cup with a record-breaking 5.4 billion cumulated views .  Unfortunately, traffic to black market gambling sites also tripled during the […]
Popular sporting tournament surpasses previous World Cup viewing figures, betting activity, and visits to black-market sites.

As predicted, the unusual combination of winter scheduling and enforced breaks of domestic football leagues resulted in a World Cup with a record-breaking 5.4 billion cumulative views.

Unfortunately, traffic to black market gambling sites also tripled during the world-famous sporting tournament. According to the UK Betting and Gaming Council, 250,000 people visited black market sites in December alone, compared to 80,000 during the same month in 2021. 

“It’s unfortunate, but not surprising. Consumers are often drawn to black-market gambling sites because of bonus benefits and unclear rules regarding deposit limits. However, users should be aware that there is no consumer protection when using these black-market sites. So, if you win a bet, there is no guarantee that they will pay out. This is why it’s incredibly important for users to use licensed gambling operators that have rigorous player protection and safer gambling tools in place,” said Roger Redfearn-Tyrzyk, Director of Global Gambling & Sales at IDnow.

Read more about Roger’s 2023 predictions for the online gambling market in our ‘Always bet on good regulation’ interview.

Why bettors may go to black-market operators.

Two of the main reasons players may decide to go to black-market operators are that they tend to offer higher bonuses, along with the promise that players don’t have to do KYC. However, it’s worth remembering that the KYC requirement is there not only to protect operators from consumers using the platform for nefarious activities like money laundering or other types of identity fraud, but also for consumer protection. If consumers use an unregistered, black-market gaming platform, there is no guarantee that the operator will pay out.

The other reason why some consumers decide to go with a black-market operator could be down to their superior customer experience.

Businesses that enter the online gambling market and comply with regulations are also often plagued by sluggish UI and poor customer experience. In fact, research shows that 30% of online gamblers have abandoned the onboarding process when registering a new account with a gambling website due to its complicated sign-up processes.

The importance of offering a fast and smooth, multi-jurisdictional, and KYC-compliant onboarding experience cannot be overstated. As every potential customer needs to be verified before being allowed to gamble, players will no doubt flock to the website or gambling platform that can verify their information the quickest. This need for speed is becoming increasingly important in both the KYC and online gambling arenas. If player onboarding is too complicated, or has too many hurdles, organizations run a real risk of significant drop-off.

FIFA World Cup 2022 recap.

The 2018 FIFA World Cup was a massive success for the betting and gaming industry, with an estimated $155 billion wagered on the tournament, including $1.9 billion in the UK alone. As we predicted in our ‘How the Winter World Cup in Qatar impacts other sports events, including the Premier League.’ blog, many bettors appeared to have preferred to stay indoors to watch the winter World Cup tournament, leading to increased demand for online gambling platforms. Indeed, the World Cup 2022 tournament attracted even more gambling activity than World Cup 2018, with a 13% increase in sports betting, according to OpenBet, a provider of sportsbook technology, content and services.

Key to the growth in betting activity was the recent legalization of online gambling in some states of the United States. GeoComply, a provider of geolocation security and compliance solutions, revealed that the Argentina-France World Cup final actually attracted more US bettors than the final games of the NCAA’s March Madness, NBA final, and NHL’s Stanley Cup final.

In a bid to capitalize on the growing popularity of football, many operators seized the opportunity to offer new products specifically designed for World Cup fans. For example, offering odds on everything from which team will be sent off first, to particular bets, such as how many yellow cards England will receive during one game.

However, now that the World Cup 2022 is over, sportsbooks should not assume that a newly gained customer will continue to bet after the tournament ends, or even place multiple bets throughout the tournament. This was demonstrated on the Kambi network during the Euro 2020 tournament, when 78% of new players bet on the event when given a coupon, but only 76% returned for a second bet during the same event.

Customer acquisition and retention after a big tournament are crucial to successful sportsbooks. The timing of the World Cup 2022 created a real opportunity for operators as the Premier League resumed just a week after the World Cup 2022 ended, allowing for a much shorter retention (and loss) window compared to Euro 2020, which had a gap of over a month.

Unique operational challenges for gambling operators.

The last thing sportsbooks need is for consumers to try their platform – whether they are solid bet builders or player props – only to lose interest and vanish when the World Cup ends.

Operators that can offer a great customer onboarding experience, and a product that delivers a wide variety of markets and an outstanding level of combinability, are perfectly positioned to benefit from the opportunities at major sporting tournaments like the World Cup.

Of course, there is no uniform global legislation that governs gambling services. For example, some countries allow all types of gambling, while others only allow certain forms such as poker or casino games.

Keeping abreast of regulatory differences can be confusing, time-consuming, and expensive. Some of the most common gambling regulations concern age limits, which means verifying the age of players is essential in preventing underage gambling. There are also specific national and regional laws that gaming operators must follow. Find out more in our overview of both the EU & UK Online Gambling Regulations.

Security and betting fraud.

During the World Cup 2018, there were major concerns about security and betting fraud. In preparation for the World Cup 2022, gaming operators were warned to be wary of the following:

- Bookmakers offering inaccurate odds – this could be due to a lack of liquidity, and result in customers not being able to place bets.
- A lack of transparency – some bookmakers were accused of fixing matches.
- Fraudulent activities – bots operated by criminals were used to manipulate bettors into placing wagers on specific teams or players.

Betting fraud is a problem that affects all countries where online gambling is legal, not just those hosting major sporting events like the World Cup or Olympics. Ahead of the World Cup 2022, the fraud prevention service Cifas warned operators and users about the serious consequences of committing gambling fraud.

As gambling fraud and betting fraud tends to spike during major sporting events, Cifas predicted a huge increase in the number of fraudulent claims that would be submitted in the months following the World Cup 2022.

An increase on figures that were already at an all-time high. Indeed, in the first nine months of 2022, fraudulent claims for chargebacks (where a person fraudulently claims they didn’t make a purchase or receive a product to recoup costs from the bank) rose by 172%.

Gambling operators endure many challenges, such as risks of fraud, underage gambling, regulatory compliance, and offering a safe and smooth onboarding process for new users.

How to prepare for World Cup 2026 as a gambling operator.

By the time the next World Cup rolls around, in 2026, the online gambling market is likely to be even more regulated, but unfortunately still susceptible to different types of betting fraud.

It will still also likely be an opportunity to attract new users (considering it will be held in the growing football market of Canada and the United States). For all these reasons and more, customers will want to be able to use a safe and secure gambling platform that complies with multi-geographical regulations.

IDnow offers a seamless onboarding process, reducing verification times and increasing compliance with all major European regulations and beyond.

As the leading European identity verification provider, IDnow’s solutions have a proven track record of success, providing conversion rates of up to 90%, and are not only compliant within the DACH and UK markets, but assist rapid scalability with compliance in 195 countries and support in over 30 languages. IDnow can be trusted to deliver security, support, engagement with regulators, and the prospect of global scalability.

Leveraging the latest eKYC solutions will allow gambling operators to offer a safe, secure and frictionless experience, which is a win-win for the business and the consumer.

FAQ for the World Cup 2026.

What bets will be placed during the World Cup 2026?

Bets for the World Cup are wide-ranging. The most common bets are on who will win the tournament, who will be the top scorer, and who will be the scorer of the tournament’s best goal.

How do I secure my iGaming website against fraudulent activity?

The best way to secure your website is by using an automated, AI-powered system for identity verification. This technology can verify users’ identities from all over the world – anytime, anywhere – and will comply with AML regulations. Read more about our KYC solution, AutoIdent.

Is there a safe and fast way to verify the identity of my customers?

Yes. IDnow offers AI-powered identity verification solutions for users worldwide, using AML-compliant video and document verification. We provide solutions for high-security requirements with VideoIdent, backed by AI technology, and a wallet for storing and reusing verified documents. IDnow helps you verify identity documents faster and more efficiently than ever.

By

Max Irwin, Sales Manager, Gambling at IDnow
Connect with Max on LinkedIn

Sunday, 29. January 2023

KuppingerCole

Analyst Chat #158: The Crown Jewels Are a Lie

Is digital data really every organization's most precious possession, its "crown jewels"? Alexei Balaganski takes a different perspective towards a widely accepted opinion. He instead claims that data is not your most valuable asset. In fact, it can be a toxic liability without intrinsic value, since business value is only created when data is moving or transforming, producing insights, analytics,

Is digital data really every organization's most precious possession, its "crown jewels"? Alexei Balaganski takes a different perspective towards a widely accepted opinion. He instead claims that data is not your most valuable asset. In fact, it can be a toxic liability without intrinsic value, since business value is only created when data is moving or transforming, producing insights, analytics, etc.




Ocean Protocol

Data Farming DF21 Completed, DF22 Started

Stakers can claim DF21 rewards. DF22 runs Jan 26-Feb 2, 2023 1. Overview The Ocean Data Farming program incentivizes the growth of data consume volume in the Ocean ecosystem. It rewards OCEAN for stakers who allocate liquidity to curate data asset with high data consume volume (DCV). To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (d
Stakers can claim DF21 rewards. DF22 runs Jan 26-Feb 2, 2023

1. Overview

The Ocean Data Farming program incentivizes the growth of data consume volume in the Ocean ecosystem. It rewards OCEAN to stakers who allocate liquidity to curate data assets with high data consume volume (DCV).

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF webapp.

DF Round 21 (DF21) is part of DF Beta. DF21 counting started 12:01am Jan 19, 2023 and ended 12:01am Jan 26, 2023. 75K OCEAN worth of rewards were available. LPs can now claim rewards at the DF webapp Claim Portal.

DF22 is part of DF Beta. Counting started 12:01am Jan 26, 2023.

The rest of this post describes how to claim rewards (section 2) and gives an overview of DF22 (section 3).

2. How To Claim Rewards

As an LP (staker), here’s how to claim rewards:

1. Go to the DF webapp Claim Portal
2. Connect your wallet. Rewards are distributed on Ethereum mainnet.
3. Click “Claim”, sign the tx, and collect your rewards.

Rewards will accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.


3. DF22 Overview

DF22 is part of DF Beta. DF Beta’s aim is to test the effect of larger incentives, learn, and refine the technology. DF Beta may run 10–20 weeks. In any given week of DF Beta, the total budget may be as low as 10K $OCEAN or as high as 100K $OCEAN.

Some key numbers:

- Total budget is 75,000 $OCEAN.
- 50% of the budget goes to passive rewards (37,500 $OCEAN) — rewarding users who hold veOCEAN (locked OCEAN).
- 50% of the budget goes to active rewards (37,500 $OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).
- Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behaviour. Updates are always announced at the beginning of a round, if not sooner.

Conclusion

DF21 has completed. To claim rewards, go to DF webapp Claim Portal.

DF22 begins Jan 26, 2023 at 12:01am. It ends Feb 2, 2023 at 12:01am.

DF22 is part of DF Beta. Reward budget is 75K $OCEAN.

Further Reading

The Data Farming Series post collects key articles and related resources about DF.

Follow Ocean Protocol on Twitter, Telegram, or GitHub for project announcements. And chat directly with the Ocean community on Discord.

Data Farming DF21 Completed, DF22 Started was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 27. January 2023

YeshID

The YeshID kitchen: where security and usability meet

Identity and access management is a nuanced problem. Everyone hates it, and with good reason. So, welcome to YeshID’s metaphorical...

Identity and access management is a nuanced problem. Everyone hates it, and with good reason.

So, welcome to YeshID’s metaphorical kitchen. Notice it’s spotless! Let’s take a quick tour, and I’ll point out how we do things differently.

Let’s start over there. Do you see the sign “Customers never hacked because of YeshID”? That is where we create the secure foundation for managing the identities of customers and employees, the services they need to access, and the authentication methods we provide. Notice the minimalism. How simple everything is. Simplicity minimizes mistakes and ensures precision work. No rough cuts or shortcuts here! The chefs use Golang, a simple, fast, statically typed language created by Google, to power our backend.

Next is the frontend station. It’s the one with all the colorful spices and ingredients. Do you see the “Dead simple and delightful to use” sign? We want everything that comes out of the YeshID kitchen to be a perfect balance of security and usability. Every step is continuously refined: customer onboarding, deployment, employee on- and offboarding, security, compliance, privacy, and simplicity. We eat with our eyes first and keyboards next.

The next station, the one with the storage islands, charts and whiteboards, lab instruments, and measuring tools, is our metrics station. Here is where we measure everything we do. The goal here is to identify what – exactly – works. How much time did you spend onboarding a new employee? How many clicks did you have to make? What was the employee’s experience? How can we make it easier and more secure for everyone? We try to measure everything and turn it into actionable information so that the next dish is better for everyone.

The last thing I am going to show you is the most important: our team. Our talented chefs work across all stations. Together, we yesh things into existence, building everything from nothing. We’ve built the kitchen and tools; we’re growing the ingredients, and assembling the dishes. We’re obsessed with the customer experience, your experience as an admin, your experience as an employee, and your experience as a user of the YeshID wallet. We constantly iterate on every aspect and try new ingredients, new combinations. We’re never satisfied. This is why one of the values of our metaphorical kitchen, but very real company, that we put in our first blog post is: “Innovate until experience & security exist in harmony.”

Our chefs are experienced. They have tasted good dishes and bad ones. They experiment and innovate mixing old ingredients and new, and delight when they create something delicious that no one has ever seen, tasted, or smelled before.

Let me tell you a little about why we felt compelled to start YeshID - the frustration that drove us.

Imagine you are responsible for digitally onboarding new employees to your company, AcmeLabs. You just hired Fred or Sally. And you (or someone charged with that duty) need to assign an @acmelabs email address (and temporary password) for each new hire. Whoever does it needs to make sure that the email and password are sent securely and that the new hire enables 2FA.

So here’s the classic recipe:

1. Get the new hire’s personal email address (the one that’s been used in the hiring process).
2. Go to Google Workspace and create a new user account. Assign a user name based on the company’s standard (an alias can be added later).
3. Add the personal email and phone number (if known) as backup contact points.
4. Send an email to the personal email address with a log-in link. When they click the link they’ll have to replace the temporary password with a new one.

Yikes! Passwords. Of course, you’ve set up a domain-wide password policy: the minimum and maximum password length, whether to require strong passwords, how often they expire. Yes, replace a password with a password. And users never make mistakes. And they never forget the passwords that they’ve entered or increment that last digit. Never. And they never let the link expire – like you sent it on Friday and they don’t get to it until Monday. Two weeks from now. Oh never!

And of course, you’ve enabled 2FA for the domain. So after they replace a password with a password they have to choose their second authentication method. And that’s not dead simple. It’s easy to screw up and then you’ve got to fix it. By the way, how long a grace period do you give new users before they provide their second authentication method? And do you want to enforce it? Hint: if enforcement and no grace period, then they can’t log in! And if there’s a grace period, what happens when the grace period lapses – as it will? And how many steps do they go through to authenticate with their phone? If you’re using a third-party MFA solution, make sure it’s been installed ahead of time and your new hires already know how to use it, or they’ll get locked out.
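Steps 2 to 4 of that recipe can at least be scripted. Here is a hedged Python sketch using the Google Admin SDK Directory API; the domain, names, recovery details and credential file are placeholders, and it only automates account creation, not the password and 2FA headaches that follow.

```python
import secrets
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES, subject="admin@acmelabs.example")
directory = build("admin", "directory_v1", credentials=creds)

# Create the new hire with a random temporary password they must change at first login.
directory.users().insert(body={
    "primaryEmail": "sally@acmelabs.example",
    "name": {"givenName": "Sally", "familyName": "Smith"},
    "password": secrets.token_urlsafe(16),
    "changePasswordAtNextLogin": True,
    "recoveryEmail": "sally.personal@example.com",  # backup contact point
    "recoveryPhone": "+15551234567",                # if known
}).execute()
```

Even scripted, everything after this point (sending the credentials securely, the first login, the 2FA enrolment) is still where the process tends to go wrong.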

Taste good so far? Or is it kind of bitter? And that’s not all of it. There are lots more things that can go wrong.

And will.

And we wonder why 40% of help desk tickets have to do with credentials.

What about YeshID? How do we make it dead simple, completely secure, and delightful to use?

Stay tuned as we share more about our special sauce in upcoming posts.

Hungry? Take a bite. Reach out to Dana or me.

Want to build with us? Chefs needed! Check out: https://www.yeshid.com/web-frontend-engineer.


This week in identity

E20 - Strata.io Series B $26M / Home Depot Consent Breach / Fave Biometric Poll Result / Identity Based Authentication / IAM Maturity Assessments

This week Simon and David discuss a $26 million series B round for identity orchestration vendor Strata.io. What is identity orchestration, why is it a problem today and how can it be handled within the enterprise?  What is IDQL and what are recipes?  A discussion on a recent consent breach at Home Depot in Canada saw the Canadian Privacy Commissioner got involved. They also review a rec

This week Simon and David discuss a $26 million series B round for identity orchestration vendor Strata.io. What is identity orchestration, why is it a problem today and how can it be handled within the enterprise? What is IDQL and what are recipes? They discuss a recent consent breach at Home Depot in Canada, which saw the Canadian Privacy Commissioner get involved. They also review a recent poll covering our favourite biometric, which spawned a discussion around identity-based authentication (see 1Kosmos and keyless.io for more on that). They also delved into the world of IAM maturity assessments...




Shyft Network

Norway leading the way towards CBDC

Norway joined Israel and Sweden in September 2022 to test the feasibility of CBDCs as a way to facilitate cross-border payments The Norwegian Central Bank has made the open-source code for its Ethereum-backed digital currency Sandbox available on GitHub. The second part of the code is yet to come out, while the test network is leveraging Hyperledger Besu now instead of Ethereum. The prev
- Norway joined Israel and Sweden in September 2022 to test the feasibility of CBDCs as a way to facilitate cross-border payments.
- The Norwegian Central Bank has made the open-source code for its Ethereum-backed digital currency sandbox available on GitHub.
- The second part of the code is yet to come out, while the test network is now leveraging Hyperledger Besu instead of Ethereum.

The prevalence of digital assets and the intensity of the crypto-economy in Norway is at a moderate stage compared to some of its neighboring countries.

While Swedish and Danish crypto startups have raised 40 million and 32.5 million Euros, respectively, so far from Initial Coin Offerings, Norwegian crypto companies have raised 27 million Euros.

If we compare Norway’s volume of ICO funds with Lithuania and Estonia, it lags by an even wider margin. For instance, in Lithuania, blockchain startups have raised nearly 1.1 billion Euros, and the figure has touched 285 million in Latvia.

In terms of the number of blockchain solution providers, Norway has 22, while Denmark has 24, Finland 18, and Latvia 15. The numbers show that while the volume of funds raised in Norway might be less, there is no lack of blockchain solution providers in the country.

It is in such context that Norway has taken up a CBDC project in collaboration with countries like Israel and Sweden.

Project Icebreaker: A Joint Exploration on CBDCs

In late 2022, the Bank of International Settlements (BIS) and the Central Banks of Israel, Norway, and Sweden jointly launched Project Icebreaker. The project’s core rationale was to see how Central Bank Digital Currencies can facilitate cross-border payments.

The banks decided to connect the domestic proof-of-concept CBDC systems and publish the exploration outcome report during the first quarter of 2023.

The architecture of the Icebreaker, a retail CBDC cross-border project, involves five stakeholders: the end user, retailer, FX Provider, Wallet provider, and the Central Bank. The project aims to enable and test the feasibility of immediate retail CBDC payments across borders at a significantly lower cost than traditional systems.

Norway has long been trying to eliminate any over-dependence on traditional payment systems, particularly cash.

Relevant Article: China Leads the Digital Currency Future

Norway’s Problems With Cash

Trond Bentestuen, then an executive at the major Norwegian bank DNB, proposed restraints on the use of cash in 2016, citing that the country’s Central Bank could account for only 40% of its use, while 60% of the usage was outside of any control.


In a similar instance, in 2015, another Norwegian bank, Nordea, refused to accept cash except at its Oslo Central Station branch.

A growing inclination towards non-cash means of transacting ran parallel to this. However, the country took another six years to launch digital currency initiatives on the ground.

The Norwegian CBDC

In September 2022, the Norwegian Central Bank made the open-source code for the Ethereum-backed digital currency sandbox available on GitHub. The second part of the code was due to be released in mid-September 2022 but has not come out to date.

The purpose of the digital currency sandbox was to offer an interface to interact with the test network and enable functions like minting, burning, and transferring ERC-20 tokens.
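The sandbox code itself lives on the Norwegian Central Bank's GitHub; purely to illustrate the kind of ERC-20 interaction described above, here is a hedged web3.py sketch of a token transfer against a local test node. The RPC endpoint, addresses and ABI fragment are placeholders, and an unlocked account on the node is assumed.

```python
from web3 import Web3

# Hypothetical local test-network node, as in a sandbox setting.
w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))

# Minimal ERC-20 ABI fragment: just the transfer function used below.
ERC20_ABI = [{
    "name": "transfer",
    "type": "function",
    "stateMutability": "nonpayable",
    "inputs": [{"name": "to", "type": "address"},
               {"name": "value", "type": "uint256"}],
    "outputs": [{"name": "", "type": "bool"}],
}]

TOKEN = "0x0000000000000000000000000000000000000001"      # placeholder token contract
RECIPIENT = "0x0000000000000000000000000000000000000002"  # placeholder recipient

token = w3.eth.contract(address=TOKEN, abi=ERC20_ABI)
tx_hash = token.functions.transfer(RECIPIENT, 100 * 10**18).transact(
    {"from": w3.eth.accounts[0]})  # 100 tokens, assuming 18 decimals
print(w3.eth.wait_for_transaction_receipt(tx_hash))
```

Minting and burning would work the same way, through whatever mint and burn functions the sandbox token contract exposes.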

To explain the release of the first phase of the open-source code, the Norwegian Central Bank said it was meant to offer a good starting point for learning, not to imply that the final technology will be based on open-source code.

At this point, the Norwegian Test Network is not using the public Ethereum ecosystem anymore but leveraging a private version of Hyperledger Besu, an enterprise blockchain.

Relevant Article: India’s CBDC Project: What is it Upto?


The Norwegian Central Bank’s principal partner in building the CBDC project’s infrastructure is Nahmii, the developer of a layer-2 scaling solution for Ethereum.

Overall, the progress of Norwegian CBDC development is yet to reach a state where conclusive opinions around its feasibility can be drawn. Both Project Icebreaker and the country’s domestic CBDC project are still on their way to fruition. One would have to wait until the end of Q1 2023 to get a better idea of the direction in which things are to move.

______________________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on it yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more: https://www.shyft.network/veriscope, and contact our team for a discussion: https://www.shyft.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations. Also, sign up for our newsletter to keep up-to-date on all things crypto regulations.


Northern Block

Mobile Driving Licence (mDL): Exploring ISO 18013-5&7 (with Andrew Hughes)

 >>  Listen to this Episode On Spotify >>  Listen to this Episode On Apple Podcasts About Podcast Episode There has been quite some traction behind the ISO Mobile Driver’s Licence (mDL) recently. Many US States have opted to deploy them, it has recently been suggested in the EU digital identity guidance as an initial […] The post Mobile Driving Licence (mDL): Exp

>>  Listen to this Episode On Spotify

>>  Listen to this Episode On Apple Podcasts

About Podcast Episode

There has been quite some traction behind the ISO Mobile Driver’s Licence (mDL) recently. Many US States have opted to deploy them, it has recently been suggested in the EU digital identity guidance as an initial use case to be supported, and the mDL mdoc format is starting to be supported by other transport protocols such as OIDC4VC.

We talk a lot about various flavours of verifiable credentials on this podcast. Verifiable credentials aim to give broad expressive capacity to digital credentials for a variety of use cases, while the Mobile Driver’s Licence (mDL) addresses the particular use case of mobile driving licenses.

One of the things that resonated most with me in this conversation with Andrew was the fact that mDL has the opportunity to be a powerful tool in credential transformation. What does that mean?

Governments today aren’t accustomed to issuing verifiable data to their citizens. And because mobile driver’s licenses are an issuer-oriented standard today, and governments (issuers) are members of the organization that is defining these standards (ISO), there is a clear appetite for them. The thought is that once a use case-specific credential such as a mobile driver’s licence gets issued into the wild, then the next logical step for government issuers is to begin thinking: “Hey! What if I don’t have to issue digital representations of physical documents, but instead start issuing specific claims or attestations?” This is the credential transformation.

Another thing that resonated with me in this conversation was the fact that we need to make sure we separate standard work from implementation work in our discussions, regardless of the standard. Certain features such as privacy, interoperability and security can be built-in, or rather suggested by the standards. It however comes down to implementers to build their solutions around them in the right manner. This also includes how mDL issuers determine how they interact with the large mobile hardware and software vendors (e.g., Apple and Google), who pose risks of walled gardens and control.

In this podcast episode with Andrew Hughes, we discuss,

- Distinguishing the mobile driving licence (mDL) credential type from a verifiable credential (VC).
- How the mDL standard is working towards being consumed by other credential transport protocols (e.g., DIDComm, OIDC4VC).
- Can the same ISO standard for mDL be used to issue non-driving licence credentials? And should it?
- Do issuers of driving licences consider mDL as a driving licence credential, or an identity credential?
- What does the ecosystem look like for mDL vs the one for physical driving licences? Who are some new participants that aren’t involved in physical DL production and governance?
- Why implementation supersedes the standard work.
- What are some interesting use cases around mDL that are gaining traction?
- How ISO works and how the relevant mDL sub-committees are evolving the standard.
- Are there concerns with the mobile hardware and OS providers gaining too much control over the mDL credentials?

 

About Guest

Andrew Hughes CISM CISSP is Director of Identity Standards at Ping Identity. He is a digital identity strategist contributing to international standards development. He works with international associations and standards bodies as a domain expert, developing standards and related conformity assessment materials. Andrew serves on the Board of Directors of Kantara Initiative, and as the Chair of the Kantara Leadership Council. As a national expert delegate for Standards Canada on digital identity, he contributes to development of international standards at ISO SC 27 for identity management and ISO SC 17 for mobile driving licenses and mobile eID. Andrew is currently investigating how the worlds of Government Issued Photo ID can co-exist with the emerging Verifiable Credentials models, in a mobile-first manner.

 

LinkedIn: https://www.linkedin.com/in/andrew-hughes-682058a/

Twitter: https://twitter.com/IDIMAndrew

The post Mobile Driving Licence (mDL): Exploring ISO 18013-5&7 (with Andrew Hughes) appeared first on Northern Block | Self Sovereign Identity Solution Provider.



SelfKey

AI: A Threat to Web3 Decentralization? SelfKey Can Help

It is imperative that Web3 adopts a decentralized human verification system, much like the SelfKey ecosystem, to fight against the potential threat of AI to the fundamental properties of decentralization and democracy that are embedded in Web3.

The rise of AI has been anticipated for quite some time. What was lacking were commercialized products relevant to average users. However, 2023 seems to have offered the first real glimpses of change: many believe this will be the breakthrough year for AI as more applications and AI utilities become embedded in our day-to-day lives.

With the increased use of AI-based bots, users around the world are relentlessly testing and improving these products – and AI in general. And when it comes to crypto, bots have always been part of the picture. With all these recent developments in mind, it’s becoming increasingly obvious that AI has an undeniable influence on our daily lives.

The rise of a technology like AI, which reduces manual human effort, is probably the next step in human evolution. AI can help bring improvements to various facets of life, and the possibilities are endless. Nevertheless, the drawbacks, and any steps necessary to curb them, should not be overlooked.

AI for Crypto & Web3

Bots that utilize AI and machine learning technologies have been used for various purposes within the crypto industry. Anyone with even a superficial interaction with the space will have come across trading bots, anti-spam bots, or similar others that offer unique utilities for users. And with the advent of Web3, the dependency on bots and AI within the crypto industry is bound to increase.

The utilization of AI and its related technology should collectively be beneficial for the crypto community. However, since every coin has two sides, we should always “hope for the best and plan for the worst.”

So what could go wrong with AI or bot usage on Web3 or within crypto communities?

Fundamentally, Web3 is more decentralized and democratic than its predecessors. And these inherent properties enable Web3 platforms to ensure that all value created through these platforms will be shared equally by creators, content-makers, and users without any discrimination or bias towards one party.

Imagine a video-sharing platform where viewers earn tokens for the time they spend on the platform and for the work they do suggesting improvements to how it operates.

As mentioned earlier, an ideal Web3 platform would treat all parties equally. In such an ideal scenario, everyone associated with the platform would hold an equal stake, making it truly decentralized.

However, an ideal scenario is often not the reality. Bots or AI programs created by bad actors, which can mimic human behavior, can take advantage of this parity and skew the stakes of the platform in their favor. Such behavior can distort the decentralization of a platform and tip the balance toward centralization.

Added to that, exploiters who gain an unfair advantage on a platform can theoretically influence voting and override the governance rights of other users, making the system even more biased.

Thus, bots and AI programs have the potential to dismantle the two core properties of Web3, which would effectively take it back to Web2.

SelfKey Can Help

Web3 users and platforms can win the race before it even starts by not allowing bots to compete with real human users. To achieve this, platforms may need to employ a decentralized proof-of-humanity verification process: only users who pass verification as real humans should be allowed to share in the value of the platform and subsequently take part in its governance.

A decentralized identity ecosystem like the SelfKey ecosystem can help Web3 users and platforms distinguish humans from bots. Users who verify their identity once will be able to use their verified identity across Web3 using NFTs backed by Soulbound NFT technology.

In this way, the SelfKey ecosystem can keep bots in check, allowing the technology to be used for its benefits, and giving peace of mind to users and platforms alike.
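To make this gating concrete, here is a minimal sketch of how a Web3 platform might check for a non-transferable (soulbound) verification token before counting a wallet’s governance vote. The contract address, ABI fragment, and function names are hypothetical placeholders and do not reflect SelfKey’s actual contracts; the snippet assumes ethers v6.

```typescript
import { Contract, JsonRpcProvider } from "ethers"; // assumes ethers v6

// Hypothetical soulbound verification token; not SelfKey's actual contract or address.
const SOULBOUND_REGISTRY = "0x0000000000000000000000000000000000000000";
const REGISTRY_ABI = ["function balanceOf(address owner) view returns (uint256)"];

// True if the wallet holds at least one non-transferable verification token.
async function isVerifiedHuman(provider: JsonRpcProvider, wallet: string): Promise<boolean> {
  const registry = new Contract(SOULBOUND_REGISTRY, REGISTRY_ABI, provider);
  const balance: bigint = await registry.balanceOf(wallet);
  return balance > 0n;
}

// Example gate: only wallets that pass the humanity check are counted in a governance tally.
async function countVote(provider: JsonRpcProvider, voter: string, tally: Map<string, number>) {
  if (!(await isVerifiedHuman(provider, voter))) return; // ignore unverified (possibly bot) wallets
  tally.set(voter, (tally.get(voter) ?? 0) + 1);
}
```

The design point is simply that the humanity check happens once, at the identity layer, and every downstream mechanism (rewards, voting, content payouts) can reuse it.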

Conclusion

The AI revolution is here, and like any other technological advancement that can help humanity, we will need to embrace it. However, the exploitation of AI for unfair advantages on new-age crypto and Web3 platforms needs to be kept in check to promote a more decentralized digital environment.

Recent internet history has shown that humans often cannot successfully compete with bots, which are constantly active and evolving. This doesn’t mean that bots or AI are fundamentally bad technological advancements. In fact, in most cases, it’s quite the opposite: AI already plays, and will continue to play, an active part in our digital lives, performing countless functions that most humans would be either incapable of or unwilling to perform.

It is foolish to fear AI and attempt to arrest its growth. However, communities need to work on solutions to avoid exploits by bad actors who can wield the power of AI for unfair, biased, and centralized benefits. If we don’t, humans will soon be competing against AI bots seeking to skew decentralized systems in their creators’ favor.

A decentralized identity ecosystem like the SelfKey ecosystem can effectively allow human participants to verify their identity and use their verified identity across Web3. This solution does not sideline or ban the usage of AI but allows Web3 platforms to distinguish bots from humans.

AI technology will constantly evolve, which may one day make bots indistinguishable from humans. However, what defines us as humans is our identity. Using projects like SelfKey, we can use this unique attribute to distinguish ourselves on Web3.

It is imperative that Web3 adopts a decentralized human verification system, much like the SelfKey ecosystem, to fight against the potential threat of AI to the fundamental properties of decentralization and democracy that are embedded in Web3.

Identity verification using the SelfKey ecosystem can make AI and bots work for humans, not the other way around.

Join the SelfKey ecosystem now!

Thursday, 26. January 2023

Holochain

Holochain Beta Released

Dev Tools on the Way

Today we published Holochain Beta 0.1.0, an app-stable beta release, the next step of our Holochain Beta release sequence.

This is a major leap in the Holochain development roadmap, and it is one we have been diligently preparing since our first Release Candidate published in December.

Let’s walk through what this means.

What does app stability mean for devs?

Over the past years of development as a rapidly iterating project, we have introduced new fixes, features, and optimizations to our code. In our weekly Holochain alpha releases we have typically pushed new features, structural changes, fixes, and updates, many of which would break functionality in existing applications. App developers frequently had to change their app code to access the new fixes and features. Not an easy environment to develop a fully fledged app in.

With Holochain Beta, the days of us regularly breaking your code are behind us.

Now we will be updating a beta 0.1.0 branch of Holochain with non-breaking fixes and features for six months. So you know you won’t have to make changes to your app code to keep it running. We’ll also keep the 0.1.x-compatible bootstrap and proxy services running for that entire time, so people using your hApps will be able to find and connect with each other.

Holochain’s second security audit is complete

Least Authority, a respected security firm that focuses on decentralized tech, recently completed another audit for Holochain. Their first audit, completed in December 2022, was of the Lair Keystore. In this second security audit, the Holochain Deterministic Integrity library was reviewed.

Holochain Deterministic Integrity (HDI) is a hApp developer’s interface with the Holochain framework’s consistency and security model and is crucial in allowing devs to write rules that ensure all nodes make the same conclusions and ultimately reach the same state. This required the auditors to go deep into Holochain itself, analyzing the assumptions embedded in the integrity engine and networking protocol.

Here is what the Least Authority team published today:

“Our team performed a comprehensive design review and audit of the Holochain Deterministic Integrity (HDI) crate, a critical component of the Holochain system. Overall, we found the HDI crate to be highly modular and organized in a logical, compartmentalized manner.”

The report identified one issue, related to a lack of up-to-date design specifications, which we resolved. If you’re interested in reading the findings, the full report is now available.

Dev tools coming soon

While Holochain Beta 0.1.0 is fully functional, we now need to update the multiple components and developer tools that are affected by the breaking changes introduced by Beta. We expect to have those complete by 9 February. Once that is announced you will be able to start building more easily and with the confidence that will come with our full, integrated dev stack!

Preview of new CLI scaffolding tool

In preparation for you being able to run stable apps, we’re making it much easier to build and maintain them. One of the new dev tools is a refreshed scaffolder built on feedback from the developer community. It’s useful for both new and experienced hApp devs, offering a command-line-driven process for rapidly creating boilerplate code. This isn’t just back-end DNA code, but also front-end client code (in vanilla, Vue, Svelte, and Lit flavors), tests, and a dev environment setup that keeps all your project dependencies in sync. It also supports custom templates, and we expect that the dev community will come up with all sorts of creative uses for this: templates that target that favorite front-end framework of yours, or NodeJS, or Electron or Rust; templates that bundle popular and maturing hApp libraries; and more.

Eric Harris-Braun, Holochain co-creator and a prolific hApp developer himself, has this to say about the scaffolder:

"For people who have been in the web app development world for a while, you might remember when Rails came out with its scaffolding, how much of a game changer that was for building Web2 apps—the way you could scaffold an app that runs in literally ten minutes. And what we’ve got coming out will be, I believe, just as powerful for Web3 apps. That gives developers a super powerful ability to just get started."

What can you do now?

With Beta released now and dev tools targeted to be live by 9 February, we encourage you to start developing fully fledged apps on Holochain. We know that there are many projects that have come to love Holochain over the past years but have been waiting for a beta release before investing significant effort into their app development. Now is the time. Let’s show the world what real Web3 apps look like.

Dev Training opportunity

Do you know Rust? In March we have a unique education opportunity led by the Co-founder of HackReactor. It is synchronous, online Holochain developer training. Please apply!

If you’re not a developer, consider sending folks to the training to start building your developer team, or help us create direct hiring opportunities for our next crop of alumni. With a stable beta, now is the time to learn and grow with the ecosystem.

What’s next?

Three key pieces:

We are working on an updated whitepaper, so you can read about the underlying logic behind the code. Keep an eye out for that.

The Holochain team is also already working towards a beta 0.2.0 with enhancements to network security and performance, along with any breaking API, SDK, or protocol changes that are discovered to be necessary or useful. This will be released and run alongside Beta 0.1.x, giving you time for comprehensive testing and migration without a loss of support.

We’ll also be updating the Dev Portal alongside the release of the components and dev tools to reflect these changes and provide an easy learning environment and knowledge base to help you build.

Indicio

Unifying Trust Registries and Trust Lists to Answer the Question of “Who Can You Trust?”

Verifiable credentials are a powerful solution for verifying data, but whose verifiable credentials are trustworthy in any given use case? Two solutions to this challenge—Trust Registries and Trust Lists— have emerged from two different organizations. We explain their strengths, weaknesses, and differences, the effort to unite these groups, and how you can get involved in developing both.

By Mike Ebert

“How do we know which agents to trust?” is a problem you encounter quickly when creating a decentralized network (“agents” is the generic term for the software that issues, holds and presents, and verifies credentials; think of an “agent” acting on your behalf). Multiple organizations have come up with different answers to this challenge. Two of the most prominent are the Trust Over IP (ToIP) Foundation with “Trust Registries,” and the Decentralized Identity Foundation (DIF) with “Trust Lists.” Coordination of these efforts has begun, and at some point in the future, joint and separate work will be defined and duplication will be eliminated in order to provide unified standards for sharing trust information. At the moment, there are some key differences in each organization’s approach.

Trust Registries

To date, a Trust Registry has referred to a solution for sharing trust information via an API: you call the API when you have a question about trust and it provides you with that information, usually one item at a time. Trust Registries also allow for a cached copy of the data when necessary. One strength of a Trust Registry is that it is likely to be up to date: because you are constantly asking questions of the API, you receive the latest data it holds in real time. One downside to implementing a Trust Registry based on an API, however, is that it requires a relatively high level of expertise and commitment to develop, host, and maintain.

To use a dictionary as an analogy, you can almost think of a Trust Registry as a search tool for an online dictionary — if you want to know the definition of a specific word, you don’t need to read the entire dictionary to find it.

Trust List

A Trust List is usually described as a solution for sharing trust information through a file-based approach that doesn’t require the heavier support requirements of running or using an API. Being able to publish, retrieve, or load a single file requires less effort for developers than building and maintaining an API. Because copies of the governance file are stored with each software agent, trust data is cached with each one. How often you retrieve that file can be configured, so you might retrieve it once a day or once a month; but whatever the setting, you have caching built in. A huge benefit here is that, should you lose connection to the internet, you can still look up the data you need because it is all stored locally. The tradeoff is that caching can be tricky: you have to consider whether your information is recent enough and deal with the possibility that information has been cached differently by different parties.

To go back to our analogy: With a Trust List, you now have the dictionary in your pocket, so you may need to look through all the words, but you can get the definition of a word without needing an Internet connection.

With a good internet connection, the end user will likely not be able to tell the difference between a Trust Registry and a Trust List; problems may arise for the API-based approach when there is a poor internet connection, and similarly for a Trust List if cached information differs between downloaded files. Each approach has benefits and tradeoffs that ecosystem builders and developers will have to consider.
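As a rough illustration of the operational difference, the sketch below contrasts the two retrieval patterns: asking a registry API one question at a time versus periodically downloading and caching a published trust file. The endpoints, file URL, and JSON shape are invented for illustration and do not follow the ToIP or DIF specifications.

```typescript
// Hypothetical shapes and endpoints, for illustration only.
interface TrustEntry { issuerDid: string; credentialType: string; authorized: boolean; }

// Trust Registry pattern: ask the API one question at a time, always fresh.
async function isAuthorizedViaRegistry(issuerDid: string, credentialType: string): Promise<boolean> {
  const res = await fetch(
    `https://registry.example.org/v1/authorization?issuer=${encodeURIComponent(issuerDid)}&type=${encodeURIComponent(credentialType)}`
  );
  const entry: TrustEntry = await res.json();
  return entry.authorized;
}

// Trust List pattern: fetch the whole governance file on a schedule, then answer locally.
let cachedList: TrustEntry[] = [];
let cachedAt = 0;
const ONE_DAY_MS = 24 * 60 * 60 * 1000;

async function isAuthorizedViaList(issuerDid: string, credentialType: string): Promise<boolean> {
  if (Date.now() - cachedAt > ONE_DAY_MS) {
    const res = await fetch("https://governance.example.org/trust-list.json");
    cachedList = await res.json();
    cachedAt = Date.now();
  }
  // Works offline once cached, but may be up to a day stale.
  return cachedList.some(
    (e) => e.issuerDid === issuerDid && e.credentialType === credentialType && e.authorized
  );
}
```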

How do they come together?

Led by Indicio’s Sam Curren, representatives from both organizations met at the most recent Internet Identity Workshop (IIW) and, through discussion, realized that their goals overlap significantly. This has led to collaboration on compatible solutions and an effort to ensure minimal or no duplication of work.

The new goal is a coordinated, agreed-upon data format for sharing trust information. By building on the same data formats, anyone who needs to bridge the gap between API- and file-based solutions should be able to do so without having to resolve fundamental differences in what is being communicated.

Indicio is working closely with both the DIF and the ToIP Trust Registry Task Force to help define this specification. Indicio also created an open-source governance file editor and is working on open-source reference implementations for how to create, share, interpret, and follow a governance file. Both organizations are likely to stick to their API or file-based approaches as described above, but in providing reference code for delivering trust information in the same format, interoperability between both solutions should be much more achievable.

One useful task for the two organizations would be to agree on a single name, so that people are not confused by artificial distinctions between Trust Registries and Trust Lists. The real difference lies in how the trust information is delivered (via an API or a file), not in its core purpose or format.

If you are interested in this conversation, I highly encourage you to get involved in the discussions that are taking place right now as they are shaping the technology. You can find out more information on the ToIP (Trust Registry) solution here, and the DIF (Trust List) solution here.

If you are looking to implement a solution and have specific questions you think the Indicio team can help with, you can get in touch with us here.

To learn more about how Indicio has implemented trust, take a look at the open-source Governance Editor.

The post Unifying Trust Registries and Trust Lists to Answer the Question of “Who Can You Trust?” appeared first on Indicio.


Spruce Systems

Future State: Digital Credentials for Healthcare

Digital credentials and self-sovereign identity have the potential to revolutionize the healthcare industry by providing a more secure and efficient way to store and share important health information.


A digital credential (such as the W3C Verifiable Credential) is a tamper-proof, cryptographically secure form of a machine-readable credential. For example, it might be a digital version of your SSN, your passport, or even a concert ticket. Your health records–immunizations, diagnoses for medical conditions, allergies, clinical vitals, lab results, prescriptions, and other health data–can also be represented as verifiable credentials, and therefore, owned and managed by patients themselves.
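As a rough picture of what such a credential could look like, here is a hedged sketch of a W3C-style Verifiable Credential carrying an immunization record, written as a TypeScript object literal. The issuer DID, the field names inside credentialSubject, and the truncated proof value are illustrative placeholders, not a published health-records schema.

```typescript
// Illustrative only: the credentialSubject fields and issuer DID are not a real published schema.
const immunizationCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "ImmunizationCredential"],
  issuer: "did:example:clinic123",                 // the clinic that administered the vaccine
  issuanceDate: "2023-01-15T10:00:00Z",
  credentialSubject: {
    id: "did:example:patient456",                  // the patient, who holds and controls the credential
    vaccine: "Influenza, seasonal",
    doseNumber: 1,
    administeredOn: "2023-01-15",
  },
  proof: {
    type: "Ed25519Signature2020",                  // signature by the issuer's key
    verificationMethod: "did:example:clinic123#key-1",
    proofValue: "z3FXQ...",                        // shortened for readability
  },
};

console.log(immunizationCredential.credentialSubject.vaccine);
```

Because the proof is bound to the issuer’s key rather than to any particular database, the patient can carry this record between providers and the record remains independently checkable.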

In this blog post, we'll explore the benefits of using digital credentials in the healthcare industry, including protecting sensitive health data from data breaches, improving patient care with more complete records, and reducing administrative burdens.

Protecting from Data Breaches
Verifiable credentials can secure health data from cyber threats, such as ransomware and data breaches. Health data is often stored in large centralized databases, which are increasingly vulnerable to these threats. With verifiable credentials, it is possible to decentralize the storage of health data to be accessed only by authorized individuals, which greatly reduces the risk of data breaches. This architecture can transform a centralized honeypot–a single point of failure–into a decentralized network supported with multiple points of resiliency and backups to meet high assurance requirements.

The U.S. Department of Health and Human Services Office for Civil Rights tracks all breaches of unsecured protected health information affecting 500 or more individuals, as required by section 13402(e)(4) of the HITECH Act. In 2021, over 27 million individuals were affected by healthcare data breaches. In 2022, reported cases indicate over 48 million individuals were affected, an increase of roughly 78% over the previous year.

The total number of incidents has also dramatically increased, with 601 breach incidents reported in 2022, up 126% from the 266 breaches reported in 2021. Reported breaches can result from hacking/IT incidents, theft, loss of records, or unauthorized access/disclosures. In most of these scenarios, wrongful access is granted to sensitive health records, whether due to human error, system error, or a lack of security controls. Currently, the individuals whose data is stored in these records are required to trust their healthcare providers and others who handle their sensitive data to have adequate systems in place to prevent unauthorized disclosures or loss of data.

An example architectural shift enabled by digital credentials, eliminating a central database as the single point of failure.

In a new health records system, powered by verifiable credentials, patients would control disclosures and access to their data. Healthcare providers, pharmacies, or insurance companies would be granted access whenever required, but would not need to store a “honeypot” of sensitive health data on their internal servers, nor would they be able to share with any other third parties (either intentionally or unintentionally).

Better Care through Complete Records
Another key benefit of verifiable credentials in healthcare is that they enable patients to control and manage their own health data. Patients can create a portable, secure record of their health information that they can access and share with healthcare providers, insurance companies, and family members as needed. According to the Bureau of Labor Statistics, individuals hold an average of 12.4 different jobs between the ages of 18 and 54. With each new job, the employer-sponsored health insurance provider may change, which requires the individual to find and establish new, in-network primary care. It becomes increasingly difficult to piece together health history records with each new doctor’s office visit, primary care physician, specialty doctor, and same-day care facility.

Let’s use Alice as an example. Alice is a young professional in her early 30s who grew up in Florida and now lives in New York. As a child and teenager, Alice saw the same doctor every year for her annual checkups and regular vaccinations. After graduating from high school, she moved to California to attend university and enrolled in the student health insurance plan. She had only a vague recollection of which vaccines she had received and, as a result, had to request that records be sent from her family doctor in Florida to the university health office. Her new health records, covering the care administered during her four years of university, are stored in her student profile within the university health system.

After graduating from university, Alice moves to New York to pursue a career in finance. Her health insurance is now provided by the bank she works for, and Alice needs to find a doctor in New York in that health insurance network. She is no longer a student, so she’s lost access to her university student health portal and has to call the university health office to request that her health records be faxed to the new doctor. She also has to contact her family doctor in Florida to fax records to her new doctor, which can take a few days or even weeks. Between the ages of 22 and 32, Alice changes jobs four times, staying at each job for between two and three years. With each new job, Alice has new health insurance coverage, which requires finding new in-network doctors and repeating the process of requesting her health records over and over again.

Alice may be a fictional character, but her situation is typical of the average American worker. According to the Kaiser Family Foundation, 49% of Americans, or 156 million people, receive health insurance coverage through their employer. Insights from the Bureau of Labor Statistics show that the average worker tenure for employees aged 25 to 34 is 2.8 years. This means that 156 million people might be faced with establishing new in-network care every 2.8 years as they switch employers and, subsequently, employer-provided health insurance coverage. This results in spotty health records with incomplete care history.

Alice’s ability to establish care would be simplified dramatically with verifiable credentials allowing her to manage her own health records. Each time Alice needs to visit a new primary care physician, get a routine vaccine, or visit urgent care, sharing relevant health records and receiving new records to store or share in the future would be a simple, seamless process. Alice would have a more complete, accurate picture of her health history, helping any new doctor, whether for routine care or an emergency, make informed recommendations for care plans with all the context required. Medical records are currently often fragmented and incomplete, which can make it difficult for healthcare providers to diagnose and treat patients effectively. Applying self-sovereign identity within healthcare will empower patients to create a comprehensive record of their medical history that can be accessed by healthcare providers, allowing for better diagnosis and care.

Lowering Healthcare Administrative Costs
In addition to the benefits mentioned above, verifiable credentials can also help to reduce administrative burdens and overhead costs for healthcare providers. A study by the American Medical Association and Dartmouth-Hitchcock health system published in the Annals of Internal Medicine found that physicians only spend 27% of their office day on direct, patient-facing care time, with 49% of their day spent on electronic health records and administrative paperwork.

“This study reveals what many physicians are feeling – data entry and administrative tasks are cutting into the doctor-patient time that is central to medicine and a primary reason many of us became physicians,” said Steven Stack, M.D., Past President of the American Medical Association.

With current systems, healthcare providers must spend a significant amount of time and resources verifying the authenticity of health records and other information, requesting health record transfers from previous care providers, and facilitating health record transfers between patients and insurance companies. Verifiable credentials make this process much more efficient, as the credentials can be easily verified using cryptographic algorithms, and, being fully controlled by the patient, accessed quickly on a need-to-know basis.

This future state with verifiable credentials powering health records wouldn’t completely eliminate the administrative overhead related to record-keeping, but it would streamline the processes with standardized formats that are easily verified. This can help to free up time and resources that can be used to provide better patient care, while also lowering overhead costs for healthcare providers, including fees paid to cloud service providers.

Digital credentials have the potential to significantly improve the healthcare industry by providing a secure and efficient way to store and share important health information. They enable patients to control their own health data, create a full medical history, and protect sensitive health information from cyber threats. As the use of verifiable credentials continues to grow, we can expect to see many more applications of this technology in the healthcare industry.

Healthcare is just one example–virtually every industry that relies on data storage and record-keeping can be supercharged with verifiable credentials and similar technologies. Watch this space as we explore use cases across different industries.

About Spruce: Spruce is building a future where users control their identity and data across all digital interactions.


Shyft Network

Shyft Network — Token Distribution and Economics Update


More than a year has passed since the launch of SHFT, a time of excitement, challenges, achievements, and hard work. Our community and partners have been highly supportive of our mission to power trust on the blockchain through data discoverability and compliance while preserving privacy and sovereignty. The economics behind our Shyft Network token SHFT reflects this mission and our vision for the ecosystem's future.

The Hard Facts (Jan 26, 2023)

Initial SHFT supply: 2,500,000
Currently at block 8,884,061
Current SHFT circulation: 595,473,894.11 (tokens that have been unlocked and claimed)
Current SHFT unlocked: 951,904,687 as of Jan 26, 2023 (because, for example, yield wrap and AMM solutions have not been launched yet, unlocked tokens accumulate and are not distributed; even though they are unlocked, they have not been touched and have not reached the market/users, but may be introduced upon the launch of yield wrap programs. This figure also includes unlocked Shyft treasury tokens)
Maximum SHFT supply: 2,520,000,000
Projections: by month 33 (Dec 2023), SHFT circulation of 1,745,981,345 (assuming the inflation rate remains constant at 6% p.a.)
Node inflation rate: set at 6% of the initial universe, released on a per-block basis, perpetually

The initial supply of SHFT at the time of launch was 2,500,000. At the time of writing (Jan 26, 2023), the circulating supply of SHFT tokens was 595,473,894.11. The current circulating supply only accounts for the number of tokens released (per the token release schedule) and claimed.

It is important to note that if SHFT tokens are unlocked but remain unclaimed, they won't be included in the circulating supply.

A proposal to change the way the circulating supply is calculated is being debated in Shyft DAO. If the community approves the proposal, the SHFT circulating supply will include all tokens released per the schedule, regardless of whether the tokens are claimed.

SHFT Token: The Fuel That Powers Shyft Ecosystem

Made publicly available on Polkastarter on Mar 26, 2021, Shyft Network's native token, SHFT, is available on two blockchains:

as the native layer 1 token for the Shyft Network blockchain, and as an ERC-20 token on the Ethereum blockchain. The ERC-20 token is a wrapped version of the native SHFT token.

Virtual Asset Service Providers (VASPs) use the native layer 1 SHFT token to pay for the gas fees required to post attestations on Veriscope, the only frictionless Travel Rule Solution in the market, and comply with the FATF Travel Rule. Click here to find out more about attestations. Additionally, it is expected that users will be able to use SHFT tokens to pay for some fees for the DeFi solutions launching soon on the Shyft Network.

SHFT tokens can also be used for governance. Token holders can exercise their right to vote on Shyft DAO proposals.

Token Distribution

The SHFT token supply was divided into multiple allocations prior to distribution. Each allocation was attributed to a specific set of activities designed to benefit the Shyft community and network. The distribution ratio was also entered into the token distribution contract to ensure transparent and automated distribution. The distribution is organized as follows:

Next-Gen DeFi

5.8% of the maximum SHFT supply, which is 146,093,916 SHFT, is reserved for Next-Gen DeFi projects that are expected to be built atop the native Shyft Network. The distribution of Next-Gen DeFi funds will begin in the 39th month from the date of the public launch of the SHFT token (Mar 26, 2021) and will continue over the following 12 months.

Each month, between months 39 and 50, 12,174,493 SHFT tokens will be unlocked for Next-Gen DeFi and added to the circulating supply. At the time of the publication (Jan 26, 2023), which is month 22, no Next-Gen DeFi reserved SHFT tokens have been unlocked.

Wrap Yield

The Wrap Yield fund is expected to reward users holding Shyft Network-wrapped ERC-20 assets to incentivize user participation. A total of 117,282,495 SHFT tokens have been allocated to Wrap Yield, which is 4.65% of the token supply.

Each month, between months 1 and 120, 977,354.13 SHFT tokens will be unlocked for the Wrap Yield and may be added to the circulating supply as per the product launch timeline after the unlocks. Currently (month 22, Jan 26, 2023), the number of unlocked Wrap Yield reserved SHFT tokens is 21,501,790.75 SHFT.

AMM Yield

The AMM Yield fund incentivizes liquidity providers in SHFT liquidity pools. A total of 150,000,000 SHFT tokens have been allocated to AMM Yield, which is 5.95% of the token supply.

Each month, between months 1 and 180, 833,333.33 SHFT tokens will be unlocked for the AMM Yield and may be added to the circulating supply as per the product launch timeline after the unlocks. Currently (month 22, Jan 26, 2023), the number of unlocked AMM Yield reserved SHFT tokens is 18,333,333.33.

VASP Ecosystem Initializer

The VASP Ecosystem Initializer is reserved for promoting Shyft's Travel Rule-related products by incentivizing Virtual Asset Service Providers (VASPs). A total of 875,000 SHFT tokens have been allocated, which is 0.03% of the token supply.

These tokens were unlocked according to the following schedule:

125,000 were unlocked in the second month following the launch, 125,000 in the third month, and the remaining 625,000 in the fourth month, totaling 875,000 tokens.

All VASP Ecosystem Initializer reserved tokens have been unlocked.

Shyft Treasury

Shyft Treasury holds 166,260,215 SHFT tokens, representing 6.60% of the token supply. It consists of the unallocated SHFT tokens reserved for Shyft's partners, advisors, etc. The treasury will continue to hold onto these assets until it allocates them per their pre-decided purpose. So far, this allocation has not been unlocked.

Economic Metagame

With the economic metagame, if a user wraps a token from one ecosystem (e.g., ERC-20) to another (e.g., Shyft Network), a corresponding amount of SHFT tokens is automatically burned from the economic metagame allocation. This automated action is designed to reward users for wrapping their tokens.

504,000,000 SHFT tokens, which is 20% of the token supply, have been set aside for the economic metagame. Depending on how many tokens users wrap to different ecosystems, these tokens would be burned over the next two years from the SHFT token public launch date.

To date, this allocation remains untouched. However, to preserve the allocation's original use, all 504,000,000 tokens will continue to be allocated to the economic metagame until the allocation is exhausted.

Core Team

450,000,000 SHFT tokens (17.86%) have been reserved for the Shyft core team. The distribution begins from the 24th month and will continue over 12 months with a monthly tranche of 37,500,000 tokens.

No Core Team reserved SHFT tokens have been unlocked as of the date of this publication (Jan 26, 2023).

Advisors

Our advisors have been valuable partners in Shyft Network's development. To reward them for their unparalleled support, we have reserved 110,251,147 tokens, which is 4.38% of the maximum supply of Shyft Network's tokens.

The distribution began in the 13th month from the public launch date (Mar 26, 2021), with 6,125,063.72 SHFT tokens unlocked and released to advisors, followed by a 6-month pause. It resumed in month 20 (Nov 2022) and is expected to continue in equal monthly installments of 6,125,063.72 SHFT tokens until month 36.

At the current time, the number of unlocked tokens allocated to Advisors is 24,500,254.89 SHFT.

Partnerships

While building the network's critical infrastructure, Shyft has inked many strategic partnerships with development teams and advisory entities. The goal is to expand the use cases of the Shyft Network and, respectively, the SHFT utility.

Shyft Network has reserved 127,334,060 SHFT tokens for strategic partnerships, which is 5.05% of the maximum supply of SHFT tokens. The distribution of tokens from the strategic partnership reserve will occur in two distribution cycles.

A total of 18,083,515 SHFT tokens were released in the first distribution cycle, which was only one month long, followed by a six-month pause. The second distribution cycle began in April 2022, the 13th month from the public launch date, and will end in February 2024. The amount of tokens released every month during the second distribution cycle varies.

Between March 2022 and February 2023, 3,389,739.94 SHFT tokens will be released monthly. Then, 7,973,071.27 SHFT tokens will be released in March 2023, the 24th month from the public launch date.

Between April 2023 and August 2023, 7,298,071.27 tokens will be released monthly. Next, from September 2023 to February 2024, 4,583,333.3 SHFT tokens will be released monthly. By then, the Partnerships reserve will be completely exhausted.

Currently, the volume of unlocked SHFT tokens from the Partnership reserve is 51,980,894.40.

Strategic Purchasers

Strategic purchasers fulfill the mandatory bonding requirements that enable the Shyft Network's interoperability bridge layer to function. They can be network participants, industry partners, and more.

134,215,584 SHFT tokens have been reserved for strategic purchasers, which is 5.33% of the maximum supply of SHFT tokens. The SHFT tokens from the Strategic Purchasers reserve will be unlocked in four cycles of varying lengths, starting in April 2021 with the release of 333,333 SHFT tokens.

In the second token distribution cycle (July 2021 to August 2022), a total of 91,239,394.06 SHFT tokens were unlocked, whereas in the third one (December 2022), 250,000 SHFT tokens were unlocked. The fourth distribution cycle will begin in January 2024 and end in September 2024, and 42,392,857 SHFT tokens will be unlocked during this period, exhausting the Strategic Purchasers' reserve in the process.

The Strategic Purchasers reserve SHFT tokens unlocked as of this date are 91,822,727.06.

Public Distribution

2,500,000 tokens, which is 0.10% of the maximum supply of SHFT tokens, were distributed through the Shyft Network Initial DEX Offering (IDO) on Polkastarter, which took place on Mar 26, 2021.

Purchasers

507,987,583 SHFT tokens, or 20.16% of the total supply of SHFT tokens, have been reserved for purchasers. They bought SHFT tokens before the launch date, which was Mar 26, 2021. The allocation began in March 2022 and will continue until June 2023.

To date, 380,990,686.64 SHFT tokens have been distributed to purchasers.

Token Reserve (People)

103,200,000 SHFT tokens, or 4.10% of the entire supply of SHFT tokens, have been reserved for the Token Reserve, which is meant to facilitate liquidity across the Shyft ecosystem. The Token Reserve was fully exhausted within four months of the public launch.

Inflation

Like real-world currencies, Shyft Network's native token, SHFT, is also subject to inflation. The rate at which new SHFT tokens are released has been fixed at 6% per annum.

SHFT tokens are subject to three types of inflation: Staking-Adjusted Inflation, DAO-Adjusted Inflation, and Nodes-Adjusted Inflation. Each of these inflation sub-types contributes 2%, bringing the total inflation rate to 6% p.a.

Under this block-wise inflation schedule, 11,767,813 SHFT tokens are added to the circulating supply each month, which is less than the 12,600,000 SHFT tokens per month implied by the 6% annual inflation rate.
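A quick sanity check of the arithmetic behind those figures, assuming the "initial universe" refers to the 2,520,000,000 SHFT maximum supply:

```typescript
// Sanity check of the inflation figures quoted above (assumes "initial universe"
// means the 2,520,000,000 SHFT maximum supply).
const maxSupply = 2_520_000_000;
const annualInflationRate = 0.06;                          // 6% p.a., split across three 2% sub-types

const expectedPerYear = maxSupply * annualInflationRate;   // 151,200,000 SHFT
const expectedPerMonth = expectedPerYear / 12;             // 12,600,000 SHFT
const observedPerMonth = 11_767_813;                       // figure reported above

console.log(expectedPerMonth);                             // 12600000
console.log(observedPerMonth < expectedPerMonth);          // true: actual monthly issuance runs below the cap
```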

You can find our Shyft Network block explorer here: Shyft POA Explorer. For updated information on Shyft Network's token supply, please check out Shyft Network, SHFT live marketcap, chart, and info | CoinMarketCap.

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution on the blockchain while ensuring user data is protected.

Visit our website to read more: https://www.shyft.network, and follow us on Twitter, LinkedIn, Discord, Telegram, and Medium. Also, sign up for our newsletter to keep up-to-date.

Shyft Network — Token Distribution and Economics Update was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokeny’s Talent|Laurie’s Story

Laurie Cappelle is a Digital Marketing Intern at Tokeny.

Who are you?

Hello everyone, I am Laurie, and I am 23 years old. I was born in the north of France, close to Lille, where I studied for most of my life, but I have always loved traveling, so I finished my studies in Italy and Spain. I now live in Granada, Spain, and I love the city. In my free time I love to hike, take pictures, and spend time on social media.

What were you doing before Tokeny and what inspired you to join the team?

After studying business administration, I realized that what I really liked was marketing, so I decided to do a master’s in marketing. When I saw that a position was available at Tokeny, I immediately thought it would be a good opportunity for me: I would help across all areas of marketing and learn a great deal. Moreover, even though I did not know much about blockchain and tokenization, I had always found it interesting.

How would you describe working at Tokeny Solutions?

Working at Tokeny is fascinating. I am learning new things every day, about marketing as well as about the blockchain and tokenization world. Moreover, all my colleagues are really nice and helpful; they are always there to help if we need it.

It is also a pleasure to see that the company does its best to be flexible and to make us feel good at work, and we have colleagues coming from all over the world.

What are you most passionate about in life?

My passion is traveling. I love discovering new cultures, languages, food, landscapes, and cities. I also really enjoy taking pictures and sharing them on social media.

Moreover, I like learning languages. I speak fluent French, Spanish and English, a bit of Italian, and I am learning Russian. 

I really appreciate learning new things and, since I started working at Tokeny, I have been discovering the blockchain universe and the potential it has in the financial world.

What is your ultimate dream?

I just want to be happy, have the people I love close to me, and travel to as many places as possible. I think that everyone should be able to have a happy life, living where they want to live and working on something they like.

What gets you excited about Tokeny’s future?

I see that Tokeny is growing very fast and that the tokenization sector is only just beginning. I believe that Tokeny has a bright future and will help revolutionize the banking sector, and I am glad to be doing my internship here.

She prefers:

Tea ✓ / Coffee ✓
Movie / Book
Work from the office ✓ / Work from home
Dogs ✓ / Cats ✓
Call / Text
Salad ✓ / Burger ✓
Ocean ✓ / Mountains
Beer ✓ / Wine ✓
Countryside ✓ / City ✓
Slack / Emails ✓
Casual / Formal
Crypto ✓ / Fiat
Morning ✓ / Night


The post Tokeny’s Talent|Laurie’s Story appeared first on Tokeny.


Identosphere Identity Highlights

Identosphere 118: Identity Harms • Transmute, Beta Testers Wanted • GBA launches Battery Passports

A weekly digest on self sovereign identity: upcoming events, company news, policy, organizational updates, standards development and more, For Over 2 Years!!! Support us on Patreon!!!
Welcome to Identosphere - We Gather, You Read

Please contribute to our efforts by PayPal or Patreon! We’ll keep aggregating industry info. Be sure your blog has an RSS feed so we can better track and feature your content! Send us a message at newsletter@identosphere.net if you have an RSS feed you want us to track for this digest.

Upcoming

Entra Verified ID: A trustworthy way to verify remote employees 1/19

Identity, Authentication and the Road Ahead: A Cybersecurity Policy Forum 1/25

GS1 Global Forum 2/13-16

DID:Day 3/1 around ETHDenver

APAC Digital Identity unConference 3/1-3 Bangkok, Thailand

Thoughtful Biometrics Workshop virtual unConference 3/13-17 [Registration]

Internet Identity Workshop 4/18-20, Mountain View, CA

Literature

A Socio-Ecological View Of Decentralized Identity Foresight News

As our transition to a more digital society continues, what will this look like from the individual’s perspective, and to what extent will the individual have any agency in how their digital identity is manifested?

What if your identity ecosystem caused pollution? Wider Team 

Talking about the identity harms white paper from one of the primary authors. 

The paper looked at identity systems that enabled predators to hound a child to suicide and facilitate gambling addiction without a responsible person to blame.

Identity ecosystems are built up over time with legacies of cultural assumption and prejudices, of old choices and narrow contexts. The root causes of these harms are untraceable, and the root causes are unaccountable.

Complete Knowledge (paper) James Austgen, Kushal Babel, Vitalik Buterin, Phil Daian, Ari Juels and Mahimna Kelkar

Powerful new technologies such as TEEs and MPC threaten to upend the classical proof-of-knowledge landscape. CK offers a way to prevent such encumbrance of secrets and restore the intuitive idea of knowledge being attributable to a single entity

Phil Windley SSI Doesn't Mean Accounts Are Going Away

SSI offers new ways of supporting relationships, especially ephemeral ones, that means companies need to store less.

Markets

Market Signals — Increased Interoperability, Regulations, and Adoption to Come! Indicio

US regulators will require tech giants to adopt non-proprietary identities, says Mike Engle. “VPNs will give way to identity-based perimeters for the virtual workforce,” says Engle in Security Boulevard. “2 billion passwords were leaked in 2021 alone, and roughly 70% of breached passwords are still in use.”

Predicts 2023: Users Take Back Control of Their Identities (source) Dr Ant Allan

We have apparently appeared in Gartner reports, but they are behind a paywall. Someone posted this pic (but only one VC format puts schema formats and definitions on the ledger; they also call it a Verifiable Claim, not a Verifiable Credential).

Explainers

Using Decentralized Identifiers (DIDs) to Authenticate Your Devices (Device Arbitration) Transmute

[101] What is Decentralized Identity verification, its Application, and Drawbacks? Abhishek Pratap Singh

AnonyomeLabs How Do Blockchains Provide the Trust Foundation for Decentralized Identity-Based Apps? the VDR allows users and services to verify their authenticity by proving they hold the private key corresponding to the public key written to the VDR. It is often called a decentralized PKI.

Entrust DECENTRALIZED IDENTITY – KNOW YOUR CUSTOMER (KYC)

Myths and Truths About Decentralized Identifiers Curity

Myth: DIDs Require Blockchain

Myth: DIDs Should Contain Personal Data

Myth: DIDs Are Self-Controlled Identities

Truth: DIDs Do Not Solve All Problems

[ContinuumLoop] Trust Registries: A Real-World Example with Bonifii’s MemberPass®: increase trust and streamline business operations in various industries.

Company news

Metadium 2.0: Our Roadmap Metadium

The Metadium platform is developed with the expansion of META ID and the availability of smart contracts as a top priority. The platform will provide EVM compatibility for an easy onboarding process and the smooth processing of tasks (deployment, execution, etc)

[Video/Podcast] Re-architecting the internet around the individual with Evin McMullen, Co-Founder & CEO of Disco Crypto Sapiens

Anyone who has met Evin knows she isn't just brilliant but capable of breaking down complex subjects into fun, accessible bits, and this interview is no different. 

BetaTesters Wanted Transmute

We are looking for users with interest or experience in digitally securing a supply chain ecosystem using cryptographically verifiable data.

Standards Work

New Baseline Protocol Reference Implementation Hits Milestone EthEntAlliance

Recently, the Baseline Core Developer Community announced on the Baseline Show the completion of a major milestone for the third Baseline Reference Implementation (BRI-3), which focuses on simplifying the implementation and use of the Baseline Protocol.

Policy

Digital Identity.NZ Will 2023 be remembered as ‘the melting pot year’ for digital law?

The EU is a great example where in the fullness of time the upcoming Data Act will sit beside the Data Governance Act, the GDPR, the Free Flow of Non-Personal Data Regulation, the Open Data Directive, the Digital Markets Act and the Digital Services Act as part of the European strategy for data, not to mention EIDAS 2.0.

IAM Not SSI

2023: Perspectives from the ForgeRock C-Suite ForgeRock

Predictions on insider threats, passwordless authentication, artificial intelligence, and more Few industries move as quickly as cybersecurity, broadly, and the identity and access management (IAM) segment, specifically.

DWeb

Of course the attention economy is threatened by the Fediverse John Udell - I’m no longer employed in the attention economy. I just want to hang out online with people whose words and pictures and ideas intrigue and inspire and delight me, and who might feel similarly about my words and pictures and ideas

Working with Mastodon lists John Udell - First I encapsulated the relevant APIs in a pair of tables provided by the Steampipe plugin for Mastodon: mastodon_list and mastodon_list_account.

The Decentralized Internet: A Future State - ConsenSys Mesh begins a two-month series.

The Decentralized Internet: A Future State will explore decentralized storage, decentralized identity, decentralized media, and the new realities these make possible. The series will explore how the premise of increasing decentralization can bring us closer to that original promise.

Thoughtful

Kind, Friendly Designs David Smith - I don’t want to make apps that are unkind. I want apps that are encouraging and build up their users. I want to make apps that don’t make you feel bad about yourself.

Layoffs are bullshit Werd.io - Layoffs do not solve what is often the underlying problem, which is often an ineffective strategy, a loss of market share, or too little revenue. 

Ben says Werd.io - Speaking loudly about ethics in tech and media is more important than it's ever been. And it'll only become more important as time goes on. Use your voice - please.

Web 3

WeForum Going mainstream: four Web3 developments to watch in 2023 - The global Web3 market was worth an estimated $3.2 billion in 2021, and is projected to continue growing.

VentureBeat Next wave of DeFi will be driven by decentralized identity solutions - DIDs could be designed to always stay in line with legislation in a given jurisdiction, meeting the regulators halfway and preventing the regulations from being broken.

Organization

GitCoin Passport Sybil Defense Gitcoin

gm to everyone who has distinguished themselves from Sybil attackers by creating a @GitcoinPassport and adding a bunch of stamps 

Quick thread on why we offer so many different ways for you to indicate your trustworthiness to systems like Gitcoin's Grants program 

Global Battery Alliance Launches World’s First Battery Passport Proof of Concept GlobalBattery

supported by @kryha_io's Re-Source tech! The GBA also links to CIRPASS, an EU initiative for  digital product passports for batteries (tech support by @EnergyWebX). - @GreenAltCrypto

Hyperledger Mentorship Program

All the blogs from the 2022 Hyperledger mentees are now online! @Hyperledger

Thanks for Reading

Read more \ Subscribe: newsletter.identosphere.net

Please support our efforts by Patreon or Paypal

Contact \ Submission: newsletter [at] identosphere [dot] net


auth0

Mastodon for Developers: Everything You Need to Know

Learn how to use Mastodon effectively as a developer.

KuppingerCole

Cloud Access Governance


by Paul Fisher

Across the globe there has been a significant increase in the adoption of cloud and multi-cloud environments, as businesses scramble to take advantage of digital transformation. With more clouds comes more access and more data spread across expanding IT infrastructures. With a further shift towards Infrastructure as Code (IaC), there is now a huge challenge to keep everything secure with advanced cloud access and governance tools. This Whitepaper looks at the issues and the options available to IT managers and security strategists.

Going Passwordless – Separating Identity and Authentication


by Alejandro Leal

Identity and Authentication

Digital transformation can be defined as a process that organizations go through to deliver digital services to their customers and consumers in the Digital Age. Essentially, delivering digital services requires the management of the digital identities of consumers, customers, and business partners in a secure and seamless manner. Therefore, the success of digital transformation initiatives depends on managing access and managing these digital identities.

In contrast to the increased centralization of digital identity, the authentication layer has become more complex and fragmented. However, it's important to understand the difference between identity and authentication before we get into passwordless authentication. Identity is who you are. Identity proofing is about knowing whether someone is who he or she claims to be. It's the process of verifying the identity of users based on life history, biometrics, and other factors before granting them access to an application or system.

On the other hand, authentication is an ongoing identity-proofing process that ensures both the identity of digital users and the integrity of their devices. Nevertheless, disruptive login experiences and the continued reliance on passwords are creating significant challenges for consumers and enterprises. As a result, organizations are starting to adopt new authentication mechanisms that go beyond the traditional username and password. This passwordless approach leverages public-key encryption and open standards to offer greater flexibility and increase both security and convenience.

The Birth of Passwordless

In today's digital world, identity theft and credential-based attacks are some of the most pressing concerns for companies and consumers alike. For years, the use of passwords as the primary method of authentication has made identity a nuisance in the digital world. Since data breaches are often the result of stolen credentials and compromised passwords, organizations are looking to adopt more modern forms of authentication that can help close the security gaps that are associated with the use of passwords.

As a result of the security risks and inconvenience of passwords, businesses and organizations are starting to embrace passwordless authentication technologies. The evolution of passwordless was driven by the adoption of multi-factor authentication (MFA) and single sign-on (SSO). The combination of these two technologies, coupled with the exponential growth in smartphone use and the development of open standards such as FIDO2 and WebAuthn, has further accelerated the adoption of passwordless systems.
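The core mechanism behind FIDO2/WebAuthn-style passwordless login is a public-key challenge-response: the authenticator holds a private key, the relying party stores only the corresponding public key, and every login is a fresh signature over a server-issued challenge. The sketch below illustrates that principle in Python using the cryptography package; it is a simplified illustration of the idea, not the actual WebAuthn protocol or any vendor's product.

# Minimal sketch of the challenge-response idea behind FIDO2/WebAuthn-style passwordless
# authentication. Illustrative only: real deployments use a WebAuthn library and a hardware
# or platform authenticator, not raw key handling in application code.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the authenticator creates a key pair; only the public key goes to the server.
private_key = ec.generate_private_key(ec.SECP256R1())  # stays on the user's device
public_key = private_key.public_key()                  # stored by the relying party

# Authentication: the server sends a random challenge, the device signs it, the server verifies.
challenge = os.urandom(32)
signature = private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("login approved: signature matches the registered public key")
except InvalidSignature:
    print("login rejected")

No shared secret is ever stored on the server, which is why there is nothing to phish or leak in the way a password database can be.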

Legacy MFA solutions were supposed to solve the problem with passwords, but these still rely on the password as a backup or first factor of authentication. MFA requires users to provide two or more factors in order to be authenticated: something they are, something they have, and something they know. Unfortunately, some of these factors are prone to phishing attacks, such as mobile SMS codes, voice calls, push notifications, and one-time passcodes (OTP).

SSO is an authentication method that allows a user to enter one set of login credentials (such as name and password) in order to access multiple applications. With an SSO mechanism in place, users only need to authenticate themselves once regardless of which applications or websites they access. While SSO can be paired with password-based authentication, integrating it with passwordless access is the key to unlocking its full potential.

Separating Authentication from Identity

Organizations must choose between adopting a single identity platform or maintaining multiple fragmented identity systems as they move to the cloud. The separation of authentication from identity providers is being implemented by businesses and organizations in order to fully leverage passwordless technologies. It is important for passwordless technologies to have a passwordless authentication layer in front of the various access management systems.

Passwordless authentication should work across everything – all attack surfaces and identity sources, apps and devices, the VPN, SSO, Azure AD, operating systems, workspaces, servers, and whatever your organization has in place. Some solutions in the market are using passwordless at the device, and then federating to other access management services or directly into applications.

Separating identity and authentication matters. If implemented successfully, it gives users a frictionless experience and a consistent authentication approach, and IT teams are no longer overburdened with managing multiple identity and MFA products. In this way, the potential of passwordless technologies can be fully leveraged.


OWI - State of Identity

Synthetic ID vs Thin-File

What is a synthetic identity and who is doing it? On this State of Identity podcast, host Cameron D’Ambrosi and Kurt Weiss, Vice President of Enterprise Sales at Ekata discuss synthetic identity and the levels of sophistication. Can it be solved, and what are the keys to solving the problem? 


1Kosmos BlockID

Zero Trust and Customer Experience

There is a classic juxtaposition within security controls. Organizations need to make data and services available, but if it’s too easily accessible, too open, then a data breach can occur. On the other hand, if data and services are too restricted then the controls are marginalized and ignored. Organizations struggle with balancing the risk between easy access and advanced controls. Whichever approach is taken impacts the user experience, and both can lead to less than desirable results if done poorly.

I’ve asked many IT professionals whether they consider the end user when implementing a new security protocol. Surprisingly, the answer is often that they don’t. The reasoning behind my question is simple: if you know more about the users being asked to perform a task, the task can be implemented with less friction.

Friction, or the user experience, is a critical consideration in security. To achieve the desired security outcome, IT organizations must understand and develop with their customers’ motivations and behaviors in mind. In doing so, the intended ask will be met with less resistance and higher adoption, thereby improving overall security.

Why SSO Is Still A Must

Let’s clarify with something first – SSO is still a must in an enterprise infrastructure stack.

An end user will not type in their credentials for every access point. But, SSO in its current form and with 2FA or MFA as part of the access flow does not meet Zero Trust standards. And this construct ties back into what I started with above. Organizations need to balance the risk between easy access and advanced controls.

Zero Trust is a proactive security approach that continuously verifies users, devices, and services before trusting them. This approach is commonly summarized as “never trust, always verify”. Essentially, Zero Trust assumes that anything connecting to a system is a potential threat that should be verified before earning trust. But to do that security teams need to know as much about the identity accessing the resources as possible. Without this fundamental knowledge of identity, security is more hope-based than fact-based and impacts the effectiveness of a Zero Trust architecture.

On its own, Zero Trust sounds like it will cause friction at the time of user access: a poor user experience. But that doesn’t have to be the case. Many of the access controls in existence today are there because end users create poor passwords, and these passwords can be hacked or even phished. So the transition to a Zero Trust architecture can be an opportunity for organizations to implement technologies that improve security and the user experience at the same time.

How can that be done? With Identity. What does identity have to do with Zero Trust architecture? It’s a pillar of Zero Trust because when you verify user identity at each point of access, you proactively verify users before a breach can happen. This is in line with the “never trust, always verify” core principle of Zero Trust. But to securely authenticate a user, one must first implement an indisputable identity proofing process, because an indisputably proofed ID involves triangulating a user claim with biometrics. Implementing this element of identity management ensures that every access attempt is tied to a trusted and verified identity.
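As a rough illustration of that per-access check, the sketch below (Python, with entirely hypothetical names and signals) re-evaluates identity, device, and sensitivity on every request instead of trusting a one-time login; in a real deployment these signals would come from identity proofing, device posture, and policy engines rather than simple booleans.

# Hypothetical sketch of "never trust, always verify": every request is re-checked.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    identity_verified: bool    # e.g. outcome of a biometric or verified-credential check
    device_healthy: bool       # e.g. device posture attestation
    resource_sensitivity: int  # 1 = low, 2 = medium, 3 = high

def step_up_verified(req: AccessRequest) -> bool:
    # Placeholder for an additional check (e.g. a fresh biometric match).
    return False

def authorize(req: AccessRequest) -> bool:
    # No implicit trust: each signal must pass on every request, not just at first login.
    if not req.identity_verified:
        return False
    if not req.device_healthy:
        return False
    # Highly sensitive resources require step-up verification.
    return req.resource_sensitivity <= 2 or step_up_verified(req)

print(authorize(AccessRequest("alice", True, True, 2)))   # True
print(authorize(AccessRequest("bob", False, True, 1)))    # False: identity not verified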

The result is a secure access infrastructure that is based on verified identities tied to the user’s biometrics. So instead of using passwords and trying to secure them with additional authentication methods, the user’s identity becomes the access method.

1Kosmos Supports Zero Trust and a Better User Experience

The 1Kosmos BlockID platform ensures that individuals are who they claim to be by using an identity-based approach to authentication. We bring identity into the security forefront so that organizations implementing a Zero Trust infrastructure know with certainty who is accessing IT assets and online services.

This means we have a quick and convenient way for users to self-verify their identity using government, telco, and banking credentials. Then, once verified, workers, citizens, and customers use their digital identity at login or transaction approval. This identity pre-proofing injects a level of trust into the Zero Trust implementation and provides users with a frictionless experience. Organizations can implement their Zero Trust deployment with a significantly improved access experience and a high level of assurance for the identity on the other side of the digital connection.

The post Zero Trust and Customer Experience appeared first on 1Kosmos.


Ontology

Meet the Team: Ontology’s Community Manager for India, Dixit

What’s your name and where are you from?

My name is Dixit and I am from New Delhi, India.

Tell us a bit about yourself.

I am a blockchain enthusiast and also a web developer. Outside of tech and development, I also have a good experience working in marketing and communications.

What kind of work do you do on a day-to-day basis?

I am the community manager for Ontology, India and I joined in December 2021. I communicate with the Indian community members and provide them with updates in Hindi Language. I also manage the Ontology, India Twitter account. I find this is an interesting activity because it allows me to interact directly with Ontology’s close-knit Indian community.

In your opinion, what makes Ontology stand out from other blockchains?

I think the fact that Ontology has been growing continuously for 5 years now is what makes it stand out from other blockchains. This shows that the team is focused on continuous innovation and is here for the long term, which is particularly important during challenging market conditions such as those we’re currently facing.

What is the most exciting part of the project you’re working on for Ontology?

I’m really impressed by the ONT and ONG token models. Both models have separate functionalities but they still work together in harmony. The low gas fees and flexible API also attract users — so I see huge value in those elements of Ontology too.

What has been the most important moment in your career so far?

The most important moment for my career so far was taking the time to really understand Ethereum in 2016. Despite being involved in Bitcoin since 2014, my ideas in the space were limited to scalability infrastructure or payment systems for Bitcoin. After truly understanding how smart contracts worked, however, my mind was opened to all the possibilities of financial and non-financial systems that could be fully run on a blockchain.

What are you most excited for in your future at Ontology?

Coming into the cryptocurrency world was the best move I made and worked for. The past few years have been life changing for me, thanks to blockchain.

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Meet the Team: Ontology’s Community Manager for India, Dixit was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Transmute TechTalk

See the Global Supply Chain with Knowledge Graphs and UN Web Semantics

This article was based on Transmute Solutions Architect Nis Jespersen’s ‘UN/CEFACT Linked Data’ presentation from December 2022.

Leading the UN Web Vocabulary project, I presented at the December 2022 UN/CEFACT Forum. The UN Forum sessions are not recorded, so I thought I would just capture the main points in this article.

The essential part of my presentation was a live demo, building a supply chain knowledge graph from the bottom up. In doing so, I gradually introduced the full tech stack in play:

- The essentials of APIs, JSON and JSON Schema
- Adding semantics with JSON Linked Data
- Building Knowledge Graphs from LD files
- The important role of standardized Web Vocabularies

I typically also talk about Verifiable Credentials and Decentralized Identifiers. But not this time — this is all about UN semantics and data graphs.

Introducing The UN/CEFACT Web Vocabulary

The UN/CEFACT Buy-Ship-Pay model is the undisputed semantic model for terms in global trade. It has been around for decades, and is broadly recognized, adopted and implemented.

UN/CEFACT Web Vocabulary brings this model to the web, expressing the existing trade terms as a library of so-called Uniform Resource Identifiers (URIs).

https://vocabulary.uncefact.org/

The above example shows the definition of the term Trade Party, behind the URI https://vocabulary.uncefact.org/TradeParty. The URI itself is great for unambiguously expressing intent; resolving it to the above documentation page makes it human-understandable too.

We will come back to the UN/CEFACT Web Vocabulary shortly and why it matters. But for proper context, we will start a couple of layers down the tech stack, level-setting on some API essentials.

APIs, JSON and JSON Schema

Since you are reading this, there is a pretty good chance that you have heard of APIs before.

An API is a way for computer systems to communicate over HTTP. It is the same protocol serving you this article right now; an API is just tailored to be invoked by computers instead of humans. This is done by stripping away all the graphical elements and putting more emphasis on the structure of the data.

JSON Schema, Defining Data

APIs are made up of “endpoints”, each of which can define a data structure for what you send to it (the request), and another data structure for what you can expect to get back (the response). A definition of such a data structure is called a schema. Typically, data gets encoded in the JSON syntax, so you can think of an API endpoint as consisting of a Request JSON Schema and a Response JSON Schema.

The Request and Response JSON Schemas of a sample API.

Each JSON Schema defines, for example, hierarchical structures and the naming of attributes. Note that a JSON Schema does not contain data, only how data must be structured.

JSON, the Actual Data

JSON files carry actual data. JSON is just a file format, much like .doc or .txt. A JSON file can be validated against a JSON Schema — this is how an API controls what data gets exchanged.
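As a minimal illustration of that validation step, the snippet below checks a simplified, hypothetical document against a simplified, hypothetical schema using the Python jsonschema package; an API framework or gateway would typically run the equivalent check on every request and response.

# Validate a JSON document against a JSON Schema (simplified, hypothetical shapes).
from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "properties": {
        "type": {"type": "string"},
        "invoicerParty": {
            "type": "object",
            "properties": {"type": {"type": "string"}},
            "required": ["type"],
        },
    },
    "required": ["type", "invoicerParty"],
}

data = {"type": "TradeTransaction", "invoicerParty": {"type": "TradeParty"}}

try:
    validate(instance=data, schema=schema)  # raises ValidationError if data does not conform
    print("valid")
except ValidationError as err:
    print("invalid:", err.message)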

While JSON is designed for data transfers, it is also quite readable by humans:

{
  "id": "https://sales.online-shop.global/inv/0000112318",
  "type": "TradeTransaction",
  "applicableTradeSettlement": {
    "type": "HeaderTradeSettlement",
    "invoiceDocument": {
      "id": "urn:uuid:9c3c66a6-f49f-485e-a7b2-013fb4a0e0a8",
      "type": "Document"
    },
    "invoicerParty": {
      "id": "did:web:online-shop.global",
      "type": "TradeParty",
      "postalAddress": {
        "type": "TradeAddress",
        "attentionOf": "Nis",
        "streetName": "Runebergs Alle",
        "cityName": "Copenhagen",
        "countryName": "Denmark"
      }
    },
    "invoiceeParty": {
      "type": "TradeParty",
      "postalAddress": {
        "type": "TradeAddress",
        "streetName": "Sunshine Ave",
        "cityName": "Austin",
        "countrySubDivisionName": "Texas",
        "countryName": "USA"
      }
    }
  }
}

Unless you have a severe case of code-allergy, you should intuitively get the gist of this JSON file: “Trade Transaction”, “Invoicer Party” and “Invoicee Party” tell you this is something about an invoice.

If you know UN/CEFACT you might even recognize the exact terms used. Trade Transaction, for example, means something very specific to those who “speak the UN/CEFACT language”. Such a common language helps the developer on the receiving side interpret the data according to the sender’s intention.

While such a “common language” is much better than nothing, the human interpretation aspect entails certain challenges:

- Costly, as developers must make data mappings on “both sides” of the API.
- Error prone, because all human involvement is, even assuming alignment to a common specification is established.
- Does not scale: if you have 10 customers using your API, they must each have a developer team doing data mapping, 10 times over.
- API breaking: a live API cannot just be changed to align to UN/CEFACT without breaking the API contract (JSON Schemas), greatly displeasing your customers.

While many organizations accept these shortfalls of working with raw JSON, there is a much smarter way, namely…

JSON Linked Data

What traditional integration developers do is add the context needed for the target computer to work with the data. JSON-LD allows the sender to add this context. Literally, using a keyword called @context. The context maps the “human friendly” terms used in the JSON to “machine friendly” URIs.

In non-technical terms: when the sender is more explicit, it is less ambiguous for the receiver to understand the message.

Adding context switches from “encode once, interpret anywhere” to “interpret once, understand everywhere” which is great for scalability economics.

Even better: adding a line with the @context definition into your JSON doesn’t even break your existing APIs! If the receiver does not have JSON-LD support, the JSON library will just ignore this attribute.

As an example, the invoiceDocument attribute in the earlier example really isn’t anything but a string. But the @context maps this to a computer-friendly URI such as https://vocabulary.uncefact.org/invoiceDocument.

The example below was the first live demo of my presentation. It shows how adding just the @context lets the Linked Data processor automatically structure the data.

Adding an @context statement to the JSON (left) makes the data processable by a computer (right).
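For readers who want to try this themselves, the sketch below uses the PyLD library to expand a small document. The context shown is a simplified, hypothetical mapping of two terms, not the full UN/CEFACT vocabulary.

# Expanding a JSON-LD document: terms are replaced by the URIs the @context maps them to.
from pyld import jsonld

doc = {
    "@context": {
        "TradeParty": "https://vocabulary.uncefact.org/TradeParty",
        "cityName": "https://vocabulary.uncefact.org/cityName",
    },
    "@type": "TradeParty",
    "cityName": "Copenhagen",
}

print(jsonld.expand(doc))
# [{'@type': ['https://vocabulary.uncefact.org/TradeParty'],
#   'https://vocabulary.uncefact.org/cityName': [{'@value': 'Copenhagen'}]}]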

To recap: with very little effort, we can add precise semantics to our data. We make the data meaningful. Our next step will be to turn that meaning into knowledge.

Knowledge Graphs

The JSON-LD processing we just saw above actually picks apart the JSON, turning it into individual basic statements. Each statement is called a triple, because it consists of three things: subject, predicate and object.

For example: “The consignment’s (subject) consignor (predicate) is a business called Global Online Shop (object)”.

Another triple could be “Global Online Shop’s location is the UN/LOCODE DKCPH”.

We can piece together these two statements: Consignment — Global Online Shop — DKCPH. This way we can infer that the consignment is going to Copenhagen. The JSON-LD file is actually a data graph, which we are traversing for insights.
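Those two statements can be written down directly as triples. The sketch below uses rdflib with illustrative node and predicate URIs (not the exact UN/CEFACT terms) to build the little graph and traverse it.

# Two triples and a traversal: consignment -> consignor -> location.
from rdflib import Graph, Literal, Namespace, URIRef

UNV = Namespace("https://vocabulary.uncefact.org/")  # illustrative predicate namespace
g = Graph()

consignment = URIRef("urn:example:consignment-1")
shop = URIRef("did:web:online-shop.global")

g.add((consignment, UNV.consignorParty, shop))  # "the consignment's consignor is Global Online Shop"
g.add((shop, UNV.locode, Literal("DKCPH")))     # "Global Online Shop's location is UN/LOCODE DKCPH"

for _, _, consignor in g.triples((consignment, UNV.consignorParty, None)):
    for _, _, locode in g.triples((consignor, UNV.locode, None)):
        print("consignment linked to UN/LOCODE", locode)  # -> DKCPH (Copenhagen)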

How a sample Bill of Lading data snippet turns into a data graph with JSON-LD.

The above diagram illustrates the data graph aspects of a Bill of Lading document which could be transferred through any standard API. But because JSON-LD is based on URIs (and not vague strings like “a business called Global Online Shop”), the graph does not have to be limited to just one JSON file. Say we have another API which deals with invoices, and we route inbound messages to a graph database. A graph database can recognize common URIs and easily deal with overlapping graph segments. This way, we can continuously piece together larger and deeper data graphs.

Data graphs from two separate JSON-LD files “snap together”.

The above example is also derived from my live demo. By importing a Commercial Invoice “on top of” the previously imported Bill of Lading, we realize that:

- The consignee of one document is the same organization as the invoicer party of the other.
- We expanded the common knowledge of this organization, now knowing both its postal address and UN/LOCODE.

We did this with literally no manual data mapping. The knowledge graph just “snaps” into place like magnets.

A knowledge graph constructed from very large amounts of JSON-LD files. Pretty-looking knowledge graph on the right courtesy of https://medium.com/@annalienk/investigation-of-the-flow-of-tweets-d0b1c31d915b.

This means we can dump massive amounts of JSON-LD files at the graph database. Data can come from different origins, APIs, data schemas, etc — it will all still snap together automatically. A key feature of graph databases is their ability to reveal hidden relationships across siloed data.
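A minimal sketch of this snapping together, assuming rdflib 6+ (which parses JSON-LD natively): two separately produced documents are loaded into the same graph, and everything known about the shared URI did:web:online-shop.global can be read back regardless of which file contributed it. The contexts and terms below are simplified examples.

# Merge two JSON-LD documents into one graph; they join on the shared @id.
import json
from rdflib import Graph, URIRef

context = {"@vocab": "https://vocabulary.uncefact.org/"}

bill_of_lading = json.dumps({
    "@context": context,
    "@id": "urn:example:bol-1",
    "consigneeParty": {"@id": "did:web:online-shop.global", "locode": "DKCPH"},
})

invoice = json.dumps({
    "@context": context,
    "@id": "urn:example:invoice-1",
    "invoicerParty": {"@id": "did:web:online-shop.global", "cityName": "Copenhagen"},
})

g = Graph()
g.parse(data=bill_of_lading, format="json-ld")
g.parse(data=invoice, format="json-ld")

shop = URIRef("did:web:online-shop.global")
for predicate, obj in g.predicate_objects(shop):
    print(predicate, obj)  # locode DKCPH (from the B/L) and cityName Copenhagen (from the invoice)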

Extracting Knowledge

Not having to worry about the hassle of fitting data together, we can focus our efforts on data analysis. We can do this with standard data queries. For example “return all consignments to be delivered in Denmark”.

In my final demo at the UN/CEFACT Forum I took it a step further, though, introducing some basic data science tooling. This means applying graph algorithms and machine learning to the data graph’s explicit relationships, in search of its implicit relationships. I used Neo4j’s Graph Data Science, which offers a rich library of such features.

Specifically, I ran the Betweenness Centrality graph algorithm on the knowledge graph we created earlier, built from a Bill of Lading and a Commercial Invoice. The result is illustrated graphically below:

Betweenness Centrality algorithm applied on the previous example.

This result reveals how often the shortest paths between nodes pass through each node. Unsurprisingly, the “Global Online Shop” node scores high; remember, this was the node which connected the two subgraphs.
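The demo ran this in Neo4j Graph Data Science; as a stand-in, the sketch below reproduces the idea with networkx on a toy graph shaped like the two joined subgraphs, so the bridging node comes out with the highest betweenness score.

# Betweenness centrality on a toy version of the joined Bill of Lading / Invoice graph.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("BillOfLading", "Consignment"), ("Consignment", "GlobalOnlineShop"),    # B/L subgraph
    ("Invoice", "GlobalOnlineShop"), ("GlobalOnlineShop", "PostalAddress"),  # invoice subgraph
])

scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
# "GlobalOnlineShop" scores highest: most shortest paths between the subgraphs pass through it.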

Data scientists typically connect multiple graph algorithms in their search for patterns. Here are links to a couple of examples which I have shared in earlier articles:

Determining untrusted subgraphs of the UN Trust Graph.

Determination of Verifiable Credential data originating from untrusted subgraphs, based on the UN Trust Graph concept:
https://medium.com/transmute-techtalk/the-united-nations-trust-graph-d65af7b0b678

Trade Party Community Detection.

Trade Party Community Detection, determined from running a series of graph algorithms on basic verifiable credential issuance patterns:
https://medium.com/transmute-techtalk/neo4j-graph-data-science-with-verifiable-credential-data-98b806f2ad78

Semantics is Everything

We have now gone through the whole tech stack and at the same time progressed from Data to Meaning to Knowledge.

Without semantic context, raw data is meaningless. Traditional APIs depend on human intuition and labor to make sense of data. But we have seen how simple it is instead to add an explicit, declarative context and let computers do all the hard work.

When meaning is automated, we can be much more flexible with our data sources and still shift our attention to gaining knowledge. With all those previously disparate datasets now connected, we can leverage modern algorithms to extend our knowledge into the implicit relationships, answering questions we would never have thought to ask.

Web vocabularies provide the common definition of meaning, and strong web vocabularies are vital to this new infrastructure. The best web vocabularies are governed by authoritative institutions, which are relevant for the domain they define. This way, they gain gravitational critical mass and become not only formal, but also de-facto standards.

Lending from UN/CEFACT’s decades-long established authority, the UN/CEFACT Web Vocabulary is the undisputed global semantic dictionary for terms in trade.

https://vocabulary.uncefact.org/

UN/CEFACT, now available as Linked Data

Nis Jespersen, Transmute’s Solutions Architect, is editor of the United Nations CEFACT JSON-LD Web Vocabulary project.

Connect with Nis on LinkedIn, Twitter, & GitHub

About Transmute: Building on the security and freedom that Web3 promised, Transmute provides all the benefits of decentralization to enterprise teams seeking a cost effective, interoperable, planet-forward experience provided by experts in technology and industry.

Transmute was founded in 2017, graduated from TechStars Austin in 2018, and is based in sunny Austin, Texas. Learn more about us at: http://www.transmute.industries

Connect with Transmute on LinkedIn and Twitter

See the Global Supply Chain with Knowledge Graphs and UN Web Semantics was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

IDnow appoints Bertrand Bouteloup as new CCO

Munich, January 26, 2023 – IDnow, a leading identity proofing platform provider in Europe, welcomes Bertrand Bouteloup to the management team as its new Chief Commercial Officer (CCO). Based in France, he will assume the group-wide commercial end-to-end responsibility.

For the last six years, Bertrand held the position of Vice President of Sales at ARIADNEXT, the French market leader for identity proofing, which was acquired by IDnow in June 2021. Prior to that, Bertrand was Global Director of the Cybersecurity Business Unit at Capgemini, then European Director of Security Services at Unisys, and Managing Director of 8-i, a consulting and integration company. He will be employing his more than 15 years of management experience in the IT industry in his new role as CCO of IDnow.

Bertrand’s previous position of Vice President of Sales within the IDnow group will be filled by Cyril Patou, who most recently held the position of Regional Director for France, Alps and Southern Europe at Clear Skye. Cyril has over six years of experience in the digital identity sector, having worked for One Identity and Ping Identity in France.

Commenting on his appointment, Bertrand said: “I am delighted to be taking on this new challenge within the IDnow group and to drive our revenue targets in 2023 and beyond. As a group, we want to continue to grow closer together and leverage our expertise in the German and French market across borders. One of my priorities will be to bring the different commercial teams closer together, and harmonize targets and processes even further.”

“I have known Bertrand for several years now and I look forward to continuing to work with him in his new role as Chief Commercial Officer. His many years of managerial experience in the identity verification and cybersecurity industry, coupled with his understanding of our company values and his vision for our future with a strategic and business-led approach, means that he is the perfect match for this role,” added Andreas Bodczek, CEO of IDnow.

Wednesday, 25. January 2023

Entrust

Looking to Elevate Operational Efficiency and Brand Impact for Your Bank? Here’s How Flat Cards Can Help

Have you received a new payment card either in the mail or from a branch recently? You may have noticed that many financial institutions are transitioning away from using embossing machines for personalization, and toward flat card technology. Is now the right time for your institution to make the move to flat cards? As we enter a new era for consumer payments, reimagining your card design can maximize efficiency and brand impact to give you a competitive edge.

Review your fleet for maximum efficiency

To fully understand your efficiency, it’s important to evaluate the total cost of ownership, or TCO. TCO is an estimation of all the cumulative expenses associated with purchasing and operating a product. The TCO will provide a way to measure how efficient your current operations are and where you can lift the efficacy of your current technology to outperform what was done previously.

With inflation still high and supply-chain, logistical, and EMV chip shortage challenges persisting, now is an ideal time to make the move to flat cards for your efficiency and brand. Does your current setup prepare you for your continued success in the time of flat card innovation? Are you considering eco-friendliness and durability before you make your next big purchase? Are you taking full advantage of your cards from a branding and personalization perspective? If you are curious about answers to these questions, you are at the right place. Let’s start reviewing what to consider for your financial institution.

- Review your total cost of ownership and identify where you can improve your efficiency. Is your system supported by best-in-class standards for hardware, software, and service? How efficient and durable are your system and its supplies? Find your best possible option in the market to elevate your operations.
- Take control of your future by starting out with the right solution. Does your solution support modularity to enable future development and innovation into your current fleet? Making the right first choice helps you establish a better fleet, and you don’t have to start all over when there is a new technology in the market.
- Consider eco-friendliness for your overall operations. Is your current system enhancing durability of your card compared to the industry standard? By reducing waste and minimizing environmental impact from your operation, you will lead the eco-conversation that brings positive brand impact to your business.

Lift your brand impact with flat card printing

Flat card printing enables card layout flexibility by supporting frontside, backside, horizontal, and vertical orientations and provides more real estate for branding and creativity. Now you don’t need to worry about how your card design will look after applying the protective topcoats. Unique and eye-catching payment cards have a stronger chance of being elevated to the top of the consumer’s wallet.

With the Entrust Sigma DS4 direct-to-card instant issuance solution, financial institutions have an industry-leading security and monitoring solution. By adding the Light Curing Module onto your existing Sigma solution, you can provide up to two times the card life and four dynamic monochrome colors (including metallic gold and silver for a seamless switch from emboss to flat and industry-first white text printing) that enables both maximum efficiency and elevated brand impact. With the enhanced efficiency, brilliance, and durability of Sigma DS4 Instant Financial Issuance System, you will see the number of replacement and reissue cards reduce, and you will see how much customers love their new cards. For a more detailed discussion tailored specifically to your business needs, click here, or reach out to an Entrust representative to have a conversation.

The post Looking to Elevate Operational Efficiency and Brand Impact for Your Bank? Here’s How Flat Cards Can Help appeared first on Entrust Blog.


Demo for a donation

Entrust is exhibiting at this year’s IoT Solutions World Congress in Barcelona at the end of January, and I’m proud to be part of a new initiative Entrust is undertaking at the event we’re calling “Demo for a donation.”

A standard part of all events like this is the ‘swag’ offered by all the exhibitors. But with the event theme of “Game Changing,” we got to thinking that instead of little branded tchotchkes like pens, stress balls or socks, we want to change the game by donating to charities that focus on kids in tech.

Together with our exhibition partner Device Authority, we are contributing to the education and development of young minds in the digital world, the young people who will one day “change the game” in the world of cyber security.

If you’re attending the event, helping us to achieve this aim couldn’t be simpler:

- Drop by booth D441 in Hall 4 and chat with an Entrust or Device Authority expert
- For every badge scanned Entrust will donate $10 to your choice of one of three charities
- You will receive a thank you card confirming the donation

The three charities we have selected are as follows:

- Every Child Online – a UK-based charity that provides refurbished PCs, laptops and tablets to underprivileged children for free. The charity’s goal is to help close the digital divide and improve the quality of digital education offered in schools.
- CODE – a US-based education innovation non-profit dedicated to the vision that every student in every school has the opportunity to learn computer science as part of their core K-12 education.
- Technovation Girls – a global charity that equips young women to become tech entrepreneurs and leaders. With the support of volunteer mentors and parents, girls work in teams to code mobile apps that address real-world problems.

I’ll also be presenting together with an expert from Device Authority at two sessions during the show on the topics of Zero Trust Security in the Connected Supply Chain and The Challenges and Considerations of Preparing IoT for PQ. They’re both hot topics of discussion, so if either (or both) of those sound of interest, come by and check them out, I’d love to know if you have any questions.

As well as feeling good about helping support one of these fantastic charities, you can also get a pick me up in the form of a candy treat from our “Sugar bar,” so be sure to stop by the Entrust/Device Authority booth and speak to one of our experts.

The post Demo for a donation appeared first on Entrust Blog.


KuppingerCole

Security Orchestration Automation and Response (SOAR)

by Alejandro Leal

This report provides an overview of the SOAR market and a compass to help you find a solution that best meets your needs. We examine the SOAR market segment, product/service functionality, relative market share, and innovative approaches to providing SOAR solutions.

IDENTOS

IDENTOS launches new data sharing capabilities with FPX Vale

IDENTOS’ latest FPX version, FPX Vale, features user-led delegation, enriched administrative capabilities, and more

IDENTOS Inc., a leader in digital identity and access management, launched the latest Federated Privacy Exchange (FPX) version of its identity and authorization software API offering as a strong kick-off to 2023. 

The FPX Vale version builds on the company’s previous FPX version (FPX Junction), a cloud-based software product that provides fine-grained API authorization and user-centric identity management capabilities. Enabled by and extending User-Managed Access (UMA) 2.0, FPX allows users to safely share their information with trusted digital service partners. This facilitates the efficient flow of data between users and applications benefiting all parties.

“We are proud to release our latest version, FPX Vale. As identity sits at the root of all digital experiences, we are continually pushing the user-centric experience while keeping security and privacy at the heart of everything. We’re thrilled to empower our customers with added flexibility to execute on use cases that are critical today and in the future.”  – Alec Laws, Chief Technology Officer, IDENTOS

The highly anticipated FPX Vale version incorporates several capabilities, including User-to-User delegation and new API functions, enabling improved administrative flexibility, user convenience, security, and privacy – critical to sharing personal and sensitive data across digital healthcare, financial, and government services. 

FPX Vale key capabilities

-User-to-User delegation at the wallet – FPX Vale now allows a user to delegate access to another user over certain functions, including consent to share personal information on their behalf. The ability to provide authority over sharing personal information is critically required in healthcare, legal and financial settings where a child, senior, or individual may need the support of their family or trusted party to assume their role in a critical scenario. With FPX Vale, individuals can do so without sharing passwords and avoid compromising security. Moreover, determining connection timeframes and revoking access at any time enables many privacy outcomes.

-New Admin API to configure Resource Server Adapter (RSA) – Resource Server Adapter minimizes data source onboarding effort through pre-integration to other FPX components and configurable extension points for data and API integrations. FPX Vale offers administrative flexibility and ease of use.

-New Disable API – FPX Vale allows for entities within the FPX network to now be disabled, using the new Disable API, allowing system administrators to quickly cut-off access to any resource protected by FPX. This provides the administrators complete lifecycle management of ecosystem partners and services.

Availability
IDENTOS’ FPX Vale updates are available today to new and existing customers.

Learn more about FPX Vale
-Release notes: https://developer.identos.com/docs/release-notes/fpx-vale
-Documentation for developers:  https://developer.identos.com

(source: CNW)

The post IDENTOS launches new data sharing capabilities with FPX Vale appeared first on Identos.


KILT

Get Your DID: Now You Can Pay with KILT and PayPal

The new KILT website introduces a way to get a KILT DID (decentralized identifier) without using cryptocurrency. This allows users to have their DID — a “digital fingerprint” at the core of their decentralized digital identity — anchored on the blockchain where no one else can ever delete it. DIDs also provide access to the full range of KILT’s identity solutions.

In the physical world, identity starts with your face or fingerprint. KILT brings this model of identity to the digital world, with a DID serving as your digital fingerprint. Identity is then built around your DID by adding credentials or digital certificates. On KILT, you can anchor your DID on the blockchain. This means that unlike an identity card (which should not be confused with identity itself — it’s just a credential) no external body can cancel or delete the core of your identity.

Anchoring a DID: Pay with KILT

Anchoring a DID requires data storage on the KILT blockchain database, which is replicated among multiple computers. Unlike Ethereum, KILT incentivizes cleaning up this database rather than letting it grow for eternity. Every time a user stores data on the chain, they have to put down a small deposit, which they receive back automatically when they remove the data. Storing a DID on the KILT chain requires a deposit of roughly 2 KILT Coins. The deposit never leaves the user’s account. It is just locked and not available for transfers, voting or staking.

If the DID is no longer required, the user can remove the data at any time, which automatically unlocks the deposit and makes it available again.

Once users anchor their DID on the chain they can use the full range of KILT identity solutions. Since KILT DIDs meet W3C industry standards, they are also compatible with any applications that also resolve the KILT DID method.

Anchoring a DID: Pay without KILT

Many people are interested in enjoying the benefits of decentralized identity. But some don’t want the friction of buying or holding crypto in order to put their DID on the blockchain.

B.T.E. BOTLabs Trusted Entity GmbH (BTE), a subsidiary of BOTLabs, the initial developer of KILT Protocol, has created two solutions to address these adoption challenges:

- A Checkout Service that puts the individual user’s signed DID transaction on the KILT blockchain, and allows payment with PayPal.
- An enterprise solution for buying large numbers of DIDs for their customers or employees and paying once via bank transfer.

In both cases, the user’s DIDs are generated on their own device in their private wallet, and are unique to the user.

These two services utilize a unique feature of KILT: the account that pays the fees and makes the deposit (the service) does not need to be the same as the account that authorized the transactions (the DID owner). This means that the owner of the DID stays in control of their DID while an external account (the service used) can be used to submit the DID operations to the blockchain and lock the deposit on behalf of the user. The user always stays in control of the DID. No one except the user can delete or add to the DID.

Naturally, both solutions are compliant with KILT’s privacy by design. Here’s how they work:

Checkout Service

Individuals who prefer not to pay with KILT can use the Checkout Service provided by KILT partner BTE. This service can be accessed via the KILT website or through KILT’s Sporran wallet. Getting a DID (anchored on chain) using the Checkout Service currently costs a non-refundable service fee of EUR 4.00. Users can pay for this service using PayPal, which may also provide options for paying in other fiat currencies.

The process starts by creating an account in Sporran, which automatically generates DID details as part of the setup process. After clicking “Get your DID” in the Sporran wallet and selecting the Checkout Service option, users are directed to sign with their password in Sporran to prepare the transaction. Then users will be directed to the Checkout Service, which processes their transaction and offers them PayPal to pay for the Checkout Service:

The service then puts the transaction on the blockchain, pays the KILT transaction fees as well as the required deposit, and registers the DID on the blockchain on the user’s behalf. When the order has been processed, the Checkout Service website shows the order has been completed:

Users will be able to see their DID in their Sporran under “Manage DID”.

The Checkout Service has no control over a user’s DID at any point. Once the service makes the deposit required by the blockchain, the deposit is locked forever. Only the user can delete the DID. The DID is now stored on the blockchain, and the user can start building their digital identity around it, knowing that no external party can turn it off.

Enterprise Solution

This service is also relevant for enterprises that want to build on their real-world trust reputation to create a business around digital identity, or that need to equip their customers or employees with DIDs for various reasons without requiring them to acquire KILT Coins. Some enterprises even have policies that prevent them from holding cryptocurrencies.

Working on a larger scale, these enterprises don’t want to go through the payment process for every single DID and would prefer an automated solution. To meet this need BTE developed an enterprise service. Here’s how it works:

- The service allows an entity (e.g. a company) to pay once in fiat for a large number of DID deposits.
- The BTE service then acquires the necessary amount of KILT Coins.
- The company can develop its own wallet for their customers to store their credentials securely using KILT’s open source software, or integrate the Sporran wallet.
- Employees or users of the company’s service can then download the company’s wallet on their device and generate their DID, which is anchored on the blockchain.
- Following a similar process to the Checkout Service, the BTE enterprise service communicates with the company’s software to register DIDs on the blockchain as requested via an API, up to the number of DIDs paid for in advance.

As in the Checkout Service, neither the BTE enterprise service nor the company have any control over the DIDs; they cannot revoke them or remove them from the KILT blockchain.

The Checkout Service and enterprise solution represent a significant step forward for KILT Protocol. These innovations add convenience and flexibility that will drive adoption of KILT DIDs across new audiences and industries.

______

Still have questions? Browse the FAQs below.

FAQs

Q: Where do users get their DID?

A: The DID is generated on your computer when you create your account in your wallet (Sporran for Checkout Service, or the wallet developed by the provider for other services).

Q: Can other people add credentials to my DID?

A: No. You can request credentials and store them in your wallet on your device. You decide which credentials you accept and who you share them with. If you wish to make some credentials public (e.g. email address, social media handles) you may link them to your DID, but no one else can do that for you.

Q: How do I know that BTE won’t cancel my DID?

A: BTE / the Checkout Service cannot remove the user’s DID from the blockchain. Only the user can delete their DID.

Q: Can the Checkout Service / Enterprise service / Company control my DID?

A: No. Only you have control of your DID. Once it’s registered on the blockchain, no one can delete or turn off the core of your digital identity besides you. However, if you later decide you don’t need your DID, you can choose to delete it forever.

Q: Once I have my DID anchored on the blockchain what can I do with it?

A: There are already several services “built on KILT” that utilize DIDs including Social KYC and DIDsign. More applications are in development and will be available soon; keep an eye on Twitter or the KILT blog for announcement updates.

Q: How much does Checkout Service cost?

A: You currently pay a non-refundable €4 (or the equivalent in another currency) using PayPal for the Checkout Service. This includes VAT, the transaction fee, a handling fee for the service, and the equivalent of the deposit.

Q: If I pay for my DID using Checkout Service, can I delete my DID later?

A: Yes, you can delete your DID from the KILT blockchain. If you do so, this DID cannot be recreated.

Q: What happens if I delete my on-chain DID. Is it completely removed from the blockchain?

A: No, records are permanent on the blockchain. However, deleting your on-chain DID invalidates it forever.

Q: If I pay for my DID using Checkout Service and later delete my DID, do I get a refund?

A: No, deposits paid by Checkout Service are not refundable. The deposit is locked forever.

Q: How are deposits calculated?

A: The deposit amount is calculated using a formula based on the size of the data stored, multiplied by a scaling factor that can be decreased or increased by governance vote, e.g.:

deposit = (number of items) * 56 * 0.001 KILT (milliKILT) + (item size in bytes) * 50 * 0.000001 KILT (microKILT)
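A worked example of that formula, with purely hypothetical item counts and sizes just to show how the two terms combine:

# Apply the published deposit formula to hypothetical inputs.
def did_deposit_kilt(num_items: int, size_bytes: int) -> float:
    return num_items * 56 * 0.001 + size_bytes * 50 * 0.000001

# e.g. 2 stored items totalling 500 bytes:
print(did_deposit_kilt(2, 500))  # 0.112 + 0.025 = 0.137 KILT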

Q: If KILT Coins increase in price, will the price of a DID increase too?

A: KILT is a decentralized protocol governed by the community of KILT holders; a decision to change the price of a DID would be made by the community. Any KILT Coin holder could create a proposal to change the price, and the community would vote.

Q: Who owns the KILT blockchain?

A: KILT is a decentralized blockchain, meaning it is owned and governed by the community of KILT Coin holders. Read more about KILT governance here.

Get Your DID: Now You Can Pay with KILT and PayPal was originally published in kilt-protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


New Year, New Launch:  KILT Website Features Enterprise and Consumer Onramps to DIDs


The new KILT.io website is now live! The redesigned site features a vibrant look and feel, with new tools to “Claim your Independence” and regain control of personal data using KILT’s digital identity solutions.

The new website continues KILT’s mission of education about digital identity, DIDs and verifiable credentials. New sections highlight enterprise use cases and ecosystem partners, and the Protocol section includes KILT’s governance framework, the new Constitution, and guidelines for Treasury proposals. Developer resources and documentation provide the information needed to start building with KILT.

In the physical world, your identity starts with your face or fingerprint. KILT brings this model of identity to the digital world, with a DID (decentralized identifier) serving as your digital fingerprint. Setting up a DID is the first step to building your identity with KILT services, and requires a deposit of around 2 KILT Coins. Users can get their DID with KILT via the new website and KILT’s Sporran wallet.

But not all users are familiar with crypto or comfortable buying it. So today’s relaunch introduces a Checkout Service that makes digital identity and DID creation available to a much broader audience, and doesn’t require crypto knowledge or coins.

Created by B.T.E. BOTLabs Trusted Entity GmbH (a subsidiary of BOTLabs, the initial developer of KILT Protocol), the Checkout Service enables individuals to get their DID without using KILT and easily pay for the Checkout Service via PayPal.

Also available is a new DID solution for enterprises to offer DIDs at scale for their employees and customers. Learn more about how enterprises can buy a batch of transactions and anchor the DIDs on the KILT blockchain, while preserving the user’s privacy and control.

So what are you waiting for? Browse the site, Get your DID with KILT or PayPal via the new Checkout Service, and start building your decentralized identity. It’s time to take back control.

About KILT

KILT is a blockchain identity protocol for generating decentralized identifiers (DIDs) and verifiable credentials, providing secure, practical identity solutions for enterprise and consumers.

Discover more on the KILT website, brainstorm KILT use cases in Discord, or follow KILT on Twitter and Telegram to keep up with the latest news.

New Year, New Launch:  KILT Website Features Enterprise and Consumer Onramps to DIDs was originally published in kilt-protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Mar 28, 2023: Pioneering Zero Trust Network Access With Intelligent Microsegmentation

Working from anywhere and the adoption of cloud services from multiple providers have become increasingly common in the post-pandemic era, but this has created new opportunities for cyber attackers to enter and move across networks. A modern approach to network security is essential.

Spherity

Hat-trick in U.S. pharma innovation

Spherity and U.S. pharma?

The U.S. Drug Supply Chain Security Act (DSCSA) aims to secure patient safety by increasing supply chain integrity. Hence, all pharmaceutical supply chain actors should ideally only interact with those that meet the DSCSA Authorized Trading Partner (ATP) status when they engage in electronic product identifier (PI) verifications, in particular in relation to suspicious or returned product investigations.

Spherity is a software start-up specializing in enterprise identity solutions. In association with industry players including Legisym, Center for Supply Chain Studies, rfxcel, SAP, Novartis, Johnson & Johnson, Bristol-Myers Squibb, AmerisourceBergen, GS1 US and HDA, we have addressed the challenge of authenticating a trading partner’s identity and authorizing them in DSCSA-regulated electronic PI verifications that are facilitated by Verification Router Service (VRS). The original solution is based on cryptographically verifiable credentials. This credentialing approach has been developed further collaboratively within an industry collaboration called the Open Credentialing Initiative (OCI). The method introduces a new layer of security, efficiency, and convenience to DSCSA compliance for any VRS-enabled pharmaceutical supply chain actor.

Innovation hat-trick?

The credentialing solution began as a classic technological innovation project in April 2020, with Spherity and other companies representing different segments of the U.S. pharmaceutical supply chain coming together in a cross-functional pilot. Explore the published pilot resources for detailed information. After its successful conclusion, the team realized that true industry-wide innovation will not happen in isolation. A close partnership emerged from the pilot between Spherity and Legisym, an expert in regulatory compliance technologies for the pharmaceutical industry. To standardize and evolve the piloted solution, Spherity and Legisym spearheaded the foundation of OCI, a non-profit industry collaboration, in April 2021. Thus, the innovation has been threefold: at the technological, compliance, and governance levels. 2022 was a pivotal year for the Spherity-Legisym partnership and OCI, yielding not only matured architectural and governance frameworks for the latter but also a commercially ready solution for the former.

Technological innovation: Spherity leverages decentralized identifiers in combination with verifiable credentials (VCs) as a new layer of trust in digital interactions. Each decentralized identifier is uniquely associated with a pharmaceutical trading partner, and that association can be verified cryptographically. A VC is effectively a translation of existing real-world evidence, including trading licenses, into a specific standardized electronic format. These identifiers and credentials are stored securely in a piece of software called a digital wallet. By presenting an organization’s VC to a counterparty through electronic PI verification messages facilitated by VRS, supply chain actors are able to identify trading partners as intended by DSCSA in secure electronic processes. The key advantage of these digital credentials is that, unlike other identifiers (see below), they can be verified independently and automatically.
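
To make the mechanism concrete, here is a minimal sketch of what such a credential could look like as a W3C-style verifiable credential. The credential type, attribute names, and DIDs below are hypothetical placeholders, not the actual OCI schema.

  // Hypothetical ATP credential, sketched as a W3C-style verifiable credential.
  // Type name, attributes, and DIDs are placeholders, not the OCI schema.
  interface VerifiableCredential {
    "@context": string[];
    type: string[];
    issuer: string;                               // DID of the credential issuer
    issuanceDate: string;
    credentialSubject: Record<string, unknown>;   // claims about the trading partner
    proof?: Record<string, unknown>;              // issuer's digital signature
  }

  const atpCredential: VerifiableCredential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    type: ["VerifiableCredential", "ATPCredential"],   // hypothetical type name
    issuer: "did:example:credential-issuer",           // placeholder DID
    issuanceDate: "2022-06-01T00:00:00Z",
    credentialSubject: {
      id: "did:example:trading-partner",               // placeholder DID held in the partner's wallet
      role: "Manufacturer",                            // hypothetical attribute
      dscsaAuthorized: true,
    },
    // proof: { ... }  // signature that lets any counterparty verify the credential independently
  };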

Compliance innovation: Initially, pharmaceutical supply chain actors intended to rely on existing enterprise identifiers for the accurate identification of trading partners in PI verifications, in particular global location numbers (GLN) or data universal numbering system (DUNS) numbers, as well as existing manual or semi-automated look-ups of compliance information, such as trading licenses. However, in electronic interactions, the receiving party cannot be sure that an entity providing the aforementioned identifiers is indeed who they say they are, because their identity and provided details cannot be independently verified. It is also challenging to reliably associate such identifiers with separate evidence of trading authorization, which increases the complexity of due diligence. In addition, there have been cases where GLNs were not well maintained, e.g., they were not active or not associated with the expected entity. Thus, existing means had left a compliance gap that the credentialing method has now plugged. Beyond the technological qualities of VCs, which solve the challenge of assuring that interacting parties are indeed who they claim to be and possess the appropriate trading authorization, Legisym’s assessment of the existing compliance landscape has enabled due diligence processes and available pieces of evidence with a high level of assurance to be adapted for use with this novel technology.

Governance innovation: To bring true value to the US pharmaceutical industry as a whole, ecosystem thinking must be applied. To go beyond an isolated solution and allow for industry-wide interoperability and adoption, Spherity, Legisym, and others co-founded OCI as a governance platform for the credentialing method, a forum for industry stakeholders, and the foundation of an open market for other service providers to join. Under the stewardship of the Center for Supply Chain Studies, the Spherity-Legisym partnership has been a leading force within OCI in the creation of the technological architecture, conformance criteria, and organizational governance, as well as industry-wide alliances and educational efforts.

Gains in productivity, supply chain safety, and efficiencies

Supply chain safety — automation: DSCSA has been enacted to enhance supply chain safety. By enabling the automation of aspects of legal compliance, ATP credentialing directly contributes to fulfilling the promise and intent of the law.

Supply chain safety — auditability: Equipped with monitoring and reporting features, the Spherity solution, named CARO, generates automated audit trails enabling timely investigations.

Productivity — staff time: Credentialing involves the conversion of real-world evidence provided by each trading partner to Legisym into electronically verifiable credentials in a one-off enrollment. Any interactions between trading partners leverage these credentials through automated checks. Thus, the due diligence process in the context of PI verifications becomes a passive, entirely hands-off exercise for trading partners on either side of an exchange. This means that (1) an individual company does not need to undergo a separate due diligence process with each new trading partner and (2) that staff are freed up from due diligence-related tasks. For example, a manufacturer does not need to vet every single dispenser before reacting to a product enquiry because the dispensers have already been vetted within the credentialing system. Legisym’s ongoing monitoring keeps verifiable credentials active for as long as trading partners maintain their businesses in good standing.

Productivity — speed: While DSCSA allows for up to 24 hours, there is a drive by trading partners to achieve response times of under one minute to PI verification requests by using automated electronic systems. Consequently, the ATP check must happen in the same timeframe. The credential-based system is able to handle both the ATP and the PI verification requests within far less than one minute.
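
As a rough illustration of that flow (not the CARO or VRS API, whose actual interfaces differ), a responder might gate the PI answer on a successful ATP check, with both helper functions standing in for the real verification services:

  // Illustrative only: answer a PI verification request only if the requester's
  // ATP credential presentation verifies. Both helpers are placeholders.
  type PIRequest = { gtin: string; serialNumber: string; lot: string; expiry: string };

  async function verifyAtpPresentation(presentation: unknown): Promise<boolean> {
    // Placeholder: check signature, revocation status, and accepted issuer.
    return true;
  }

  async function productIdentifierMatches(request: PIRequest): Promise<boolean> {
    // Placeholder: look the PI up in the responder's serialization repository.
    return true;
  }

  async function handlePiVerification(request: PIRequest, atpPresentation: unknown) {
    if (!(await verifyAtpPresentation(atpPresentation))) {
      return { verified: false, reason: "requester is not an authorized trading partner" };
    }
    return { verified: await productIdentifierMatches(request) };
  }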

Efficiency — operations: The credentialing approach alleviates the additional compliance burden that DSCSA places on supply chain actors, as it makes maximum use of existing processes. (1) Legisym leverages existing licenses, registrations, and similar evidence for due diligence and eliminates the need for repetitive checks. (2) Thanks to fully API-based integration of VRS providers with Spherity’s CARO, trading partners face no technical implementation effort. Hence, ATP authentication can be incorporated into existing processes without disruption.

Efficiency — market: The ecosystem approach through OCI opens the market to any company aspiring to be a service provider, as shared information and standards are openly available to lay the foundation for industry-wide system interoperability. This avoids monopoly-like pockets and vendor lock-in, and encourages competition through value-adds or pricing.

Efficiency — joint innovation: The ecosystem approach through OCI encourages the involvement of various industry stakeholders and, thus, enables the development of solutions that are needed and fit for purpose, as they are born from within the industry.

Contact us

DSCSA will reach its final enforcement deadline on Nov 27, 2023. If you would like to know how our credentialing solution CARO can support you on your DSCSA compliance journey, don’t hesitate to contact us today and take advantage of our free trial.

We look forward to supporting you on your compliance journey!

Hat-trick in U.S. pharma innovation was originally published in Spherity on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Gear Up for the Future with Decentralized Identity


by Warwick Ashford

In the digital era, businesses would greatly benefit from increased identity assurance levels, the ability to interact securely with partners, suppliers, consumers, and customers, and a reduced administrative load during onboarding or ongoing verification of credentials.

One potential solution that is gaining momentum and support around the world is the implementation of decentralized identity systems that use verifiable credentials, which are tamper-proof, can be stored in a digital wallet, and can contain a decentralized identifier that is globally unique and can be cryptographically verified.
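
For orientation, a decentralized identifier resolves to a DID document that carries the public key material used for that cryptographic verification. The sketch below loosely follows the W3C DID Core data model; the identifier and key values are placeholders.

  // Illustrative DID document (placeholder values, loosely following W3C DID Core).
  const didDocument = {
    "@context": "https://www.w3.org/ns/did/v1",
    id: "did:example:123456789abcdefghi",              // the globally unique identifier
    verificationMethod: [
      {
        id: "did:example:123456789abcdefghi#key-1",
        type: "Ed25519VerificationKey2020",            // one common key type among several
        controller: "did:example:123456789abcdefghi",
        publicKeyMultibase: "z6Mk...",                  // public key; the private key stays in the holder's wallet
      },
    ],
  };
  // A verifier resolves the DID to this document and uses the public key to check
  // signatures on credentials presented by the identifier's controller.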

A user-centric universal digital identity solution

The ultimate goal is a universal digital ID that can be used to access all digital services without having to enter personal identity information for every single service and without having to trust every service provider to keep that centralized personal information safe.

While we still have some way to go in achieving that goal, organizations can already start using decentralized identities and verifiable credentials, thereby gearing up to enable the wider ecosystem that they will be able to tap into as soon as it is up and running. 

Decentralized Identity highlights in the EIC agenda

Any organizations interested in finding out more about decentralized identity, verifiable credentials, and how and why to implement them, should attend the track at the European Identity and Cloud Conference 2023 dedicated to these topics. The track is designed to provide a wide range of useful and practical information about Decentralized & Verifiable Credentials.

Interworking or interoperability is one of the key challenges to the implementation of Decentralized Identity. To find out how OpenID for Verifiable Credentials can help with this challenge and a lot more besides, attend this track session on: How to Build Interoperable Decentralized Identity Systems with OpenID for Verifiable Credentials.

Learn more about the NFTicket protocol, how it combines the identification power of cryptographic wallets with those of Decentralized Identifiers (DIDs), and how it has been applied in a pilot to create Renewable Energy Certificates as Verifiable Credentials in this session entitled: Verifiable Credentials and Dynamic NFTs – Two sides of the same medal.

Drilling down even further into the topic of interoperability, the session on Interworking of Verifiable Credential Products examines in detail two projects aimed at fostering wide scale adoption of Verifiable Credentials and discusses their findings.

Discover some of the business benefits of Decentralized Identity, the steps involved in setting up the necessary systems, important considerations, challenges, potential pitfalls and KuppingerCole’s recommendations in this session on Implementing Decentralized Identity.

And, find out about a new paradigm for IDaaS in the session on Rethinking Educational Accreditation and Onboarding with Decentralized Identity, in a discussion on how educational institutions and employers can adopt a privacy-friendly, frictionless, and more secure onboarding process for students and employees.


MyDEX

What has data got to do with net zero?

Source: pixabay

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper.

At first glance it may not seem that data, especially personal data, has much to do with climate change. After all, using personal data doesn’t help us replace fossil fuels with renewable sources of energy, and all uses of data have their own carbon footprint from the energy used to power servers and devices.

So how can we argue that the proper collection and use of personal data is an essential part of the journey to net zero? To be sure, it definitely isn’t the solution. But it is part of the solution, a hidden key to carbon reduction. Here are six ways.

A human-made, expanding resource

First, a new net zero economy has to find a way of reducing dependence on materials extracted from the earth. It needs to go circular. Data, including personal data, is a vital part of this transition because of two things.

It is human-made, generated by humans in the course of their activities — not extracted from the earth. It is infinitely renewable and more, because every time we use data we generate more of it (including ‘metadata’, i.e. data about the data we are using and generating).

Data isn’t the only, or even the most important human made, renewable, expanding resource. It is part of something even bigger: human culture and knowledge. As economic forces they’re even more powerful than energy (because we use knowledge and our ability to do things together to obtain energy). But data is a key part of this, providing an ever-growing contribution to it.

Active decarbonisation

Second, at a lesser but nevertheless real level, increased use of digital and data processes helps to decarbonise the economy. An enormous amount of energy is required to fell trees, transport them, turn them into pulp and then paper, to move this paper using physical transportation, and to store it. In comparison, an electronic digit is a massive carbon saving. The more we can eliminate paper (by, for example, eliminating form filling and the sending of letters in the post), the better.

Third, improved data logistics is key to improved physical logistics — eliminating unnecessary carbon-emitting activities by better, more accurate planning and coordination. A parcel that is sent to the wrong place because of a data error produces carbon emissions entirely unnecessarily. Eliminating paper by digitising a data process is just the tip of a much bigger energy-saving iceberg: using data to make better decisions and implementing them more efficiently — decisions involving physical, energy-using processes. This is another aspect of decarbonisation.

A deeper, seismic shift

More generally and fundamentally speaking, the data revolution is shifting the epicentre of economic activity. For the last few centuries the economy has revolved around one central question: how to use energy to transform and move material things? It was all about making stuff.

As a society we are now very good at doing that. What we are less good at is two other, equally important economic activities: making the right decisions in the first place (so that we don’t waste time, energy and materials doing the wrong things), and implementing these decisions more efficiently and effectively (which boils down to the data-intensive tasks of planning, organising and coordinating). This is where the biggest opportunities for improvement now lie — opportunities with carbon reducing possibilities at their heart.

This shift in economic focus from using energy to make and move stuff to using data to make and implement better decisions is epochal and seismic. It changes the underlying dynamics of all wealth creation. And it is essential for the journey to net zero.

Closely linked to this is a slow but inexorable shift in how we think about and understand value and wealth. Wealth is not the ability to consume more things. It is human beings’ ability to live better lives. Yes of course, to do that we need to use and consume stuff. But consuming ever more stuff is not the be all and end all of life. There’s lots more to it than that, nowadays coming under the general heading of ‘wellbeing’.

Compared to what we know about how to make stuff, we know much less about how to improve human wellbeing in all its dimensions. To understand this better, we need data: detailed data about how people live their lives and, from this, what contributes to improved wellbeing and what does not. This is part of the ‘life management’ revolution we talked about in our last blog. Because it is so deeply personal, it needs privacy protecting personal data infrastructure to support it.

More concretely, as we also talked about in our last blog, personal data stores will help deliver mass-scale citizen behaviour change via ‘carbon concierge’ services: services that help each individual understand the unique carbon footprint of the things they buy and do, and identify ways to replace high-carbon activities with low- or negative-carbon alternatives.

A positive alternative

But there is one more way in which personal data stores have an even deeper contribution to make to net zero. Just imagine, for a second, if by some piece of magic, all fossil fuels were rendered instantly irrelevant by a universal take-up of green, renewable energy sources. We’ve done it! We’ve gone net zero!

But if all we have done is gone net zero, nothing else has changed. We still have obscene levels of inequality, deep and growing poverty, the super-exploitation of people working in a rights-free gig economy, a collapsing health and care system, a corrupt extractive financial system with huge, unsustainable levels of debt, endless invasions of privacy by predatory, profit-seeking corporations. Would our lives actually be any better?

Without broader, deeper changes to our economic system, net zero, in itself, will not create a fairer, better, more prosperous society. What we need is a journey to net zero that brings these broader, deeper changes in its wake, and that develops positive alternatives as it does so.

This is what personal data stores are part of: as well as accelerating the journey to net zero, they develop positive alternatives that make an enriching, circular, well-being focused society and economy possible. Not just old wine in a new bottle.

Other blogs in this series are:

The Great Data Delusion: 19th century doctors thought bloodletting would cure most diseases. Today’s prevailing theories of personal data are little better.
Why is personal data so valuable? Because of two things nobody talks about: reliability and surprise
Is it a bird? Is it a plane? No! It’s Super…! With personal data, what sort of a problem are we dealing with?
The vital issue people don’t want to talk about: Productivity is key to prosperity. But nobody wants to talk about it. Why?
When organisations become waste factories: The single design flaw at the heart of our economic system, and what happens if we can fix it.
Why are joined-up services so difficult to deliver? Because the status-quo organisation-centric database is designed NOT to share data.
People: the dark matter of the economy. The elemental force that economics never really talks about
An engine of economic possibilities: How personal data stores open up new vistas of innovation and growth

What has data got to do with net zero? was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


ValidatedID

JFF (Jobs for the Future) Plugfest 2

JFF (Jobs for the Future) and W3C aim to build an ecosystem for verifiable credentials wallets regarding learning and employment, and Verifiable Credential (VC) issuers and Decentralized Identifiers (DIDs).

PingTalk

The Innovative Advantage in Digital Customer Experiences | Ping Identity


Businesses have embraced digital to engage with their customers. Keeping pace in this rapidly evolving threat landscape requires businesses to look for innovative ways to build experiences that optimise security without impacting user convenience.

 

Ping Identity and Versent conducted a roundtable luncheon in Sydney and Melbourne in November 2022, with CISOs and related C-level executives responsible for ICT and/or cyber security in their organisations or business units. The aim was to spark thought and engage in a robust conversation about the challenges, issues, and approaches to identity access & management orchestration, with a specific focus on using customer identity to optimise security and convenience, especially in the areas of customer and employee experience. Twenty-eight executives from a very broad range of industries and backgrounds attended the event.

 

Steve Dillon, Head of APAC Architecture at Ping Identity, and Eddie Smith, GM Identity and Security at Versent hosted the events. Both 2.5-hour events were moderated by Craig Baty, Principal of DataDriven.

 

Leaders must collaborate with teams across the business to develop and execute a customer-facing digital strategy that ensures a frictionless experience, enhanced security, protection from fraud, and regulatory compliance. Using customer identity, leaders can establish a firm foundation of collaboration with different stakeholders to orchestrate digital experiences that drive sales, lower fraud losses, and meet business objectives.

Tuesday, 24. January 2023

KuppingerCole

Debunking Common Myths about XDR


Join security experts from KuppingerCole Analysts and SentinelOne to help you get an understanding of what eXtended Detection & Response (XDR) really is, and why you should consider this emerging technology in your enterprise security stack.

John Tolbert, Director Cybersecurity Research at KuppingerCole and Marko Kirschner, Director of Sales Engineering Central Europe at SentinelOne will define XDR, including which technical components are necessary for distinguishing XDR from Endpoint Detection & Response (EDR), and how XDR differs from Security Orchestration Automation & Response (SOAR).

They will also discuss why XDR is an important and useful amalgamation and evolution of security tool sets, the use cases it solves, and where it should fit on organizational security technology roadmaps.




Spruce Systems

SSX Product Update - Optimization Updates, New Features, and More


We launched SSX to provide developers with the easiest way to integrate Sign-In with Ethereum, enable DAO logins, resolve ENS names, and more. We are continuously working on a positive developer experience, and additional features to enable builders to work with emerging decentralized identity paradigms.

Announcing SSX: Self-Sovereign Anything - the easiest way to get started with decentralized identity and install Sign-In with Ethereum (Spruce)

Our most recent work on SSX focused on two areas: adding optimizations, and adding additional features and support for various dapp setups. We recently took some time to consolidate and optimize code to reduce existing technical debt through some refactoring and added plenty of additional tests to make adding new features to SSX a breeze.

In our efforts to support various existing authentication setups, we've released support for NextAuth via our ssx-react package. Additionally, we've added an example in our growing example library to show how to integrate it. To support wagmi's recent updates beyond version 0.7.15 and NextAuth in a variety of development environments, we are updating our ssx-react library to improve support for different JavaScript module types (ESM and CJS modules).

Additionally, we added support for custom paths for endpoints on both the client and server sides to support various application setups in our core libraries. We also updated the ssx-react API to allow for additional options to pass to the SSX instance, widening the number of supported providers (such as Blocknative’s web3-onboard provider!).

Speaking of providers, SSX now also has multiple examples showing a developer how they can build an SSX-enabled dapp with WalletConnect’s web3modal-v2!

Finally, we’re happy to announce support for Lens profile resolution. This enables any developer using SSX to easily pull in Lens profile information to the dapps they’re building by simply turning on a feature flag.  
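
As a rough sketch of how these pieces come together on the client, the snippet below shows an SSX instance configured with a custom server endpoint and the ENS/Lens resolution flags. The option names (providers.server.host, resolveEns, resolveLens) and the session shape are assumptions to check against the current SSX documentation rather than a definitive reference.

  // Rough sketch only; option names are assumptions, see the SSX docs for the exact API.
  import { SSX } from "@spruceid/ssx";

  const ssx = new SSX({
    providers: {
      server: { host: "https://api.example.com/ssx" },  // hypothetical custom endpoint
    },
    resolveEns: true,    // resolve the signer's ENS name after sign-in
    resolveLens: true,   // assumed feature flag for Lens profile resolution
  });

  async function login() {
    const session = await ssx.signIn();  // prompts the connected wallet to sign a SIWE message
    console.log("signed in:", session);  // session contents (address, ENS/Lens data) depend on config
  }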

For more information on our recent updates, check out the following:

Features:
- ssx-react now supports NextAuth, and we’ve added corresponding documentation (#49).
- We added support for resolving Lens profiles (#50).
- We updated the ssx-react API to use additional providers. We added this support alongside an example using Blocknative’s web3onboard (#34).
- We added flexibility in SSX to let developers configure what endpoints they want to use for issuing nonces and handling login/logout functions (#36).
- We added examples of how to use SSX with web3modal-v2, one for each web3modal v2 package: web3modal/html, web3modal/react, and web3modal/standalone. We also updated our ssx-test-dapp to support testing this new integration (#58).

Testing and Code Refactor:
- We are over 70% in code coverage with tests and are currently working on getting to full coverage this month.
- We optimized try/catch blocks, cleaned up existing code, and created a new package, ssx-core, for functions and utilities used across our other packages (#26).
- We updated how we bundle our packages to prevent TypeScript build issues for builds using gulp (#52).

What’s Next

The team is working on improving interfaces in SSX to enable any developer to use it as a core building block, and extend it toward various decentralized identity use cases.

Additionally, we’re working on how developers with existing email or OAuth flows can easily enable Sign-In with Ethereum in their existing applications using SSX. We hope to eventually migrate applications to a world where Sign-In with Ethereum is the default way of logging in.

Finally, we think of SSX as a core enabler of user-controlled identity, where anyone can bring their own data anywhere they wish. We are working on a service that enables users to obtain credentials that they can take with them across the web using SSX. Stay tuned for more information late next month, closer to ETHDenver!


Indicio

Web3 Is About Creating the Infrastructure for Digital Transformation

Hype over web3 and the Metaverse put selling the future ahead of building the infrastructure for digital transformation. But that infrastructure is the source of business opportunity.

By Trevor Butterworth

The current malaise over web3 and the metaverse feels very much like a case of flying car syndrome: let’s take a shortcut to the future — and watch as we crash. We’re still waiting for flying cars, 70 years after they first seemed like a consumer reality; but if crypto is a similar measure of whether web3 will fail, then we’re missing a crucial point.

The important thing about the 1950s and 60s in terms of business and technology was not flying cars or the belief that their time had come; it was the many ways in which the infrastructure for doing business changed: processes for creating new and better materials and generating greater yields, the development of computer chips and their productization, and statistical management techniques and quality control. Individually, few of these developments caught the public interest; yet in the case of Japan, which combined them all, we ended up with a manufacturing and economic “miracle.”

We are in the middle of a similar infrastructural revolution in the way we can do business. Whether digital or analogue, the problems with legacy infrastructure and processes are that they are too costly (think fraud), too complex (they can’t be managed quickly and easily from a mobile phone or a digital twin), too insecure (logins, passwords, and VPNs are “Bronze-Age” tools for defeating phishing), too fragmented (System A can’t interact with System B), too labor intensive (think the great resignation), too disengaged (they don’t facilitate a relationship with the customer or continuous product feedback), or too burdensome (they struggle to comply with digital privacy rights).

All these interrelated problems need to be substantially solved before we can get to decentralized finance or meaningful interaction on the Metaverse’s Broad Street; and central to their solution is verifiable data, otherwise (and perhaps now confusingly) called decentralized identity. Verifiable data, delivered through verifiable credentials based on the W3C’s open standard for decentralized identifiers, is the new level up for the internet and digital interaction. These credentials are the infrastructure that will power new processes, new products, and new services—and, perhaps most importantly, address sectoral needs for immediate digital transformation.

A digital shipping container
You might think of a W3C verifiable credential as the digital equivalent of the standardized shipping container. The way it is designed to hold data determines the data’s integrity and verifiability. It can transport any kind of data associated with any kind of entity, whether a person or a device or even a thing—and it has its own direct shipping channel, so you don’t have to route the data through a third party (solving the data privacy compliance issue).

Because the container is built on open standards, it’s interoperable, and it can be used, in a manner of speaking, for air, rail, road, and sea transport. Because it’s built on open-source code, it can be continuously upgraded to provide new functions. And because it’s both a flexible and light piece of technology, you can overlay and integrate your new containers into your existing infrastructure. A container can transport data from System A to System B even if the two systems can’t directly speak to each other—which saves on the cost of a direct integration.

What do you get with this new data shipping system? You get the missing verification layer for interaction online. You get the ability to authenticate data: you can know where it has come from without calling up the source or relying on a third party to manage the transport; you can immediately see if the container has been tampered with, compromising the data inside; and based on those two facts, you have the power to make data immediately actionable.

The first practical consequence is a way to deal with fraud. The data is in the container and not stored in some centralized database that the container accesses. If you trust the organization that sent the container, then you’ll trust the information inside it. That’s instantaneous, which leads to the second practical benefit: seamless processes.

At the customer end, seamless processes are being driven by the expectation that everything should be controllable from a mobile device and minimally inconvenient. At the business end, they are being driven by the need to automate mundane processes dependent on manual labor, for which there is either a shortage of workers or little worker interest. This is an acute challenge facing all sectors. Similarly, changing workforce demographics mean that remote work is not going to disappear—and a decentralized workforce needs a more secure way of interacting with business systems than VPNs.

Autonomous devices, Industrial IoT, and digital twins will all need verifiable identities so that verified users can interact with them and consume and share verifiable data in secure ways. Smart manufacturing, supply chain transparency, and product lifecycle management all require verifiable data and relationships that can be easily facilitated by verifiable credentials and its rich, direct communications protocol, DIDComm.
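
As a rough illustration of the kind of message that protocol carries, here is a simplified DIDComm-style plaintext message from a device to an operator. The type URI, DIDs, and payload are placeholders, and a real exchange would sign and encrypt the envelope before transport.

  // Simplified DIDComm-style plaintext message (placeholder values; real messages
  // are signed and/or encrypted before transport).
  const telemetryMessage = {
    id: "msg-0001",                                            // unique message id (placeholder)
    type: "https://example.org/device-telemetry/1.0/report",   // hypothetical protocol identifier
    from: "did:example:iot-device-42",                         // the device's DID
    to: ["did:example:plant-operator"],
    created_time: 1675700000,
    body: { temperature: 71.5, unit: "C" },                    // the payload the operator can trust and act on
  };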

In fact, when you look at digital transformation in finance, manufacturing, travel, any sector, verifiable data and verifiable identities provide the critical infrastructure needed to move digital interaction forward and deliver the transformation that overcomes fraud and friction. This is what is needed right now—and when it is in place, then we’ll have a web3 that can take flight.

Image (edited) from Alexandre Debieve, via unsplash

The post Web3 Is About Creating the Infrastructure for Digital Transformation appeared first on Indicio.


Entrust

Data Privacy Day is the time to ramp up your Board-level cybersecurity expertise


This Data Privacy Day, it is time to address those who serve on Boards of Directors, because there is an opportunity to turn 2023 into the year of cybersecurity culture building – starting at the very top.

These days when it comes to an organization’s cybersecurity strategy, the question isn’t IF a breach will occur, but WHEN. Cybercriminals are becoming more sophisticated and the frequency of attacks is rising. Staying on top of each and every threat is like trying to ice skate uphill. And with the average cost of a data breach expected to hit $5 million in 2023, it is vitally important to prioritize your security strategy to protect your most sensitive data.

With the number, frequency and sophistication of threats increasing, it’s important that organizations build a security-minded culture, where employees at all levels feel empowered and accountable to not only protect sensitive data, but also to build resiliency into the business. However, organizations often turn their attention to employee security training and forget to bring along those at the top. As Boards become acutely aware of data privacy and data protection, they are increasingly interested in participating in the governance of both.

Here are five useful steps Board and C-suite members can take now to ramp up their cybersecurity expertise in 2023 and beyond.

1. Add a cybersecurity expert to the Board. This will help drive a cybersecurity culture and make data privacy governance and data security a priority at the highest level of the organization. Visibility is key to building trust by helping people to understand how data privacy, data security and compliance are maintained in the background. This applies to the Board as well. Having clear and well-understood policies and solutions can drive investment and buy-in. In fact, the U.S. Securities and Exchange Commission recently proposed a new rule that mandates cybersecurity experience at the Board level, as well as regular reporting, among its recommendations. And it is not just regulatory bodies pushing for more of this knowledge at the Board level – MIT recently launched a course to teach Board members security tactics.

2. Create a cybersecurity committee in which qualified Board members can participate in advising on and mitigating risks. Creating this committee opens the door to more resources and support, as the CISO and their team have the opportunity to build allies and champions within the Board, the C-suite, and across the organization.

3. Conduct a cost-benefit analysis on cybersecurity insurance. While cybersecurity insurance can be an effective part of an organization’s overall security strategy, it’s expensive and it usually doesn’t cover everything. Cybersecurity insurance is a tool within a company’s security toolbox to recoup losses from unforeseen incidents – it’s not meant to be a substitute for risk management. An insurance payout may cover the cost of a breach, but it won’t cover the damage to reputation and trust. Each company is different and so are its security needs, so it’s important to evaluate all factors and decide if it meets your business needs.

4. Learn the distinct differences between data privacy and data security. These terms are often mistaken as interchangeable; however, while they are connected, they are fundamentally different. Data privacy focuses on how personal data is collected, used, and shared. Data privacy laws and regulations can vary by region, with each having varying degrees of rigor and enforcement. Conversely, data security is focused on how sensitive data is protected from external and internal threats. From a compliance perspective, taking ownership of data security means a responsibility to abide by data privacy regulations in place, like the EU’s GDPR (General Data Protection Regulation) and the CCPA (California Consumer Privacy Act), to name a few. If an organization gets its data security framework right, then it can achieve data privacy for its clients. If not, then that’s a problem. Of all the information available, a person’s identity is the most coveted data there is, and when it’s mishandled, that’s when the opportunity for fraudsters occurs.

5. For user authentication, it’s time to embrace some friction. For years experts have touted the need to remove friction (or use passive authentication) in the consumer, workforce and citizen identity verification experience. However, when friction is completely removed, that’s often when a breach happens – particularly when that friction is reduced by workarounds rather than by reducing complexity. There’s an idea circulating among experts that some level of friction can serve as a trust builder – if people have no impediment to accessing applications and services, they start to question whether or not there are any security measures in place at all.
This tells us organizations need to strike the right balance between minimizing friction and maintaining customer trust in an organization’s or government’s ability to keep their personal data safe – because when systems are secured, employees are enabled, partners are confident, and customers feel safe doing business with the organization, then you know you’ve got a formula that works.

Board-level involvement in the governance of data privacy and security for an organization can only enable an improved security posture and help mitigate risks. So go ahead and seek out opportunities to engage with your organization’s CISO to see how you can help keep sensitive data safe.

Interested in learning how Entrust can help you with your organization’s cybersecurity strategy? Visit our website: https://www.entrust.com/identities-payments-data-protection

The post Data Privacy Day is the time to ramp up your Board-level cybersecurity expertise appeared first on Entrust Blog.


Optimizing Central Issuance Production Amid Global Chip Shortages


The payment card issuance industry continues to be challenged by production inefficiencies as a result of global pandemic supply chain disruptions. Of course, the usual operational hiccups exist, but the added pressures that come from the on-going global chip shortage have strained the entire industry. In this blog, we’ll explore some of these challenges and how they affect issuers and integrators. Most importantly, we’ll explore how to mitigate these risks and optimize operational footprints with increased agility and production throughputs.

The usual suspects

Vendor diversification is intentionally limited to organizations. Global chip suppliers often act to restrict issuers from looking for alternative suppliers. Pre-printed card stocks limit design and force higher carrying costs with increased volume inventories for anyone downstream. Both contribute to the same MRP inefficiency, steering the front-end procurement of specific items from specific vendors and contributing to higher overall costs and volumes, which translate to operational inefficiency.

Too many designs can proliferate supply. With chip issuers and stock card designers constantly adding new card models to their portfolio, there is an added burden on production personnel and procurement systems to keep track of different SKUs and maintain costs. Vault monitoring, production planning and material flow are all exacerbated by incremental stocks within complex MRP systems, sometimes driving multiple facilities with multiple lines to satisfy issuance demand across different regions. Static “card programs” can easily translate to production inefficiency if respective volumes do not align with overall customer demand.

Personalization systems limit issuance architectures. Most personalization systems in the market do not support the ability to mix and match different chips into one single job, causing significant inefficiencies in production volumes. Similarly, static card designs contribute to an identical concept — running independent batch sizes for specific programs. A recent ICMA study concluded that the average worldwide production batch size for financial bureaus is less than 20 cards, contributing to 20% of the total volume per day.

Piling on the on-going global chip shortage

Much of the focus on chip shortages has been on automotive manufacturing and consumer electronics, but payment cards and point-of-sale devices are also under significant strain from global shifts in supply chains. “The entire payments industry relies on semiconductors, chip cards, smartphones and digital point-of-sale devices, which are all impacted by the shortage,” said Oliver Manahan, the San Francisco-based senior director of business development for Infineon Technologies in Neubiberg, Germany. Manahan, who oversees partnerships with card vendors, issuers, payment networks and other firms at Infineon, is also co-chair of the Secure Technology Alliance and is on the steering committee of the U.S. Payments Forum.

The median chip inventory companies have on-hand fell from a 40-day supply in 2019 to less than five days in 2021, according to a 2022 report by the U.S. Department of Commerce that does not take the Russian war in Ukraine and China’s recent coronavirus lockdowns into consideration. Supply chain experts and economists agree that the semi-conductor challenge is expected to persist through 2023 and perhaps into 2024 as wafer capacity is strengthened with new sources of western supply.

It’s highly unlikely that these supply chain disruptions would prevent issuers from producing replacements for lost or stolen cards, or card expirations. However, if we get to that point we’ll have a significant industry-wide customer service nightmare on our hands. Calculated and careful management throughout the supply chain can help the issuance and greater payments industry minimize potential disruption to the cardholder experience.

Overcoming the odds

Amidst each of these challenges, bureaus and integrators can optimize production with the Entrust Adaptive Issuance and Print-on-Demand solutions. Both offer flexibility in a parallel issuance environment:

- Adaptive Issuance EMV Data-Prep and Perso Software, in conjunction with rainbow decking (mixed chip card stocks), provides maximized production by aggregating card stocks and running them together in one batch.
- Print-on-Demand solutions such as UV-curable Drop-on-Demand and Artista VHD Retransfer offer unlimited efficiency options by allowing individual card designs vs batched, pre-printed stock.

Both eliminate major lead-time inconsistencies across chip supply and reduce total Operating Expenses (OPEX) with greater throughput. This design- and chip-agnostic approach allows Entrust Central Issuance customers to mitigate the leading supply chain constraint in the industry today, semiconductor lead time, affording issuers the ability to choose any combination of card stock suppliers regardless of chipsets.

To learn more about how Entrust card issuance systems can help you overcome production challenges, check out the following resources:

Entrust Adaptive Issuance EMV Data Prep and Perso Software
Entrust Print On Demand
Entrust ID Issuance Solutions
Entrust Financial Issuance Solutions

The post Optimizing Central Issuance Production Amid Global Chip Shortages appeared first on Entrust Blog.


IDnow

“Always bet on good regulation.”

We sit down with our Director of Global Gambling & Sales, Roger Redfearn-Tyrzyk to discuss the challenges of complying with multi-geographical regulations, why the era of black-market operators is coming to an end, and which gambling trends to bet on in 2023.  Digital identity proofing has become an essential part of every sector’s digital offering. […]

Digital identity proofing has become an essential part of every sector’s digital offering. Although the application of identity proofing technology may be similar, how important is it to be aware of the intricacies of a particular industry, for example with gambling and offering solutions that promote responsible use?

I would say there’s two sides to this. On one hand you have the supply side, and on the other side you obviously have the operator side. On the supplier side, it’s important to be aware of where the industry is, what the industry is doing, and what the use cases are. 

Our solutions are being used by operators to make their lives easier, to make the players’ lives easier and safer, and at the same time prevent any harm or any wrongdoing. Nowadays, the regulations are very tight and rightly so, especially with regards to protecting minors from underage gambling, and those who may develop a gambling addiction. 

So, it is very important that we as suppliers know what to offer, how to offer it, and whether the solution can do enough. 

KYC and identity verification is a business enabler for online gambling because operators don’t see their customers face to face, so they don’t know what kind of situation they are in, or what they’re doing. The most superior identity verification experiences are safe, secure, and easy to use.  

As KYC is regarded as an enabler, compliance should be at the top of every operator’s mind, whether they’re at C-level or in account management; in fact, everyone involved in the technical teams that implement those solutions needs to be concerned with compliance.

As online gambling regulations change and evolve with such regularity, what difficulty does this present to operators, especially those that operate across multiple geographies?  

It can be very challenging because in some countries, for example, Germany, if you wanted to get a gambling license, you would need to fill out the applications in German, which obviously restricts the ability for certain international organizations to enter the market, unless they have somebody on staff who can speak the language. 

Regarding regulations, the biggest problem is that there are different ways to interpret certain rules or requirements. So, one company might read it one way, and another operator might see it another way, and that’s a big problem. A good example is what you see with the Alcohol and Gaming Commission of Ontario (AGCO), which essentially regulates operators in Ontario; they’ve done a fantastic job with their regulation. They want to protect the players, but at the same time, they’re also considering how to enable the operators to do that.

If you are a clear and proactive regulator, like the AGCO, you will always get better results for that region, or that country.  

There are, unfortunately, many bad or frustrating examples of regulations, which may have four or five different interpretations of one paragraph. This does not do anyone any good because there will always be bad players and unscrupulous operators who want to grow quickly that will exploit those grey zones. 

Gambling operators tend to be well prepared when it comes to these things. They use lawyers and consultants to make sure they are going down the right path and take steps to implement the right technology along the way. 

So yes, it is challenging, but this is also what makes this sector so interesting, because there are always new challenges and new markets to enter, with different regulations and nuances.  

Overall, I would say regulation is necessary, but it is extremely hard to find good regulation.

The new German gambling regulation: 2022 and beyond. Download it to get an overview of the various gambling regulations in Germany.

Are there any specific risks that operators should be aware of in the gambling sector, and how can operators protect themselves against them?

Bonus abuse is one example. This is where people create multiple accounts to get the sign-up bonus and then attempt to withdraw the money. Money laundering is another issue in certain countries, as is addiction, and feeding customers’ addiction rather than providing information that helps them to stop. Operators need also to be wary of minors attempting to onboard using the details of their parents, as well as people just not being honest about who they are and what they do. 

What can operators do? Well, there is great technology available now where identity verification providers and KYC providers can monitor how users are behaving. There is a lot of focus right now on behavioral screening of players and how to spot addiction a lot earlier than ever before. This involves monitoring after a customer is verified. Operators are now utilizing technology to perform the screening stage because doing it manually is no longer feasible. In fact, in certain regulations you are not even allowed to do it manually anymore.  

I would even say that in some cases the gambling industry uses more tools, or more innovative applications of solutions than the financial industry, such as technology that can identify ‘at-risk behavior’, predict and prevent incidents of fraud and even customer churn. 

Which region, or country would you say has a) the most developed and lucrative online gambling sector, and b) the most regulated? Is there a link between how much potential a country’s gambling sector may have, and how regulated it is? 

Oh, yes, 100% the link is there. Bad regulation is not good for the players and not good for the operators. If you look at Germany, the tax is very high and player limitations are super strict. So, there’s an argument that people are being forced into the black market, and that’s not good for the country because obviously the government is not going to benefit from the associated tax, and secondly the players are not protected if they go to the black market.

If you’re using a black-market operator, your winnings are totally in their hands. For example, if you hit a jackpot of €1 million, they don’t have to pay you out, because there is no consumer protection. 

Roger Redfearn-Tyrzyk, Director of Global Gambling & Sales at IDnow

The most developed and consistent regulation is probably the UK Gambling Commission, and the Malta Gaming Authority, both of which have been regulating online gambling for many years and making sure it’s an attractive proposition for both the operator and the consumer – because to have a healthy market, you need to appease both sides. Of course, a national market needs protection, but it also needs tax revenue too; it needs to be commercially viable. You must strike a balance. 

With gambling via offshore, unregulated gambling sites still a major problem within the industry, how can operators attract players away from unregistered sites? Are there any unique benefits that regulated operators can provide? 

Regulated operators should promote their compliance with consumer protection laws, and all the aforementioned responsible gambling tools. One thing that regulated operators probably don’t do enough of is mentioning consumer protection. However, adverts have changed in recent years to incorporate responsible gambling messages and moved away from promoting bonuses to showcasing responsible functionality like deposit limits and setting gaming time limits. I still think they could do more in terms of promoting regulation and consumer protection. I also believe the regulator should bear some of this responsibility too.

Why do you think some bettors go to unlicensed, or offshore gambling operators as opposed to registered/ domestic platforms?  

In Germany, for example, if you have gone over your deposit limit, you’re not going to be able to play anymore within that country as deposit limits also often don’t correspond to your withdrawals. So, let’s say you have deposited €500 euros into your account, and you’ve won €20,000, your deposit limit is not adjusted. It’s fixed as standard. 

This doesn’t make sense, because some people gamble for a living, so if they can’t continue to play, they’re obviously going to go to a black-market operator. Also, regulation forbids certain amounts of bonuses you can give out. An interesting situation is in the New York-New Jersey area, where you won’t see any adverts for bonuses in New York, but on the tunnel to New Jersey, which is just a few hours away, you will see many adverts and billboards promoting bonuses.  

Black market players can obviously lure players with high bonuses, and with the promise they don’t have to do KYC. Black-market operators can literally do the total opposite of what the regulated operators are doing, and still attract players.

Roger Redfearn-Tyrzyk, Director of Global Gambling & Sales at IDnow
What online gambling trends do you foresee in 2023?  

I think 2023 will be all about responsible gambling, and I expect a lot of regulators to start introducing regulations to promote this. 

Responsible gambling will be the biggest thing in 2023, especially in the current economic situation where we are heading into uncertainty. 

The second trend to affect the industry in 2023 is going to be a focus on the customer experience, and operators providing an experience rather than just an account to transfer your money into. There’s also going to be a lot more innovative uses of technology. In our industry, we obviously speak with a lot of operators, and some of those solutions are mindboggling. I also think that online gambling will seek to replicate the social aspect of traditional gambling, like the experience of visiting casinos with friends. So, players will be able to play with friends online and share the winnings online.

Personally speaking, I hope in 2023, digital IDs will take further strides forward because we probably spend more time in the digital world than in the physical world where we actually use physical IDs. 

Why do you think online gambling and crypto has struck up such an unhealthy relationship?   

Crypto casinos are not very popular, so I don’t think crypto will remain in the gambling market for too long. Right now, crypto is being used with some of the black-market platforms.

Today, the most famous online casino is a crypto casino, which is being endorsed by big celebrities, and is an unregulated casino. However, in my opinion, I don’t think any major operator will be offering crypto anytime soon.  

All the operators that we work with are heavily AML-regulated, KYC-regulated, and will not be able to bypass such regulations by offering crypto payments. 

From the perspective of an operator, they will have to perform a KYC check to verify a person either way. Every regulator has a certain deposit limit, so users will have to provide the source of their funds, and that will be either via bank statements, or crypto exchange statements.

Find out what our Head of Crypto Sales, Jason Tucker-Feltham had to say when we asked him the same question in the “MiCA will pick up where MiFID left off.” interview.

If you would like to meet our team of identity proofing specialists to discover how a reliable, fast and secure player verification process can help you grow in the Gaming sector, visit us at ICE London 2023, from February 7-9.


If you’re interested in more insights from industry insiders and thought leaders, check out one of our interviews from our Fintech Spotlight Interview series below.

Brandi Reynolds, CAMS-Audit, CCI, CCCE at Bates Group
David Gyori, CEO of Banking Reports
David Birch, global advisor and investor in digital financial services
Alex Pillow, Director, Market Strategy at Moody’s Analytics

By

Jody Houton
Content Manager at IDnow

 


Wider Team

Wider Team had a verifiable 2022

Our small band of strategy consultants gave back to our professional communities in 2022. Here's the recap of our digital identity, ethics, manufacturing, and supply chain talks, papers, standards work, and workshops.

So much!

Lubna Dajani joined Wider Team
Wolff co-wrote a paper on accountability for human harms in trust ecosystems
Sovrin IoT Paper on traceability for digital twins
Shea’s Clinical Internet of Things leadership
IHSCM presentation
New DIF IoT SIG, new leadership, building on a year of work at Sovrin
Wider at European Identity Conference
Wider at Grad School
AI Ethics Talk
Wider at Future Identity
The ethical use of human generated data
Workshop on catalyzing change
Vienna Digital Identity Meetup

Lubna Dajani joined Wider Team

Wider Team welcomed Lubna Dajani to help clients on digital innovation and transdisciplinary collaboration. Lubna’s dives into Hardware Root Of Trust and strategic alliances deepen our strengths. Lubna is based in New York.

Wolff co-wrote a paper on accountability for human harms in trust ecosystems

Wider’s Phil Wolff co-authored a white paper at ToIP with Nicky Hickman. They identified human harms that can come from trust systems. The Trust Over IP Foundation’s “Overcoming Human Harm Challenges in Digital Identity Ecosystems” paper (pdf) calls for the SSI community and identity industries to acknowledge the risks of harm and build in preventive measures to avoid negative externalities. ToIP announced the paper this year as “Why the digital identity juggernaut needs safety belts” (“a systemic view of how human harms function in digital identity ecosystems – and how to mitigate them”). Phil asked “What if your identity ecosystem caused pollution?” as part of a series on applied ethics, risk management, and applying Accountability By Design to every layer of the trust stack.

WiderPoV: “When thousands of parties become part of a digital identity ecosystem, you diffuse responsibility for side effects that ruin, starve, and kill people. Accountability By Design, anyone?”

Sovrin IoT Paper on traceability for digital twins

Michael Ford of Aegis Software and Wider’s Damian Glover and Lubna Dajani wrote and presented “A practical approach to a holistic digital twin” to the October 2022 IEEE Software Technology Conference. (DOI Bookmark: 10.1109/STC55697.2022.00030)

WiderPoV: Digital transformation is bottlenecked by interoperability problems, and interoperability is being held up by concerns about data security and privacy. “We don’t compete, we complete each other.” Using SSI establishes a chain of trust, enabling integration of siloed data sets. Lubna explained: “Let’s say a chip goes into an engine that goes into a pacemaker, in a human. How does your twin of the pacemaker let you trace all the way down to its components?” The IPC-2551 electronics industry standard can be paired with decentralized identifiers and verifiable credentials to let digital twins provide trust and provenance across silos.
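To make the pairing concrete, here is a minimal, illustrative sketch (not drawn from the paper itself) of how a component-level verifiable credential might link a part to the assembly it goes into, so a digital twin can trace provenance across silos. Every identifier, date, and field name below is a hypothetical placeholder.

```typescript
// Illustrative only: a W3C-style verifiable credential asserting that a chip,
// identified by its own DID, was incorporated into a pacemaker assembly.
// All DIDs, dates, and field names are made-up placeholders.
const componentCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "ComponentTraceabilityCredential"],
  issuer: "did:example:chip-manufacturer",            // hypothetical issuer DID
  issuanceDate: "2022-10-01T00:00:00Z",
  credentialSubject: {
    id: "did:example:chip-serial-12345",              // the component's DID
    partRecord: "IPC-2551-style part record reference (placeholder)",
    assembledInto: "did:example:pacemaker-assembly-67890",
  },
  // A real credential would also carry a cryptographic proof block, omitted here.
};

console.log(JSON.stringify(componentCredential, null, 2));
```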

We also presented this to the Global Semiconductor Trusted IoT Ecosystem Security (TIES) group in November.

Drop a line to Damian for a copy of the paper.

Shea’s Clinical Internet of Things leadership

Michael Shea shepherded the IEEE P2933 Trust & Identity team to final draft on their chapter in the Clinical Internet of Things (IoT) Data and Device Interoperability with TIPPSS – Trust, Identity, Privacy, Protection, Safety, Security standard.

WiderPoV: Within a device, “identity of components” is the same as “traceability,” from circuit board parts to modular components. This is another area for security compromise and risk in sensitive devices. A future direction: integrating DIDs and Verifiable Credentials into electronics manufacturing processes.

IHSCM presentation

Damian Glover and Michael Morgan-Curran presented to the Institute of Health And Social Care Managers about how the digitisation of care is impacting healthcare professionals, and how healthcare organisations can support them. We looked at how integrating high assurance digital identity into digital health apps and backend systems can help improve the caregiver experience.

WiderPoV: “High assurance digital identity promotes confident telehealth interactions by providing automatic know-your-patient and know-your-doctor capabilities. It supports appropriate care decisions by letting providers trust data from remote devices.”

New DIF IoT SIG, new leadership, building on a year of work at Sovrin.

Damian Glover and Lubna Dajani started steering the new Decentralized Identity Foundation Internet of Things Special Interest Group at the end of 2022. The group meets on a bi-weekly tempo in Q1 and discusses its work in the IoT SIG Discord channel. Ongoing theme: SSI in digital twins.

Damian: “This is an exciting opportunity to influence how DIDs and VCs are integrated within the internet of things, and to accelerate adoption.”

Wider at European Identity Conference

“In Transition – From Platforms to Protocols” drew an appreciative crowd for Wider’s Michael Shea and Danube Tech’s Markus Sabadello at EIC 2022 in Berlin. “Only a few years ago the identity ecosystem seemed to be ‘set’ with little chance for change or dislocation of the large federated identity providers. Today the entire identity technology ecosystem is in flux. What will emerge? OIDC? OIDC/SIOP? DIDComm?”

WiderPoV: The changing protocol landscape and shifting identity power centers mean we are living in a heterogeneous world where standards are the key to building interoperability and nurturing ecosystems.

Wider at Grad School

Damian Glover and Michael Shea gave a Masterclass on the identity of things at the April Bayes MBA London Symposium 2022. It was “about developments in digital identification and the need for decentralised systems to achieve supply chain efficiencies and facilitate ‘reusable, verifiable and universal’ characteristics of online identity.”

WiderPoV: “Someone asked about sanctions on Russia and if SSI affected the ability to implement them and prevent sanction dodging.” (Not so much. Yet.)

AI Ethics Talk

Lubna Dajani spoke on Smart Cities Through The Lens of Human Rights (paper).

WiderPoV: Injecting “liability regimes” into large projects should help their civic and industrial ecosystems improve accountability in ways that preserve public confidence.

Wider at Future Identity

Michael Shea moderated a panel on the Future Identity stage at the Fintech Talents Festival in the City of London. “Leveraging Digital Identity In Healthcare” featured Maria McCann, who is leading implementation of the Individual Health Identifier (IHI) at the Health Service Executive in Ireland; Iain McCallum at techUK, which is being consulted by DCMS on the UK Digital Identity and Attributes Trust Framework; Gillan Ward from ID Crowd, which is working on a framework enabling NHS Trusts to use digital credentials issued by other Trusts during staff onboarding; and Kay Chopard of Kantara Initiative.

WiderPoV: “Digital ID is driving data quality up & healthcare costs down.”

The ethical use of human generated data

Lubna Dajani gave an ethics talk to the IEEE SA P2895 standards group (Standard Taxonomy for Responsible Trading of Human-Generated Data) in November. Themes included whether human-generated data is personal property under law, which human-generated data can be traded (or not) and how to address privacy concerns when trading human-generated data.

Workshop on catalyzing change

Lubna Dajani designed and hosted a workshop at the 2022 Catalyzing Change Week. “Humanity is in a state of flux. Digitization, Decentralization, and Decarbonization are disrupting every sector. … Achieving a sustainably prosperous society can be as simple as choosing to shift where we place emphasis and measure value. Do we want to continue to be the humankind we were taught to be or the ‘kind human’ we can be?”

WiderPoV: Viability grows when your enterprise values and strategy choices connect.

Vienna Digital Identity Meetup

Michael Shea’s cabal of pioneers focused on commercialization of verifiable credentials in 2022. Check out the Vienna Digital Identity Meetup archive going back four years.

Vienna Digital Identity Meetup archive, screenshot 23-January-2023

Wider Team are experts in decentralised identity, helping clients assess risks, identify opportunities and map a path to digital trust. For more information please connect on LinkedIn or drop us a line hello@wider.team.


1Kosmos BlockID

3 Key Considerations in Your Passwordless Journey

Problems with Passwords

Are passwords the weakest link in cybersecurity? We all know that it is risky to authenticate workers, citizens, and customers with passwords. The proof is in the seemingly endless list of credential-based security breaches that we see in the news every day.
There is also no doubting the devastating business impact of these breaches. It’s estimated that the average ransomware payment reaches almost $1.5 million and the average cost of business interruption from ransomware tops $5 million, according to a Lockton report.
But what is actually the root cause of all of these breaches? Passwords really aren’t the problem. It’s anonymous users hiding behind compromised credentials that represent the weakest link in cyber security.
What can we use instead of passwords that will prevent these breaches and keep our workforce, citizens, and customers safe? And when we finally decide to go passwordless, what should our strategy be? If eliminating passwords isn’t enough, what are the three things that are missing in your current passwordless strategy?

1. Password Reset

When you go passwordless, it’s almost impossible to get rid of all of your passwords simultaneously. That’s why it’s important to build a password reset mechanism into your passwordless strategy. With BlockID Workforce, the user never has to contact the help desk to reset a legacy password. Instead, all a user needs to do is open the account screen on his or her BlockID application, select the persona associated with the invalid password, enter and confirm a new password, and authenticate with LiveID. Not only does self-service password reset create a substantially easier user experience, it also saves your company $50-$70 each time a user doesn’t need to contact the help desk to reset their password. Another benefit with 1Kosmos legacy password self-service is that in less than 30 minutes, BlockID can integrate with your workstation, network, cloud apps, remote access solutions, or identity platforms.

2. Interoperability

You work with a multitude of platforms, all with a multitude of requirements and technologies, making it difficult to scale, securely manage, and modernize. Today one of the biggest challenges organizations face is interoperability. Interoperability is one of the limiting factors to digital transformations and passwordless experiences. You need to blend the new with the old to deliver services and data securely and efficiently. But how? You can’t rip and replace everything, as that’s cost-prohibitive. When looking at new security standards, your investment needs to look beyond the problem at hand. Many times, when we ask prospects what their 1-3-5 year roadmap is, they don’t have one, and that’s a problem. Without a plan, we end up with siloed infrastructures held together with duct tape and hope. So when investing in new technologies there are a couple of questions you need to ask yourself:

Is it built on modern architectures, and will it meet demands beyond my current needs?
Does it have open APIs and an SDK so that you can integrate with new and old technologies?
Can it scale to meet current and future requirements?
Is it certified? Does the technology adhere to industry standards and regulations like ISO/IEC 27001, Kantara for Identity Proofing, FIDO for Passwordless, and iBeta Biometric certification, etc.?

Answering these questions can set you on the path to success as you’ll implement a technology that is fit for purpose and will grow with the business. You’ll be able to connect the old with the new while you continue down your modernization path keeping everything online, accessible and secure.

3. Identity

Identity is the foundational element to security. It’s critical you know who is accessing resources, so you can better determine what they should or shouldn’t have access to. The more you know about that identity the better. That’s why the 1Kosmos BlockID platform ensures that individuals are who they claim to be by using an identity-based approach to authentication. We bring worker, citizen, and customer identity into the security perimeter so that organizations know with certainty who is accessing IT assets and online services.
This means we have a quick and convenient way for users to self-verify their identity using government, telco, and banking credentials. Then, once verified, workers, citizens, and customers use that digital identity at login or for transaction approval. This provides users with a frictionless experience and organizations with a high level of assurance for the identity on the other side of the digital connection.
By adding identity as a key pillar to network security, we help CISOs and Digital Experience leaders regain control of their IT services from anonymous users hiding behind compromised logins. With identity based authentication, organizations will no longer be held hostage to data breaches, ransomware, and financial fraud perpetrated via identity deception.

Are you interested in learning more? Edward Amoroso and I dig much deeper into all three of these points in our on-demand webinar, Identity-Based Authentication and the Journey to Passwordless. Watch the session to explore passwordless security, zero trust, and understanding who is on the other side of your digital connections.

The post 3 Key Considerations in Your Passwordless Journey appeared first on 1Kosmos.


KuppingerCole

Mar 09, 2023: Re-Imagining Identity Management for the Digital Era

An explosion of digital identities, coupled with multi-cloud adoption and the trend of working from anywhere, is adding complexity to managing identities and access rights. An innovative strategy is needed to enable organizations to support business and security needs in the digital era.

auth0

Use React and Spring Boot to Build a Simple CRUD App

React is one of the most popular JavaScript frameworks, and Spring Boot is wildly popular in the Java ecosystem. This article shows you how to use them in the same app and secure it all with Okta.

Monday, 23. January 2023

Holochain

Holochain Beta Release Sequence Has Begun

Dev Pulse 131

With the expected date of Holochain’s first-ever beta release this week, I thought it’d be good to go into the sequence of events that are coming before and after that moment. If you’re a developer who’s been waiting for stability before you jump into developing hApps, or if you’ve got an existing codebase and have put it on pause, this will tell you when you can jump back in.

Keep in mind that this is our best estimate of timelines; things may roll out slightly differently depending on what we encounter. In a project like this, there’s a big difference between “we’ve made a release” and “our whole ecosystem of dev tools is ready for you to use”. So there’ll be a bit of time after the release date as we update all the dependencies.

After that, I’ll share some release notes for updates that’ve happened since last Dev Pulse. Note that these are still just release candidates, so they’ll only be relevant if you’ve been keeping up with them in order to help us hunt for bugs and missing features in the leadup to beta.

And lastly, we've got another developer course coming up! If you wanted to attend the first one but couldn't, now's your chance — it's all online and happening in March.

Releasing a full DWeb app dev stack, piece by piece

Because Holochain is a full development stack for P2P apps, dev tools are important. And with every change in the core runtime and its APIs, all those tools have to be updated and re-released. We learned a lot of valuable stuff about what this means after our first release candidate in December — it was the first time we tried to update all the components in sync with each other. It became a dress rehearsal for 0.1.0-beta, and we feel well prepared now.

Last week we released Holochain 0.1.0-beta-rc.3 and rc.4. We intend for rc.4 to be the final one before beta, which should happen later in the week — 26 January is the date we’re aiming for.

‘Beta release’ means simply that there will be an 0.1.0 release tag on GitHub (note the absence of -beta), along with downloadable source code so you can compile your own binaries.

After that, devs will be able to update all the things that depend on it — specifically developer and user tooling. We’re anticipating that this will take a couple weeks, which should mean that everything will be ready to use by 9 February.

Here’s a roughly sequential list of everything that needs to happen for a ‘minimum usable release’:

Holochain 0.1.0 release: A particular commit in the Git history is marked as the official release. It includes the conductor runtime, the host SDKs (HDI and HDK), the proxy service binary, and the hc dev tool. You’ll be able to see it on our GitHub releases page and download the source code to compile yourself. Binaries will appear on crates.io a day or two afterward, along with the HDI, HDK, and conductor API documentation reference.
The JavaScript and Rust clients, for building application front-ends, will be updated, released, and made available on NPM and crates.io. Many core parts of the dev and user ecosystem rely on these clients.
The Tryorama test framework, used to write tests for your hApps and ours, will be updated, released, and made available on NPM.
After the above components are updated, the scaffolder’s templates will be updated to include any changes. This is a new iteration of our rapid app development tool that can generate boilerplate back-end and front-end code for your hApp, along with tests and a Holonix dev environment setup to match your project’s Holochain dependencies.
The hc launch dev command will be updated. This runs a bare-bones version of the Launcher to provide a minimal conductor environment, UI, and one or more agent nodes to test your full hApp.
DevHub, a hApp package repo (that’s right, a hApp for distributing hApps), will be updated to support any changes in Holochain and will be incorporated into the Launcher. Internal QA testing will make sure it all works.
The Launcher, an easy-to-use container for downloading and running hApps, will be updated and released, with installers for Linux, macOS, and Windows available on GitHub. Launcher includes DevHub so devs can upload apps and users can download them.
Holonix, a shell environment that includes all the software needed to develop and test hApps, will be updated to include all of the above. It will be made available via the quick start installation method.
Time to start building!

Holochain release notes

Holochain 0.1.0-beta-rc.3: Bugfixes, zome name included in signals

Release date: 17 January 2023
HDI compatibility: 0.2.0-beta-rc.1, 0.2.0-beta-rc.2
HDK compatibility: 0.1.0-beta-rc.1, 0.1.0-beta-rc.2
Breaking changes: app API

This is mostly a bugfix release, with one breaking change requested by hApp developers.

Breaking (app API): App-generated signals now include the name of the zome that generated them, along with the existing cell ID. (#1750)
New/bugfix: Pressing Ctrl+C on the command line gracefully shuts down the conductor, processing and cleaning up remaining cell tasks. (#1761)
Bugfix: Calling emit_signal from a post_commit callback caused a panic; now it succeeds. (#1749)
Bugfix: Disabling and re-enabling an app caused all of its cells to become unresponsive when it tried to access the DHT. This happened because the agent had a zero storage arc when they rejoined; now their storage arc is recreated. Known issue: cells are still occasionally unresponsive when they’re re-enabled; the workaround is to disable and re-enable the whole hApp again. (#1744)

Holochain 0.1.0-beta-rc.4: WASM metering, host API versioning, bundle format change

Release date: 20 January 2023
HDI compatibility: 0.2.0-beta-rc.1 to 0.2.0-beta-rc.3
HDK compatibility: 0.1.0-beta-rc.3
Breaking changes: admin API, app API, host API, hApp bundles, HDK

In addition to some new and clarified content in the documentation (#1778, #1765, #1767), there are a few bugfixes and some low-level WASM, host interface, and bundling format changes that will require rebuilding your hApp but probably won’t require you to modify your code.

Breaking (host API): Functions in the host interface are now versioned, and are also prefixed differently. For instance, __zome_info becomes __hc__zome_info_1 until its signature changes. This helps end-user hApp managers like Launcher and DevHub manage conductor/app compatibility. If you’re using the HDI and HDK, this will not require code changes; the names of the functions you use will stay the same (e.g., zome_info) and simply point to the newest supported underlying host function. You will need to recompile your zomes though, and of course if you’re bypassing HDI/HDK and calling the external functions directly, you will need to change your code. Also note that guest callbacks such as validate() are not yet versioned. (#1781 and holochain-wasmer #90)
New, possibly breaking (hApps): WebAssembly metering has been re-enabled, this time increased from 10 to 100 giga-ops. This means that, after 100 billion WASM instructions, any zome call or callback will terminate, preventing buggy or malicious zomes from using up a machine’s resources. While this is 10× higher than the previous limit, we’re still asking you to test your code to see if it hits this limit. Also note that you’ll need to recompile your zomes, as the means of getting and setting input and output data has changed. (#1781 and holochain-wasmer #88)
Breaking (admin API, hApp bundles): The resources of a bundle are now encoded as byte arrays rather than sequences of numbers, changing the format of hApp bundles. You’ll need to repackage your hApps to update them. (#1723)
Breaking (app API): Each CellInfo variant has its own cell type now: Cell is split into ProvisionedCell and ClonedCell. Additionally, CellInfo enum variants are snake-cased during de-/serialization. The CreateCloneCell and EnableCloneCell endpoints now return ClonedCell instead of InstalledCell. The unimplemented endpoint SignalSubscription has also been removed. (#1763)
Breaking (app API): The AppInfo endpoint now includes the agent public key used to instantiate the app in its return value. (#1786)
Breaking (admin API): The StartApp endpoint, meant as a companion to the already removed PauseApp, has been removed. (#1785)
Breaking (HDK): Links returned from get_links now contain the agent key of the author. Note: this means that otherwise identical links (that is, whose base, tag, and target are the same) are not deduplicated, unlike entries. This was the behaviour already; the new author field merely makes that more explicit. (#1782)
Bugfix: When you tried to install a hApp for an existing agent and one of the cells being created already existed in another hApp, it would allow it. Now it returns an error. This error was unlikely to be encountered in daily use, because the usual method of installing a new hApp involves creating a new agent. (#1773)
Bugfix: Disabled clone cells were being reenabled on conductor restart; now they stay disabled. (#1775)
Improvements: The number of warnings in the conductor log has been reduced (#1779) and failed zome call errors have gotten more informative (#1783).

Read the full changelogs for the last two releases.

Known issues

There are still intermittent issues with re-enabled hApps becoming unresponsive when they’re trying to access their DHTs. The current workaround is to disable and re-enable the hApp again.
Disabling cells should now make them unresponsive, but some activity will continue in the background, namely they will continue to gossip. We intend to release a fix for this soon, though it may not go into the beta.
As mentioned above, the new WASM metering might cause CPU-heavy zome functions to terminate early. This is not a bug but a new constraint to test your code against. We’re currently testing it against our own DevHub, which does some CPU-heavy data crunching in its zomes.

Client release notes

JavaScript client 0.11.10: Signal handler changes, updated types

Release date: 11 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.2

New/breaking: App-generated signals now carry the name of the zome they were generated by.
Breaking: A lot of changes have been made to various admin and app API types.
Breaking: The final parameter, a signal-handling callback, has been removed from WebSocket-based connection-establishing functions, leaving only the traditional JavaScript-style .on("signal", cb) event-registering function.
Changed: All Launcher environment properties are now optional.
Bugfix: Signals are now only emitted to the AppAgentClient that is set up to communicate with the cell the signals came from.

JavaScript client 0.11.11: Async crypto imports

Release date: 14 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.2

New: Utility functions that depend on crypto are now imported asynchronously.

JavaScript client 0.11.14: Zome name in signals, API documentation

Release date: 19 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.3

New/breaking: App-generated signals now carry the name of the zome they were generated by.
Bugfix: The cell provisioning strategy type for clone cells has been fixed.
New: The API now has auto-generated documentation, available in the GitHub repo.

JavaScript client 0.11.15: Bundle resource serialisation change

Release date: 23 Jan 2023
Holochain compatibility: 0.1.0-beta-rc.4

The changes in this version are all to keep compatibility with the rc.4 release of Holochain.

New/breaking: In-line binary resources in a hApp bundle passed to InstallApp are now encoded in a UInt8Array rather than an array of numbers.
Breaking: CreateCloneCell and EnableCloneCell return a ClonedCell object instead of InstalledCell.
Breaking: The unused admin API endpoint StartApp has been removed.
Breaking: Wherever Cell was previously returned from an API endpoint, it’s now a union type that can be either a ProvisionedCell or ClonedCell.
Breaking: CellInfo variants are renamed to snake case.

Read the full changelogs.
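For orientation, here is a minimal sketch of what the updated signal-handling style might look like from a front end, assuming @holochain/client at roughly these versions. The connection URL, port, and property access are illustrative placeholders; only the .on("signal", cb) registration and the zome-name/cell-ID fields reflect the changes noted above.

```typescript
// Minimal sketch, assuming @holochain/client ~0.11.x; URL, port, and payload
// handling are illustrative placeholders rather than a definitive integration.
import { AppWebsocket } from "@holochain/client";

async function listenForSignals(): Promise<void> {
  // Connect to a locally running conductor's app interface (port is a placeholder).
  const client = await AppWebsocket.connect("ws://127.0.0.1:8888");

  // Per the 0.11.10 notes, signal handlers are registered with .on("signal", cb)
  // rather than being passed into the connect call.
  client.on("signal", (signal) => {
    // Per the rc.3 / 0.11.14 notes, app signals carry the originating zome name
    // alongside the cell ID (exact property names may differ by version).
    const s = signal as any;
    console.log("cell:", s.cell_id, "zome:", s.zome_name, "payload:", s.payload);
  });
}

listenForSignals().catch(console.error);
```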

Test framework release notes

Tryorama 0.10.2: AppAgentWebsocket exposed

Release date: 15 Jan 2023
JavaScript client compatibility: 0.11.11
Holochain compatibility: 0.1.0-beta-rc.2

Along with an update to the JavaScript client, this release brings:

New: The AppAgentWebsocket is available to the local conductor object via a conductor.appAgentWs() getter function. This is now used by default by Scenario.addPlayerWithApp().
Removed: In line with the JS client update, signals no longer have type and data properties.

Tryorama 0.10.3 and 0.10.4: JS client updates

0.10.3, released on 16 Jan 2023, updates to the unreleased bugfix JS client 0.11.12.
0.10.4, released on 16 Jan 2023, updates to the unreleased bugfix JS client 0.11.13.

Known issues

Tryorama hasn’t yet been updated to support the breaking changes in Holochain 0.1.0-beta-rc.4.

Read the full changelogs.

Holochain Developer Training again — this time it’s online!

A lot of people have asked us when there’ll be another dev training course, and specifically when there’ll be an online one. Well, we now have an answer for you: in March! It’ll be a six-day course spread out over two (long) weekends, and will have the same all-day intensive format as the in-person course last July.

This course won’t cover Rust, so it’ll be the best fit if you’re already familiar with Rust and are ready to jump right into Holochain. If you want to learn Rust so you can attend, the Rust website has three great getting-started resources.

Having attended the course in July (mostly to take photos and cook soup), I can say that the instructors have created a really effective course. Their goal is to leave nobody behind, and the lead facilitator Marcus has developed a unique course format over the last decade to do exactly that. He co-founded and led the famous Hack Reactor boot camp, which enjoys a remarkable amount of success equipping devs to be ready to build real software. Marcus and his co-facilitators are really excited to be adapting this format for teaching Holochain.

Learn more about the course on our website and apply here.

Cover photo by SpaceX on Unsplash


Shyft Network

Celebrate the Lunar New Year With a Special Shyft Giveaway


Ni Hao Shyftoshis! Happy Lunar New Year ✨🐰

To kick off the Year of the Rabbit celebrations in style, we have come up with a fantastic community giveaway. And there are not one but TWO Ledger Nano S Plus devices up for grabs.

How to Enter the Raffle?

All you need to enter the giveaway is to check out the Gleam page here. Complete the tasks and enter our giveaway!

The Timeline

The giveaway will start on January 23 and end on January 29. The two lucky winners will be chosen randomly and announced on January 30th. So, don’t miss out on this opportunity to win a Ledger Nano S Plus.

Don’t forget to celebrate the Lunar New Year with your loved ones, as it’s a time of renewal and new beginnings.

Join the Shyft community today, and let’s welcome the Year of the Rabbit on a happy note 🎉

In case you have any questions regarding the giveaway, drop them in the Shyft Discord channel.

Good luck and Gong xi fa cai!

______________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution on the blockchain while ensuring user data is protected.

Visit our website to read more: https://www.shyft.network, and follow us on Twitter, LinkedIn, Discord, Telegram, and Medium. Also, sign up for our newsletter to keep up-to-date.

Celebrate the Lunar New Year With a Special Shyft Giveaway was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

What is a Yubikey and how to set it up with Auth0?

Learn what a Yubikey is and how to set one up with your Auth0 account.

Forgerock Blog

Data Privacy Week 2023: Privacy Gains Power From Other Societal Forces


Privacy was a major topic of conversation in 2022 — beyond just keeping personal data out of the hands of cybercriminals. Privacy now ties into many adjacent areas, including Zero Trust security, antitrust concerns, usage of services provided by "Big Tech," and biometric authentication. As the conversation around privacy progresses beyond a focus on security infrastructure and best practices for preventing data breaches, regulations are working to catch up.

In 2023, the larger implications of privacy — including the ethics of using artificial intelligence (AI) and biometrics, the management of consumer-to-business relationships, and public issues such as consumer protection — will become much clearer through regulatory and legal action.

With today marking the start of Data Privacy Week, and ForgeRock proudly championing the cause, it's an ideal time to explore what to expect now and in the future in the evolving world of data privacy.

A Whole New World of AI, Consent, and Personal Data Management

What if AI could be used to power user consent, making life easier for individuals and businesses alike by enabling "smart" choices at the right moments? Offering personal data controls through AI could turn the tables on ethical concerns about this technology. A challenge is that privacy regulations typically expect consent to be concrete — explicit and specific to a purpose — and they look askance at automated data processing.

This is where the new UK General Data Protection Regulation (GDPR) offers hope. This bill allows for "automatic consent," which opens the door to more abstract forms of user permission that can smartly bundle up related usage patterns for consent and make it easier to manage consent in IoT environments, such as home automation and connected cars.

As organizations navigate the complex new world of AI, we recommend the following best practices to remain compliant and bolster data privacy measures:

Make user permissions for initial data sharing necessary and meaningful. That means letting people choose – and change their minds – without constraints or pressure. Allowing people to make consent decisions at times far removed from critical data-sharing moments and offering dashboards for monitoring and revoking consent help to make consent more meaningful.
Enable data permissions that usefully anticipate the user's intentions. It may be impossible for an individual to say "yes" or "no" honestly to each request for data collection in the course of an average day, particularly when interacting with smart devices. Being asked for consent for a new purpose of use while in the middle of navigating one's car, for example, could be an extremely dangerous experience. Therefore, identity and access management (IAM) technologies and other systems must generalize rules for permissions and anticipate the user's intentions. This is where AI-first data management will derive new consents based on previously granted permissions, or by offering users simplified consent options that will justify actions for related permissions.
Design AI-enabled permission experiences that can grow as the data collected about the user grows. Permissions need to scale so that the sheer amounts of data collected and generated by users, services, and devices can be automated and empower users to make more informed choices about the use of their data.

What's Next? Data Privacy Takes Center Stage in 2023 and Beyond

With the new year officially upon us, and with data privacy finding common cause with trends such as Zero Trust security, consumer protection, and even Web3, we're predicting that privacy makes dramatic inroads as a motivator for every organizational stakeholder, enabling solutions that address multiple problems at once. Here are some predictions about these adjacent areas:

More passwordless authentication: As organizations take advantage of digital wallets for payment, biometric authentication, and passwordless authentication, we can expect to see more pressure on technologists to adapt. They must ensure high authentication assurances in every user journey, while keeping the customer experience within and across channels seamless.
Decentralized identity will get a boost: This year, decentralized identity will solidify its role in society as wallet technology becomes more broadly adopted for identity purposes. This will open up opportunities both for strong, passwordless authentication to merge with wallet tech and for better methods for user control of personal information.
AI at the heart of identity: In 2023, we'll see increased adoption of AI to secure identity and access management. AI, when made explainable and when appropriately paired with human oversight, has the potential not only to make identity safer for consumers and employees, but also to improve the lives of the cybersecurity professionals who must make sense of massive amounts of data.

Let's make 2023 the year enterprises step up to become leaders in privacy — to compete, not just comply, and take proactive action to better secure and respect the personal information in their custody.

For more information on Data Privacy Week, click here. Read more of our 2023 data privacy predictions in this piece here.


Trinsic Blog

The IDtech Builder’s Guide

How to Launch a Verifiable Credentials Startup in 2023

The One Tap Future of Digital Identity Is Coming

At Trinsic we believe in a “one tap future” where people can prove statements with a single click of a button and gain access to what they need. Past attempts at this vision have had to compromise on privacy by using things like “sign in with Facebook”, leading to user data centralized on big technology company servers. Emerging technology standards like verifiable credentials and decentralized identifiers present the opportunity to put users at the center of their internet experience, while offering more convenience and ease of use.

Great Product Experiences Are Key to Adoption

The standards, protocols, and infrastructure for using verifiable credentials are ready to support millions of people every day, but we haven’t seen adoption take off yet. In short, it’s because it’s been very hard to build great product experiences until recently.

 

Requiring users to download new applications and scan QR codes was how many decentralized identity platforms, Trinsic included, have shown proofs of concept. The user experience that is ready for mass adoption is one that is more seamless for the everyday internet user and offers major convenience benefits without sacrificing privacy.

When a user accesses an app powered by Trinsic, they don’t have to make an account, create a username, or remember a password. They simply enter an identifier (like an email), verify ownership of that identifier, and can then utilize any credentials associated with it.

How to Build the Next Great Identity Application

The need for better digital identity spans industries from education to workforce to social media, healthcare, finance, and more. Because of this, most entrepreneurs and developers won’t always define what they’re building by its implications for identity. Instead, the most successful builders are hyperfocused on solving a specific user problem in a given niche. So here’s our guide on how to get started and help usher in a new wave of IDtech products that put users at the center of their identities and come with privacy and consent built-in.

Step 1: Define the Niche Problem You Want to Tackle

Every great idea starts with a focused audience. Facebook started on a single college campus, where it captured significant market share before expanding out from there. Here are a few sample problems you could tackle with verifiable credentials, but recognize that even if you choose one of these ideas, you will have to narrow it down to a more specific problem and audience.

 

Sample identity problems in specific sectors:

 

Education: Students don’t have ownership of their diplomas and must request them from institutions whenever they apply for a new job.
Workforce: Anyone can claim anything on LinkedIn, like saying they went to a certain college or worked at a certain company even if the information is false.
Social media: You can’t prove and reward fans based on how early they followed you and supported your content.
Healthcare: Every time you go to a new doctor’s office you have to fill in lots of repetitive paperwork.
Finance: You have to re-scan your license or passport every time you want to open a different account.

 

These are all problems that entrepreneurs could address by building an IDtech product. If you’re serious about starting a company, you’ll want to do customer interviews, talking to dozens of people to more deeply understand the problems they face. But for now, let’s imagine you’re already an expert in one of these fields, you’ve done the user research, and you have a unique insight that you’re going to translate into a product.

Step 2: Formulate Your Value Proposition

Once you’ve developed a deep understanding of a specific problem, now you have to imagine the kind of solution that would solve this problem. Carrying forward our sample identity problems from before, here are some potential value propositions.

 

Sample value propositions in different sectors:

Education: Giving students ownership of their diplomas will make it easier for them to prove their learning achievements.
Workforce: Utilizing verifiable credentials to prove your work history on a social media site like LinkedIn would make hiring more efficient.
Social media: Allowing fans to prove their loyalty and fanhood will let creators reward their supporters for finding them early.
Healthcare: Giving patients an easy way to hold their medical history and selectively share relevant information will reduce friction in getting quality healthcare.
Finance: Storing a verifiable credential of your proven identity would allow you to more seamlessly access and open new accounts.


These are the types of insights that you could start prototyping concepts around using Trinsic’s platform. Since you’ve defined a problem and imagined a potential solution, the next step is to start sketching out how the product interactions would look.

Step 3: Determine the Information You’re Going to Represent as a Verifiable Credential

At the core of any IDtech company is giving users ownership of their information in the form of a verifiable credential. For each of the potential scenarios above, there is a foundational credential that would begin to unlock value for the user. In the education example, you might be representing a diploma as a credential. Or in the social media example, you are going to allow users to prove that they followed a creator in their first year of being on YouTube.

 

In Trinsic’s dashboard, you can define a verifiable credential schema, specifying the data types and which fields are required and optional in the credential. In the provable fanhood example, your credential schema could be something simple like:

 

Fan username
Creator username
Date followed

Step 4: Determine Your Data Source + Start Issuing

Once you’ve defined a credential template, you’ll need to figure out how you’re going to populate credentials with the correct information. If you’re in the proof of concept stage, you can issue credentials manually through the Trinsic dashboard. As you develop your use case further, you will likely need to integrate with other APIs or existing databases in order to programmatically populate credential information and issue them to the corresponding users.

 

Once you have this all set, you can start issuing credentials. Right now you can issue credentials to a user’s email, and they’ll receive a notification that they have a new credential. The real value comes from utilizing the credential though, so the critical next step is building out the verification experience.
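As a rough illustration of Steps 3 and 4 together, the sketch below defines the fan-credential fields from the schema above and hands them to an issuance helper. The issueCredential function and the email delivery step are hypothetical placeholders standing in for whatever dashboard workflow or API integration you actually use; this is not Trinsic’s SDK.

```typescript
// Illustrative sketch only. `issueCredential` is a hypothetical helper, not a
// real SDK call; swap in your own issuance integration or dashboard workflow.
interface FanCredential {
  fanUsername: string;      // required
  creatorUsername: string;  // required
  dateFollowed: string;     // required, ISO date
}

async function issueCredential(recipientEmail: string, credential: FanCredential): Promise<void> {
  // Placeholder: a real system would call an issuance API here, after which the
  // recipient receives a notification at the email identifier they control.
  console.log(`Issuing fan credential to ${recipientEmail}:`, credential);
}

// Example: issue a credential to an early follower.
issueCredential("fan@example.com", {
  fanUsername: "early_fan_42",
  creatorUsername: "my_channel",
  dateFollowed: "2017-03-14",
}).catch(console.error);
```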

Step 5: Build Your First Verification Experience

The point of putting information in a verifiable credential is so someone can present it, prove a statement, and gain access to something they want. So what will you give your user access to? It depends on the use case, but let’s first think about the social media example. If you’re a creator, maybe you will give your first 100 YouTube followers access to a merchandise store with a 50% discount applied.

 

In this case, you’re going to build a way for an e-commerce store to show a discount based on information shared in a credential. Luckily we already built a demo of how this would work with our OkeyDoke e-commerce store. 
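The logic of such a verification gate can be sketched roughly as follows. Here verifyPresentation is a hypothetical stand-in for whatever verification service you use, and the 50% figure simply mirrors the example above; the point is that the discount is granted only when the presented credential verifies and its claims satisfy the business rule.

```typescript
// Illustrative sketch: gate a store discount on a verified fan credential.
// `verifyPresentation` is a hypothetical stand-in for your verification stack.
interface VerifiedPresentation {
  verified: boolean;
  claims: { fanUsername: string; creatorUsername: string; dateFollowed: string };
}

async function verifyPresentation(presentation: unknown): Promise<VerifiedPresentation> {
  // Placeholder: a real implementation would check signatures, revocation,
  // and the trust registry before returning a result.
  return {
    verified: true,
    claims: { fanUsername: "early_fan_42", creatorUsername: "my_channel", dateFollowed: "2017-03-14" },
  };
}

async function discountFor(presentation: unknown, channelLaunch: Date): Promise<number> {
  const result = await verifyPresentation(presentation);
  if (!result.verified) return 0; // invalid or untrusted credential: no discount

  // Business rule from the example: the fan followed within the channel's first year.
  const followed = new Date(result.claims.dateFollowed);
  const cutoff = new Date(channelLaunch);
  cutoff.setFullYear(cutoff.getFullYear() + 1);

  return followed <= cutoff ? 0.5 : 0; // 50% discount for early fans
}

discountFor({}, new Date("2017-01-01")).then((d) => console.log("discount:", d));
```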

Step 6. Consider Trust and Governance

Now that you have a rough prototype of how you will issue credentials and how users can verify them to gain access to something they want, you’ve probably realized there is an opportunity for fraud here. Without a list of trusted credential issuers, what is to stop someone else from issuing themselves a credential that “proves” they were one of the first 100 subscribers on YouTube?


Trinsic’s trust registry service and dashboard governance feature make trust establishment easy. The core function here is to establish a list of which issuers are allowed to issue which credential schemas. Once you’ve put your trust registry in place, credential verifications will fail if the issuer is not trusted to issue the schema that is being checked. While the potential for abuse in a testing environment is relatively low, once your application launches to more users, you will have to carefully consider the implications of how your trust ecosystem is designed.
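Conceptually, the registry boils down to a lookup of which issuers may issue which schemas, consulted during every verification. A toy version, with made-up identifiers, might look like this:

```typescript
// Toy trust registry: maps credential schema IDs to the issuers allowed to
// issue them. Identifiers are placeholders; in production this would be a
// governed registry service rather than an in-memory map.
const trustRegistry: Record<string, Set<string>> = {
  "schema:fan-credential:v1": new Set(["did:example:creator-platform"]),
};

function issuerIsTrusted(schemaId: string, issuerDid: string): boolean {
  return trustRegistry[schemaId]?.has(issuerDid) ?? false;
}

// Verifications consult the registry, so a self-issued credential fails the check.
console.log(issuerIsTrusted("schema:fan-credential:v1", "did:example:creator-platform")); // true
console.log(issuerIsTrusted("schema:fan-credential:v1", "did:example:self-issued"));      // false
```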

Step 7. Test and Get Feedback on the Product

Building out your proof of concept is just the first step toward developing a successful IDtech product. Hopefully with Trinsic’s platform you’ll be able to spin up prototypes in a matter of weeks, rather than months. This gives your team more time to seek feedback from early users and potential customers in order to refine the experience. Being able to show early traction and even beta customers can be the strongest proof point when pitching to investors.

We Need Great Identity Products

It’s time we take identity back from big technology companies and put users at the center of the internet experience. If you’re considering building an identity product, this framework should help you prioritize what’s important to get your ideas to market.

 

The one tap future is coming. The opportunity of a generation is building products that establish digital trust and bring more safety and privacy online. If you’re inspired to start building, you can sign up for a free Trinsic account today, and if you have questions along the way, drop into our Slack community and talk with our team.

The post The IDtech Builder’s Guide appeared first on Trinsic.


MyDEX

An Engine of Economic Possibilities

Image generated by AI using openai.com/dall-e-2

This is one of a series of blogs exploring Hidden in Plain Sight: The Surprising Economics of Personal Data, the subject of Mydex CIC’s latest White Paper.

Imagine for a moment you are a blacksmith in a mediaeval village. There are thousands dotted around the countryside making horse shoes, stirrups, ploughs, pitchforks, shovels and so on. Every manor has one.

Now fast forward to today, a society and economy dependent on metal working in bridges, skyscrapers, cars, trains, planes and cranes, plus a phantasmagorical array of machines, gadgets, utensils and devices enabling us to do a breathtaking range of things.

You could say nothing has changed. After all, it’s all just metalworking isn’t it? Or you could say our world has been utterly transformed. The nature, scale and scope of metalworking today makes it qualitatively different to what came before.

The same is happening today with data. People have gathered, shared and used information ever since there were people to do so. Today, we have data gatherers and users dotted right across our land — a myriad of different organisations and service providers collecting and using personal data to do a wide range of things. Like mediaeval blacksmiths. Yet in many ways, compared to what’s still to come, it’s just a tiny beginning.

Just imagine

As with that blacksmith, many people find it hard to imagine or countenance the changes that are now unfolding.

This is the realm of innovation and growth — two words that are bandied about so casually and so often that they’re often little more than meaningless cliches. Yet the innovation and growth that’s now being made possible by the personal data store — new ways of collecting and using personal data — are real and important. Just like today’s metalworking versus that of yesteryear.

Four critical developments are now coming together to trigger an explosion in personal data-driven innovation. They are:

Radical cost reductions made possible by personal data stores stripping out the FERC (friction, effort, risk and cost) of data use by enabling its safe, efficient sharing and re-use. (Most people don’t think of this as ‘innovation’, but as we explain below, it’s the essential foundation and driver.)
Making individuals the point of integration of data about themselves. Currently, with today’s data systems, individuals’ data is dispersed across the hundreds of separate, different organisations that collect data about them. With each organisation jealously guarding the data it has collected, it can never be brought together to create a rich, rounded picture of that individual (without creating horrendous privacy problems). This is a fundamental design flaw. It makes it structurally impossible to create a complete, holistic data picture of each individual. By enabling individuals to aggregate data about themselves safely and privately, personal data stores create an entirely new-to-the-world, incomparably rich source of data about people and their lives — one that has been impossible to create up to now. It has the potential to act as a sort of ‘stem cell’ driving all future service provision, including a dazzling array of new person-centric services.
A proliferation of new sources of data generation. If you had told that blacksmith that one day you could walk around with a metal device in your pocket that automatically counted the number of steps you take, he’d have said you are barking mad. But every smartphone can do that. And this is just a small beginning of the (highly personal) data that smart, internet-of-things-enabled devices can and will soon be generating.
Individuals in control of their data. Such a world of data proliferation could become — indeed risks becoming — a totalitarian, exploitative nightmare where super-powerful predatory corporations hoover up all the data there is, to exercise control over all and sundry, and extract whatever value they can. But if individuals are empowered with their own data, so that any sharing of such data remains under their control, the nightmare scenario can be avoided and benefits achieved, both efficiently and fairly.

It’s always hard to imagine a world that hasn’t come into being yet. That’s precisely the point about innovation — that it hasn’t happened yet. So what sort of innovations, and growth, are we talking about?

Innovation as democratisation

The first key point — the one most routinely overlooked — is that ‘innovations’ remain irrelevant until they are democratised. Until that is, they scale by becoming affordable and available to ordinary people. It is this that generates social and economic impact.

Take the motor car. It remained a plaything of the super rich until Henry Ford found a way to slash its costs of production by well over 90% — so that ordinary people could start buying and using cars. The same is true of every life-changing, transformational innovation: access to electricity, air travel, the internet.

The Internet was a social and economic irrelevance when it was a tool for geeks working in physics laboratories. Email was marginal when it was only for academics in universities. No matter what technology or innovation you are talking about, unless and until it is made safe, easy to use, affordable, and accessible/available its social and economic impact will be limited.

That is what personal data stores are doing with personal data: making it safe, easy to access and use at very low cost for all those wanting to use it — individuals themselves and services working for them. Personal data stores are turning something that was once reserved for the super-rich organisation — the collection and use of data — into something everyone can do, every day of their lives. That is transformational in itself.

Innovation as enrichment

Now factor in the individual as the point of integration of their own data. There are countless services today that either only exist in people’s imaginations or eke out a survival in the margins because the costs and complexities of amassing the data needed to provide them are currently too great.

Take something as simple as financial advice. To provide good financial advice, an advisor needs to access data held by many different parties in many different places (e.g. current accounts, savings accounts, loans, investments, pensions, assets owned, legacies and so on). Today, most of the costs of providing advice are absorbed by the costs of acquiring the data needed to inform it, not of arriving at the advice itself.

Personal data stores will transform the cost (and quality) of such advice, democratising it as they do so. The same is true of all aspects of our lives. For example: joined-up, integrated health and care services that take full account of individuals’ particular genetic make-up, medical histories, lifestyles, exercise regimes and so on; or integrated, joined-up, life-long skills and career planning services that take full account of all the individual’s educational and training attainments, their personal skills, their work and life experience and their personal goals. And so on. And so on.

Now add in new sources of data. Not just the Fitbit data that may inform health advice. What about electronic receipts that enable you to automatically record every item you purchase, to see trends, to dig deeper into the quality, materials and provenance of particular items, to manage warranties, build personal asset inventories, compare prices, plan future purchases, identify possible substitutes, and so on?

You could, for example, generate a carbon footprint of every item purchased and every activity undertaken to identify your personal carbon footprint, with carbon concierge services offering ways of reducing this footprint.

New sources of data, made available safely and cheaply, making completely new services possible.

A social and economic transformation

Back in the 20th century, western societies and economies were transformed by what came to be known as the consumer goods revolution. Thanks to the advent of national electricity grids, which made universal access to electricity possible, plus a productivity revolution in manufacturing made possible by Henry Ford-style mass production, suddenly people had affordable access to a dazzling array of devices, gadgets, machines and services that transformed their lives: washing machines, fridges, vacuum cleaners, kettles, radios, TVs, central heating, microwaves, computers, mobile phones, and so on.

The washing machine saved (mostly) women seven hours of labour a week. Central heating removed six hours a week shovelling coal and dust. Fridges helped people reduce time spent shopping and food waste. We can’t imagine our lives now without computers, the internet and mobile phones. Providing these new gadgets and devices created new industries with millions of new jobs and tax revenues to boot.

All this was made possible by democratising the machine. By taking the machine out of the factory where it was held exclusively by organisations and turning it into something that individuals and households could use in their own lives.

Personal data stores are doing the same with personal data. Democratising it. Taking it out of the organisational database, and turning it into a tool in the hands of individuals.

The 20th century consumer goods revolution was all about using energy to provide more, better products. The 21st century citizen services revolution will be all about using data to help individuals and households make better decisions and manage their lives better; services built around the unique circumstances and wellbeing of each individual, not the industrial logic of the corporation.

That is a vast transformation: not only the creation of entirely new services but also a transformation of what every existing service provider does and how (just as industry transformed agriculture). It is a transformation as big as the journey from the mediaeval blacksmith to modern industry.

It will also, by the way, accelerate the trend towards a net zero economy. That’s the subject of our next blog.

Other blogs in this series are:

- The Great Data Delusion: 19th century doctors thought bloodletting would cure most diseases. Today’s prevailing theories of personal data are little better.
- Why is personal data so valuable? Because of two things nobody is talking about: reliability and surprise.
- Is it a bird? Is it a plane? No! It’s Super…! With personal data, what sort of a problem are we dealing with?
- The vital issue people don’t want to talk about: Productivity is key to prosperity. But nobody wants to talk about it. Why?
- When organisations become waste factories: The single design flaw at the heart of our economic system, and what happens if we can fix it.
- Why are joined-up services so difficult to deliver? Because the organisation-centric database is designed NOT to share data.
- People: the dark matter of the economy. The elemental force that economics never really talks about.

An Engine of Economic Possibilities was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


SelfKey

Living Avatar NFTs.

An identity layer composed of Living Avatar NFTs will be necessary for user interactions on the Metaverse to be reliable. Identity verification and NFTs are combined to form the Living avatar concept.

#SelfKey #Web3 #Metaverse #LivingAvatar #NFT #DigitalIdentity 🌏

Sunday, 22. January 2023

KuppingerCole

Analyst Chat #157: How to Refine Data like Oil - Data Quality and Integration Solutions

Who has not heard the statement that "Data is the new oil"? But oil needs to be refined, and so does data. This episode looks at the challenge of gathering, integrating, cleansing, improving, and enriching data across the complete range of data sources in an organization, so that the data can be used while also enabling data governance and supporting data security initiatives. Martin Kuppinger joins Matthias to explain this market segment and its relevance, on the occasion of the publication of a new Leadership Compass covering "Data Quality and Integration Solutions".



Friday, 20. January 2023

Finicity

Highnote Partners With Mastercard Engage To Accelerate The Adoption of Embedded Finance

Highnote has become the latest Mastercard Engage partner.  As a qualified technology partner within the Mastercard Engage network, Highnote can help businesses quickly build and deploy embedded finance solutions at scale. Through this partnership, cardholders can initiate transfers directly in and out of card products powered by Highnote. 

This partnership additionally enables customers to not only take advantage of Highnote’s industry-leading ACH origination, card issuance, and money movement capabilities but also instantly verify cardholders’ accounts, with their permission.

“The pandemic accelerated the adoption of digital payments through embedded finance,” says Mastercard’s EVP, U.S. Open Banking, Andy Sheehan. “We see this partnership as an excellent opportunity to give consumers and businesses more choice in how they pay for goods and services. We’re thrilled to bring together Highnote and Mastercard teams to allow enterprises to seamlessly embed payments into their existing digital products.”

Take a deeper dive into the innovation of this partnership here.

The post Highnote Partners With Mastercard Engage To Accelerate The Adoption of Embedded Finance appeared first on Finicity.


Civic

Civic Milestones and Updates: Q4 2022

Ecosystem The fourth quarter of 2022 brought a downward slide for crypto markets as we head into 2023. A CoinDesk benchmark index, which encompasses 163 digital assets, fell 12% from September through Dec. 15. The downturn is clearly multifaceted, and is partially attributed to central bank action against rising inflation. But, with the collapse of […]

The post Civic Milestones and Updates: Q4 2022 appeared first on Civic Technologies, Inc..


This week in identity

E19 - The Regulation Episode / Guest interview with Kristian Alsing / NIS-D / NIST 800-63-4 / PSD2-SCA / GDPR

Welcome to the first episode of 2023! After a short festive break, Simon and David are back to bring you the latest industry analyst views on a range of identity and access management topics. This week, they have a special guest: Kristian Alsing, a Senior Cyber Security and Business Resilience Executive with 20 years’ experience working for the likes of Accenture and Deloitte. Kristian recently wrote a great guest article for The Cyber Hut on NIS-2. In this episode the guys cover a range of topics relating to regulation and the role of IAM, covering critical infrastructure, the ever-expanding supply chain, and the rise of destructive attacks in waiting.


Spruce Systems

Announcing did:day - An Exploration of Decentralized Identity at ETHDenver BUIDLWeek

Web3 has enabled countless users to take control of their financial assets across the web, and we aim to take this a step further - allowing users to control their identity and data.  

We’re happy to announce did:day: a half-day event hosted by Spruce during ETHDenver’s #BUIDLWeek to explore all things decentralized identity in collaboration with other leaders and pioneers in the ecosystem. This is the first event dedicated to decentralized identity hosted during ETHDenver. It will take place on March 1st at The Source Hotel in Denver, CO.

did:day 2023 is a half-day summit during BUIDLweek at ETHDenver, filled with lightning talks, panels, and keynotes from the Web3 identity ecosystem.

At did:day, we’ll be joined by speakers representing other leaders in the space–including Disco, Ceramic, ENS, Gitcoin Passport, PolygonID, Unlock, Lit Protocol, CyberConnect, and Lens–to bring together visionaries who are building the foundations to turn decentralized identity into a reality.

Our mission with did:day is to create a forum dedicated to educating about decentralized identity, how it impacts web3 and beyond, and how we cross the chasm to mass adoption. The event will feature several lightning talks and panels, covering numerous topics around user-controlled identity and data, decentralized social media, and key management.

We invite builders, industry leaders, and individuals curious about Web3 identity to attend and learn about what’s happening in the world of user control beyond finance.

Registration is now open, and we’re using Unlock Protocol for a native web3 ticketing experience. Space is limited, so please only register if you intend to attend! If you have any questions or issues, please feel free to drop us a line in our Discord, or on Twitter. We’ll see you in Denver!


SelfKey

SelfKey Identity Mobile Wallet.

Greetings to our valued community! We are pleased to announce the release of the most recent version 1.0.0 of the SelfKey Identity Mobile Wallet.

Features of the newly released mobile wallet include a fresh design and bug fixes.

#SelfKey #Web3 #Metaverse #DigitalIdentity #IdentityWallet


Metadium

Metadium 2.0: Our Roadmap

Dear community,

Metadium started five years ago as a blockchain platform focused on Decentralized Identifier (DID) technology, and it has continued expanding in the web3 market. Over the years, Metadium’s technology has been used in multiple services and solutions, ranging from identification and authentication to mobile payment and transportation services. Our ecosystem is constantly growing and has recently reached industries like logistics, automotive, tourism, and more.

We are happy to share with you Metadium 2.0’s roadmap as well as the plans for our future:

Metadium 2.0

Blockchain Identification for All

Metadium is an open-source public blockchain that provides identification by issuing, authenticating, inquiring, verifying, and managing history through various services based on decentralized identifier (DID) technology.

1. Platform Technology

Ethereum equivalence support and EVM compatibility

- The Metadium platform is developed with the expansion of META ID and the availability of smart contracts as a top priority.
- Provide EVM compatibility for an easy onboarding process and the smooth processing of tasks (deployment, execution, etc.).
- Reflect Ethereum Improvement Proposals (EIPs) to ensure an environment that promotes the Metadium ecosystem’s growth, and contribute to the blockchain ecosystem with open-source library updates.

2. Governance

- Enable active governance operations through an autonomous authority configuration to develop and grow the Metadium ecosystem.
- Optimize Metadium’s cloud infrastructure and governance contracts for more active participation and platform utilization by node authorities.
- Create an environment where everyone interested in blockchain and its services (DApps), including Metadium users, can make suggestions and have them considered.
- Evolve Metadium’s global vision of DAO organizations with a more professional and detailed operational direction.

3. META ID 2.0

Scaling up through META ID

- Provide a stable and seamless user experience and META ID smart contracts even as the ecosystem and the number of users grow.
- Expand Metadium’s ecosystem with Decentralized ID (META ID) and ID Hub infrastructure.
- Provide an environment in which the functions and information of the decentralized ID (META ID) and ID Hub can be applied to each DApp’s settings, along with model discovery and functionality that combine blockchain information with user authentication information, such as META ID user labelling and inter-blockchain VCs.
- Increase scalability by taking a “Hub” role that connects authentication information, such as user KYC and other credentials, through Metadium-based bridging technology that can interoperate with other blockchains and services.

4. Development Support

- Create an open and developer-friendly environment for onboarding various projects into the ecosystem.
- Simplify the onboarding process by providing an SDK and open source that can be easily deployed, plus META ID & ID Hub packages in a library format that includes the W3C standard.
- Provide data about Metadium, data-flow visualization (APIs) for META ID and ID Hub, and tools that promote a more transparent and reliable development environment.

Please stay tuned for exciting news, including our DEX, CEX, listings, partnerships, and more. The Metadium Team is constantly working hard to bring innovative services and solutions as well as to expand our ecosystem on the global scene.

- Metadium Team

Hello, this is the Metadium Team.

Over the past five years, Metadium started out as a leading blockchain mainnet in the field of Decentralized Identifier (DID) technology and has grown together with the Web3 market and ecosystem. Metadium’s technology has been used for years in a wide range of services and solutions, from authentication and verification to mobile payments and public transport services. The Metadium ecosystem continues to grow and has recently expanded into industries such as logistics, mobility, and tourism.

Metadium 2.0

Blockchain Identification for All

Metadium is an open-source public blockchain for everyone who needs an identity, performing issuance, authentication, inquiry, verification, and history management through the identities (decentralized IDs) of blockchain service users.

1. Platform Technology 2.0

Ethereum equivalence support and EVM compatibility

- The Metadium platform is developed with the expansion of META ID and the availability of smart contracts as its top priorities.
- It provides EVM compatibility for easier ecosystem onboarding and smooth task processing (deployment and execution).
- Metadium continues to reflect Ethereum Improvement Proposals (EIPs) to expand its platform developer ecosystem, and contributes to the blockchain technology ecosystem through open-source library updates.

2. Governance 2.0

- In Metadium 2.0, active governance operations are realized through an autonomous authority configuration, with the goal of developing and growing the existing Metadium ecosystem.
- This year we will optimize the Metadium cloud infrastructure and governance contracts for more active participation and platform use by node authorities.
- We will create an environment in which anyone interested in blockchain and its services (DApps), including Metadium users, can freely make proposals and have diverse opinions and requirements accommodated, developing within the broader framework of the global-vision DAO organization through more specialized categorization and a more detailed operational direction where deeper discussion is needed.

3. META ID 2.0

Scaling up through META ID

- So that Metadium can expand its ecosystem with decentralized ID (META ID) and ID Hub infrastructure, it provides a stable and seamless usage environment and META ID contracts even as the ecosystem network and number of users grow.
- It provides an environment in which the functions and information of the decentralized ID (META ID) and ID Hub can be applied to each DApp’s environment, and makes scattered data more valuable through the discovery of models and the provision of functions that combine blockchain information with user authentication information, such as META ID user labelling and inter-blockchain VCs.
- Based on the Metadium platform, we are developing technology that increases scalability by realizing a “Hub” role that connects authentication information, such as user KYC, through bridging technology that can interoperate with other chains and services.

4. Developer Ecosystem Support

Development Support

- Metadium aims to create an open and developer-friendly environment and, beyond the existing project ecosystem, to onboard new and diverse projects and expand the ecosystem.
- To simplify the work and procedures required for onboarding, it provides an SDK and open source that can be built on top of the platform, and the META ID & ID Hub packages are developed and provided as libraries that include the W3C standard.
- It provides data about Metadium and visualized information (APIs) on META ID & ID Hub flows, along with tools for a more transparent and stable development environment.

Please look forward to more news from Metadium, including our DEX, CEX, new listings, partnerships, and more. The Metadium Team is constantly doing its best to deliver innovative services and solutions and to expand our ecosystem on the global stage.

Website | https://metadium.com

Discord | https://discord.gg/ZnaCfYbXw2

Telegram(EN) | http://t.me/metadiumofficial

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Metadium 2.0: Our Roadmap was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 19. January 2023

KuppingerCole

Championing Privileged Access Management With Zero Trust Security

A modern approach to securing privileged accounts is to apply the principle of Zero Trust: Never trust, always verify. While Zero Trust is not an off-the-shelf solution, modern PAM vendors recommend using this security principle to cement the technical capabilities of their products. This webinar will provide actionable insights for organizations to employ Zero Trust security in their overall PAM strategy and operations.

Paul Fisher, Senior Analyst at KuppingerCole, will look at the origins of Zero Trust theory and its development, as well as how Zero Trust fits into the KuppingerCole identity and cybersecurity fabric architecture concepts. He will address how Zero Trust can assist in managing cloud entitlements and discuss whether 100% Zero Trust is possible. Srilekha Sankaran, product consultant for PAM solutions at ManageEngine, will address the risks of insider threats and privilege misuse in the era of hybrid work, and discuss the management and elimination of the risks posed by standing privileges.




1Kosmos BlockID

What is Blockchain Verification & Validation?

What is Blockchain Verification & Validation?

Modern network infrastructure is turning towards decentralized models of record keeping. Authentication and identity management are no different.

What is blockchain verification? Blockchain verification uses private blockchain technology to store and verify identity credentials.

What Is a Blockchain?

A blockchain is a relatively new ledger and database technology invented to support decentralized information management. Initially conceived by Satoshi Nakamoto in their Bitcoin whitepaper, the blockchain serves as a solution to two specific problems–one, to avoid record duplication in the ledger, and two, to provide a decentralized mechanism for verification modeled on peer-to-peer transactions.

While the internal workings of a blockchain can get quite complex, the simple application of any blockchain is in contexts where users or organizations want decentralized records management. Users collectively provide resources to verify credentials and other information via built-in cryptographic standards. At the same time, they get to retain their information without relying on monolithic databases.
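As a rough illustration of those "built-in cryptographic standards" at work, the sketch below hash-chains a few records so that tampering with any earlier entry breaks verification. It is a generic teaching example in Python, not 1Kosmos's or any particular blockchain's implementation.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, record: dict) -> None:
    """Append a new block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify_chain(chain: list) -> bool:
    """Re-derive every link; any tampered record breaks verification."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger: list = []
append_block(ledger, {"credential": "did:example:alice", "status": "issued"})
append_block(ledger, {"credential": "did:example:alice", "status": "verified"})

print(verify_chain(ledger))                  # True
ledger[0]["record"]["status"] = "revoked"    # tamper with history
print(verify_chain(ledger))                  # False
```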

Several blockchain types serve this purpose, all of which fit into different contexts. These blockchains include:

Public: Public blockchains are (obviously) public ledgers in which any user may participate in transaction verification and (if the blockchain supports it) information exchange. Centralized organizations do not maintain these blockchains, although an organization may participate as users. Common examples of public blockchains include most cryptocurrencies like Bitcoin or Ethereum.

The advantages of public blockchains are that they are open and trustable because no one can alter records independently. Conversely, they tend to be inefficient in terms of energy consumption and performance, and they don’t scale as well as their private counterparts.

Private: Private blockchains adopt the decentralized approach of the blockchain on a smaller scale, typically within an organization or application. It still relies on peer-to-peer transactions and decentralized data management, but there are often additional controls in place managed by a central authority.

While not as free or transparent as their public counterparts, private blockchains are typically scalable, secure, and fast.

Consortium: Consortium blockchains are a collection of blockchain systems owned by private interests, used to streamline information sharing and workflows.

Hybrid: A combination of public and private systems where an organization (or group of organizations) may segregate private blockchain data internally while sharing public data on a public blockchain system.

Additionally, two significant access categories may apply across different blockchain types:

Permissionless: Permissionless blockchains allow any and all users to join and participate in the network without centralized control. Almost all public blockchains are permissionless, but it is possible to have permissioned, public systems.

Permissioned: These blockchains are those where users must follow specific rules and regulations to participate. This participation is almost always predicated on the authority of a central organization or consortium.

These categories are not exclusive to a blockchain type, but more often than not, a public blockchain will be permissionless while private chains are permissioned.

How Does the Blockchain Support Identity Verification?

Traditionally, authentication and identity verification work through a series of applications and databases–a centralized store of user information and credentials.

This presents a few problems:

Honeypots: Singular databases are known as “honeypots,” or attractive targets for hackers. If an identity database is compromised, then every user’s identity is threatened–including any connected information throughout that system.

Ownership: Individual users do not own or manage databases… large companies do. As such, it’s increasingly difficult for users to disentangle their personal information from the large companies that store it. Blockchain schemes allow users to manage their own information within a blockchain system without relying on a major company.

Internet of Things (IoT) and Distributed Devices: The increasing adoption of smart devices and Bring-Your-Own-Device (BYOD) work models makes centralized authentication and verification challenging. A blockchain authentication model can help make these networks more scalable and secure.

A blockchain can address these issues through decentralized management and user-focused participation.

Benefits of Digital Identity and Blockchain Verification

Because blockchains address questions of security, distribution, and scalability, they bring significant benefits to organizations that adopt them.

Some of these benefits include:

Self-Sovereignty: A major strength of the blockchain is that users own their own data on their devices. Rather than rely on large databases, the system can authenticate user credentials against those stored locally on a user’s device.

We often lose sight of the importance of data ownership, and blockchain verification can go a long way in foregrounding self-sovereign identity management.

Transparency: Blockchains are essentially transparent–that is, anyone participating on the chain has access to information relevant to the chain. Likewise, these records are under the user’s control, which means they know exactly what information is on the network and can, if necessary, correct or remove it.

Portability: Modern security standards emphasize data portability, or the capacity to move data from one location or system to another. With blockchain verification, it becomes much easier to move information between compliant systems without having to have, for example, multiple accounts or worry about data format and compatibility.

Security: Additionally, such portability will strengthen the security around modern authentication approaches like federated identity management and Single Sign-On (SSO) schemes. Rather than having shared databases and complex APIs, blockchains could make moving between participating systems much easier–all while providing more control over what data is and is not exposed.

Decentralized Key Management: A major issue in security and cryptography is key management, or the secure sharing of decryption keys so that users can keep their data obfuscated without impacting usability or compromising overall security. A blockchain can provide a resilient form of key management that doesn’t present singular points of failure.

Relying on Secure, Decentralized Identity Verification with 1Kosmos

Blockchain technology is quickly becoming a staple of enterprise record keeping, which is very apparent in authentication and identity verification. Private blockchains are helping support companies manage distributed users worldwide in a scalable and safe way, putting ownership of private data back in the hands of end users.

With 1Kosmos, you get this blockchain verification technology as part of our feature set. These features include:

Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, encrypts digital identities, and is only accessible by the user. The distributed properties ensure no databases to breach or honeypots for hackers to target.

Identity-Based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.

Cloud-Native Architecture: Flexible and scalable cloud architecture makes it simple to build applications using our standard API and SDK.

Identity Proofing: BlockID verifies identity anywhere, anytime and on any device with over 99% accuracy.

Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.

SIM Binding: The BlockID application uses SMS verification, identity proofing, and SIM card authentication to create solid, robust, and secure device authentication from any employee’s phone.

Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.

To learn more about private blockchain and identity management, sign up for our newsletter and read more about 1Kosmos Identity Proofing.

The post What is Blockchain Verification & Validation? appeared first on 1Kosmos.


Forgerock Blog

2023: Perspectives from the ForgeRock C-Suite

Predictions on insider threats, passwordless authentication, artificial intelligence, and more

Few industries move as quickly as cybersecurity, broadly, and the identity and access management (IAM) segment, specifically. There is a constant barrage of novel threats that can lead to costly breaches — and innovative, new ways to combat them. Seismic changes in the workplace push IAM leaders to create solutions that protect employees and data no matter where they are. And an evolution in the way people consume apps and services — conducting more and more of their personal business online — means we must help our enterprise customers meet new expectations for security, personalization, and seamless digital experiences.

The changing nature of the industry is why it's become something of a tradition to look at emerging trends and predict where they may take us in the coming year.

Recently, ForgeRock executives shared their views on trends related to digital identity and the challenges organizations will face in 2023. Their perspectives are based on their own expertise along with what they're hearing from extensive discussions with customers and what they're observing across the industry at large.

Fran Rosch, Chief Executive Officer
Sweeping corporate layoffs will cause insider threats to rise

As we head into 2023, the security risk associated with third parties is not going away. With the threat of an economic downturn, many companies are conducting hiring freezes and, unfortunately, massive layoffs, causing insider threats to rise to crisis levels. To fill in workforce gaps, many companies will turn to consultants to get them through this tumultuous period of economic uncertainty.

However, consultants and contractors can bring the unintended risk of breaches to an organization's doorstep. They often get access to sensitive information and are allowed on company networks, but their security practices and training may differ from those of full-time employees. If a consultant's device is compromised, it's too easy for malware to make its way into an organization's network and spread to other devices, putting the whole organization at risk.

One answer is a more robust governance solution to give enterprises better visibility into who has access to what information, on what device, and from what location. For large enterprises, the only way to truly manage this governance is with the aid of artificial intelligence (AI) and machine learning.

Peter Barker, Chief Product Officer
Artificial intelligence adoption in identity will accelerate

The integration of AI has been growing in cybersecurity and we can expect to see further adoption in the identity and access management space in 2023. The massive transformation to digital engagement, paired with the remote nature of our working lives, has opened the door for new and more relentless types of attacks, like account takeovers, inappropriate access, and fraud. Alongside the widening skills gap facing the cybersecurity industry, and the increasing sophistication of threat actors, enterprises need to transform their solutions to stay ahead.

Enterprises should use all the tools at their disposal to stay ahead of cyberattackers and secure their systems, while ensuring a seamless experience for end-users. AI-powered cybersecurity defenses are among the strongest tools organizations have in their arsenal against cybercrime and will be front and center in the next big tech wave for preventing cyberattacks.

Eve Maler, Chief Technology Officer
Retailers will blaze the trail in implementing passwordless authentication for consumers

Passwordless has been in our crystal ball for a very long time – but never has it been closer than now. Retailers, in particular, are facing increased security, fraud, and account takeover threats as they adopt new digital channels and technologies. We see them leading the way in implementing broad consumer adoption of passwordless authentication.

Digital wallets and biometrics have become critically important for unlocking consumer devices and enabling easy next steps such as purchase approval. In self-checkout scenarios, retailers face unique challenges since physical fraud can also be a major concern. Many retailers are feeling the pressure to go fully self-service in a legally compliant way even in the case of selling age-restricted goods such as liquor. Typically, such purchases require intervention by staff to check someone's physical ID, which slows checkout.

In these scenarios, digital wallets are getting a second look as a source of not just payment, but also verified user information presented in a format that the user can't tamper with. As more retailers adopt passwordless and make it more mainstream, we're going to see more and more consumers pulling it into their everyday lives. This is the nail in the coffin for passwords long-term, and in 2023 retailers will make more deliberate efforts toward the integration of the passwordless society we've been working toward for so long.

David Burden, Chief Information Officer
Workplace volatility will shift cybersecurity practices in 2023

Gartner predicts that by 2025 "labor volatility" will "cause 40% of organizations to report a material business loss, forcing a shift in talent strategy from acquisition to resilience." I believe we're going to see this truly begin in 2023, and it's going to put an even greater emphasis on the identity perimeter as related to the workplace.

As more cybercriminals target employees to gain unauthorized access to the greater organization, businesses will take measures to reduce the vulnerabilities that attackers are slipping through. Overprovisioning, rubber-stamping access requests, and out-of-control shadow IT are putting a strain on enterprises and their ability to manage the volume and velocity of IT requests. Prepare to see more passwordless technology gaining traction in the workplace, plus greater security measures related to things like physical access to buildings, registering for hot desks via software, digital access to services, and collaboration tools.


Anonym

How Do Blockchains Provide the Trust Foundation for Decentralized Identity-Based Apps?

By Dr Paul Ashley, co-CEO and CTO, Anonyome Labs

This article was originally published in Michael Bazzell’s Unredacted magazine, September 2022. To learn more, view the latest issue.

My first article, in the June 2022 issue of Unredacted, introduced some initial concepts around decentralized identity – the new technology that’s giving users greater control over their personal data and identity. Here, I’ll expand on those initial concepts and focus on decentralized identity blockchains—the trust foundation for decentralized identity-based applications and an essential part of the decentralized identity story. 

Blockchain = verifiable data registry

The technical term for the decentralized identity component that provides the trust foundation for the whole system is a verifiable data registry (VDR). Similar in its application to centralized public key infrastructure (PKI), the VDR allows users and services to verify their authenticity by proving they hold the private key corresponding to the public key written to the VDR. It is often called a decentralized PKI.
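As a simplified illustration of that pattern, the sketch below stands in for a VDR with an in-memory dictionary: an issuer "writes" a public key under its DID, signs a message, and a verifier checks the signature against the registered key. It uses Ed25519 from the Python cryptography package purely as an example; a real VDR is a replicated ledger, not a local dictionary.

```python
# Toy illustration of a verifiable data registry (VDR) as decentralized PKI.
# The "registry" here is just a dict; a real VDR is a replicated ledger.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

registry = {}  # DID -> public key, as if written to the VDR

# Issuer creates a keypair and "writes" its public key to the registry.
issuer_key = Ed25519PrivateKey.generate()
registry["did:example:issuer"] = issuer_key.public_key()

# Issuer signs a message; anyone can verify against the registered key.
message = b"credential: Alice is over 18"
signature = issuer_key.sign(message)

def verify(did: str, message: bytes, signature: bytes) -> bool:
    """Check the signature against the public key registered for the DID."""
    try:
        registry[did].verify(signature, message)
        return True
    except (KeyError, InvalidSignature):
        return False

print(verify("did:example:issuer", message, signature))              # True
print(verify("did:example:issuer", b"tampered message", signature))  # False
```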

Most people don’t use the term VDR, but rather the more widely known terms distributed ledger or blockchain, to describe how the component is implemented in the system. In reality, there are over 100 distinct VDRs available, built using well-known technologies such as Hyperledger Indy, Ethereum, Bitcoin, Interplanetary File System (IPFS), Hyperledger Fabric, Cosmos, and so on.

Hyperledger Indy 

The most successful VDR is Hyperledger Indy. Created within the Linux Foundation, Indy is a public ledger designed specifically and only for privacy-preserving decentralized identity (also called self-sovereign identity). The Hyperledger Indy ledger is for credential issuers to publish data necessary for issuing verifiable credentials, and for holders to construct presentations based on those verifiable credentials. 

Anonyome Labs supports and maintains validator nodes for both the Sovrin and Indicio Hyperledger Indy networks. Let’s look at Hyperledger Indy’s characteristics:  

Public ledger: Anyone can read the data on the ledger.

Permissioned network: This has two different aspects: 1) The Hyperledger Indy network comprises a set of validators that have been approved (permissioned) to run the network. For example, the Sovrin Foundation approves validators to run on its three networks (Dev, Staging, and MainNet). 2) The ability/permission to write data to the Hyperledger network, which allows for writing decentralized identifiers (DIDs), schemas, credential definitions and other decentralized identity-related items. To write to the ledger, an organization must have endorser permissions (which includes paying the appropriate fee to become an endorser).

Consensus algorithm: Just like Bitcoin, validator nodes must come to agreement (consensus) before anything is written to the Hyperledger Indy network. Indy uses the Redundant Byzantine Fault Tolerance (RBFT) consensus algorithm. Indy networks typically have 24 validator nodes.

Governance: Hyperledger Indy uses an offline and centralized governance model. In practice, that means people work together to create the network’s governance policy, write policy documents, and have validators and endorsers sign the agreements. These governance activities happen in the physical world.

Economic model: Application developers pay the organizing company or foundation (e.g. Sovrin) for permission to write to the network.

Cosmos

Anonyome Labs runs a Cosmos validator node on the cheqd network. For comparison with Hyperledger Indy, here are some characteristics of a Cosmos-based VDR network:

Public ledger: Anyone can read the data on the ledger.

Permissionless network: This has two different aspects: 1) Any organization can join the network as a validator. 2) Any organization can write to the network.

Consensus algorithm: Cosmos uses the Tendermint Byzantine Consensus Algorithm (BCA) to establish consensus between validator nodes.

Governance: Cosmos uses an online and decentralized governance model. In practice, that means any validator can submit a governance proposal to the network, which validators will vote on and either approve, decline, abstain, or veto. All this happens online.

Economic model: Cosmos has a very well-defined crypto-economic model: validators earn tokens through commissions to help run the network, and writers to the network spend tokens. Cosmos also has an economic model for verifiable credentials.

DID methods

DID methods define how to interact with decentralized identity networks. Since Hyperledger Indy networks are so popular, I’ll focus on the Indy DID method.

Written as did:indy, this DID method describes how to interact with a Hyperledger Indy network, including creating decentralized identifiers and issuing, verifying, and revoking verifiable credentials. It is proposed that in the future there could be hundreds of separately run Hyperledger Indy networks. The purpose of the did:indy method is to facilitate full interoperability between each of these networks.

The Indy DID method defines methods for writing the following data:

Decentralized identifier (DID)

A DID is the fundamental identifier of a decentralized identity in the network. It may be the identifier for a user, a verifiable credential issuer, a service, an IoT device, etc. The DID structure is similar to a web URL.

The DID has four components, which are concatenated:

- DID: The hardcoded string did: indicating that the identifier is a DID
- DID Indy method: The hardcoded string indy: indicating that the identifier uses the Indy DID method specification
- DID Indy namespace: A string that identifies the name of the primary Indy ledger, followed by a colon. The namespace string may optionally have a secondary ledger name prefixed by another colon following the primary name.
- Namespace identifier: An identifier unique to the given DID Indy namespace.

The components are assembled as follows:

did:indy:<namespace>:<namespace identifier>

Some examples of did:indy DID method identifiers are:

- A DID written to the Sovrin MainNet ledger: did:indy:sovrin:7Tqg6BwSSWapxgUDm9KKgg
- A DID written to the Sovrin StagingNet ledger: did:indy:sovrin:staging:6cgbu8ZPoWTnR5Rv5JcSMB
- A DID on the IDUnion Test ledger: did:indy:idunion:test:2MZYuPv2Km7Q1eD4GCsSb6
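For illustration, here is a small sketch that splits a did:indy identifier into the components described above. It is a plain string-handling example written for this article, not part of any Indy SDK.

```python
def parse_did_indy(did: str) -> dict:
    """Split a did:indy identifier into its components (illustrative only)."""
    parts = did.split(":")
    if len(parts) < 4 or parts[0] != "did" or parts[1] != "indy":
        raise ValueError(f"not a did:indy identifier: {did}")
    return {
        "method": parts[1],
        # Everything between the method and the final segment is the namespace,
        # which may include an optional secondary ledger name (e.g. sovrin:staging).
        "namespace": ":".join(parts[2:-1]),
        "namespace_identifier": parts[-1],
    }

for example in [
    "did:indy:sovrin:7Tqg6BwSSWapxgUDm9KKgg",
    "did:indy:sovrin:staging:6cgbu8ZPoWTnR5Rv5JcSMB",
    "did:indy:idunion:test:2MZYuPv2Km7Q1eD4GCsSb6",
]:
    print(parse_did_indy(example))
```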

DID documents (DIDDocs)

DIDDocs contain two primary data elements:

- Cryptographic material the DID owner can use to prove control over the associated DID (i.e., public keys and digital signatures)
- Routing endpoints for locations where one may be able to contact or exchange data with the DID owner.

DID resolution is the process of obtaining a DID document for a given DID.

Verifiable credential schema

A schema object is a template defining a set of attributes (names). A schema is bound to verifiable credential definitions in a Hyperledger Indy network. The bound schema restricts which claims an issuer can include in the credential. Schemas have a name and version, and an issuer or authoritative organization (e.g. government authorities defining license schemas) normally writes them to the ledger. Any client can read schemas from a Hyperledger Indy node.

Verifiable credential definition

A credential definition contains data required for both credential issuance and credential validation. Any Hyperledger Indy client can read it. A credential definition references a schema and the issuer’s DID. The issuer’s public key is included within the credential definition in order to enable validation of the credentials, which are signed by the issuer’s private key. When credentials are issued using the issuer’s credential definition, the attributes (names) of the schema must be used.
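To illustrate how a credential definition ties a schema to an issuer's key, here is a deliberately simplified sketch: it checks that a credential's attribute names match the bound schema and that the issuer's signature verifies. Real Indy credential definitions use CL signatures and far richer structures; the Ed25519 signing and the field names here are only stand-ins.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Schema: the attribute names a credential of this type may carry.
schema = {"name": "driver_licence", "version": "1.0",
          "attr_names": ["given_name", "family_name", "licence_class"]}

# Credential definition: binds the schema to the issuer's public key.
issuer_key = Ed25519PrivateKey.generate()
cred_def = {"schema": schema, "issuer_did": "did:example:dmv",
            "public_key": issuer_key.public_key()}

# Issue: fill in the schema's attributes and sign them.
attrs = {"given_name": "Alice", "family_name": "Ng", "licence_class": "C"}
payload = json.dumps(attrs, sort_keys=True).encode()
credential = {"attrs": attrs, "signature": issuer_key.sign(payload)}

def validate(credential: dict, cred_def: dict) -> bool:
    """Check attribute names against the bound schema, then the issuer signature."""
    if set(credential["attrs"]) != set(cred_def["schema"]["attr_names"]):
        return False
    data = json.dumps(credential["attrs"], sort_keys=True).encode()
    try:
        cred_def["public_key"].verify(credential["signature"], data)
        return True
    except InvalidSignature:
        return False

print(validate(credential, cred_def))  # True
```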

Revocation registry definition

A revocation registry definition contains information required for verifiers to determine whether the issuer has revoked a (revokable) verifiable credential. Revocation registry definitions are required for revokable verifiable credentials and are written to the ledger during creation of the credential definition. Any client can read them from a Hyperledger Indy node.[1]

Revocation registry entry

A revocation registry entry marks the current status of one or more revokable verifiable credentials (i.e., “revoked” or “not revoked”) in the ledger in a privacy-preserving manner. The owner of a revocation registry definition writes a revocation registry entry whenever they wish to make a batch (possibly only one) of revocations public and detectable.

Any Hyperledger Indy client can read any revocation registry entry.

[1] The registry definition is never updated after creation. It immutably binds the revocation detection mechanism (currently the Tails File) to the credential definition. There is a many-to-one relationship between revocation registry definitions and credential definitions: a new revocation registry definition is created whenever the Tails File’s number of entries is exhausted, and the new one still points back to the original credential definition. The revocation registry entry is updated when credentials are revoked.

The post How Do Blockchains Provide the Trust Foundation for Decentralized Identity-Based Apps? appeared first on Anonyome Labs.


Shyft Network

Australia Going the CBDC Way

- Australia launched its CBDC pilot in January 2023.
- The pilot received over 140 use case proposals across e-commerce, offline, and government payments.
- More than 80 financial entities proposed use cases.

Intense discussion around a CBDC gained momentum in Australia in August last year, when Australia’s central bank announced its plans to scrutinize the economic benefits of introducing a Central Bank Digital Currency in the country.

The Reserve Bank of Australia declared that it would conduct a year-long pilot project to explore “innovative use cases and business models” for a CBDC. The authorities were keen to understand the technological, legal, and regulatory boundaries within which the CBDC will operate in Australia.

The Pilot: Schedule, Attributes, Constraints

The initial announcement by the RBA made it clear that the pilot would be executed in partnership with the Digital Finance Cooperative Research Centre, the DFCRC.

The DFCRC is a government-supported industry group. It was to invite industry players to demonstrate how the CBDC could act as a payment and settlement service for household consumers and businesses. With this, the RBA also prepared a detailed schedule of how it would like to conduct the program.

Relevant Article: China Leads the Digital Currency Future

The Schedule

While the project timeline kicked in in September 2022, the RBA decided to run the CBDC pilot with selected use cases between January and April 2023. After the pilot ends in April, the RBA will publish a report on its findings.

e-AUD: Attributes and Constraints

The RBA decided to call the CBDC eAUD. It will appear as a liability on the RBA’s balance sheet, denominated in Australian dollars. The RBA will cap eAUD issuance as it deems fit for specific use case providers, and it will pay no interest on any holdings of eAUD.

Only Australian resident individuals and Australian-registered entities will be eligible to hold eAUD, and they will have to comply with certain requirements. For instance, each eAUD holder must have a project participation invitation from an approved use case provider. Use case providers need to run identity validation on the holder, and they themselves must be approved KYC service providers.

The storage of eAUD needs to be either a custodial wallet provided by a use case provider or a non-custodial wallet held directly by the end user.

Relevant Article: India’s CBDC Project: What is it Upto?

Response to the Pilot

According to information published last month, the pilot program received over 140 use case proposals. More than 80 financial entities proposed use cases across e-commerce, offline, and government payments.

In response to the overwhelming interest from the finance industry, the RBA raised caution in advance, though Australia is not new to virtual assets. Nearly one in five Australians bought digital currencies in 2021, with 43% of them investing in crypto for the first time that year, according to a report by the US crypto exchange Gemini.


Hedging against inflation and portfolio diversification were the primary reasons most Australian crypto users showed interest in digital assets: 81% chose to hold their crypto investments for the long term, while 54% chose to diversify their portfolio. Nearly 25% of Australian investors aged 18–34 had at least one-tenth of their portfolio invested in cryptocurrencies.

Despite such enthusiasm, the RBA contemplated two potential concerns. First, the capability of CBDC to disrupt bank intermediation and monetary policy transmission in normal times. Second, CBDC’s potential to give rise to bank runs in stressed conditions.

So far, CBDC’s potential benefits seem to outweigh its drawbacks. For instance, the CBDC could yield privacy benefits, according to the RBA. It could help protect monetary sovereignty, and it could also enhance the resilience of the country’s money and payments system and increase competition and efficiency while reducing user costs.

There are still a few months to wait until the pilot summary report comes out in mid-2023 and we learn which way the Australian CBDC wind blows.

______________________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on it yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more: https://www.shyft.network/veriscope, and contact our team for a discussion: https://www.shyft.network/contact.

Follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations, and sign up for our newsletter to keep up to date on all things crypto regulation.

Australia Going the CBDC Way was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Entrust

SSL Review: December 2022

The Entrust monthly SSL review covers SSL/TLS discussions — recaps, news, trends, and opinions from the industry.

Entrust

Hardenize advises Entrust Integration is Now Available

Bulletproof TLS Newsletter #96

TrustCor Not Trusted

TLS News & Notes

- Andrew Ayer asks Checking if a Certificate is Revoked: How Hard Can It Be?
- Eric Lawrence discusses TLS Certificate Verification Changes in Edge
- Andrew Ayer states No, Google Did Not Hike the Price of a .dev Domain from $12 to $850
- Hashed Out provides What Is Encryption? A 5-Minute Overview of Everything Encryption
- Mishaal Rahman advises Android is adding support for updatable root certificates amidst TrustCor scare

Code Signing News & Notes

Dan Goodin looks into how Microsoft digital certificates have once again been abused to sign malware

The post SSL Review: December 2022 appeared first on Entrust Blog.


Indicio

Market Signals — Increased Interoperability, Regulations, and Adoption to Come!

Our regular review of recent news from around the decentralized identity community. In this edition, we look at some predictions on what 2023 will hold for digital identity, increased interoperability between countries, and an interesting new intersection of AI and digital identity.

By Tim Spring

6 Digital Identity Predictions for 2023, Mike Engle, Security Boulevard

This Security Boulevard article by Mike Engle offers several useful insights into the ties between digital identity and security for 2023; we’re going to summarize the top two.

1. “US regulators will require tech giants to adopt non-proprietary identities.”

Proprietary identities are widely used across the internet today; they refer to any time you log in with a third-party application — such as Apple, Facebook or Amazon — and a defining trait of these identities is that all your data is stored centrally on one of the provider’s servers. Non-proprietary identities give more control to the user over where their data is stored — usually locally on one of their devices — and how much is shared with others when interacting online. Pointing to the Improving Digital Identity Act nearing passage, and federal government agencies already transitioning to a more secure approach to identity management, Engle thinks it is only a matter of time before regulations hit large corporations to drive more secure identity practices for all consumer-facing businesses.

2. “VPNs will give way to identity-based perimeters for the virtual workforce”

Engle’s main point here is that there are simply too many people using corporate networks from insecure locations, and VPNs are almost worthless if login credentials have been compromised. Offering some terrifying statistics, such as the 2 billion passwords leaked in 2021 alone and the roughly 70% of breached passwords still in use, he makes it easy to see how an identity-based solution with more robust verifiable credentials tied to biometrics might be a better way forward. (Chase Cunningham — aka Dr. Zero Trust — recently did a more in-depth look at this issue in an article for Indicio.)

Deal Between Philippines and Singapore Could Herald More Digital ID Interoperability, Chris Burt, Biometric Update

Chris Burt reports for Biometric Update that the governments of the Philippines and Singapore recently signed a memorandum of understanding (MOU) that will allow each nation’s digital ID to be recognized in both countries.

“The MOU covers digital cooperation on digital connectivity, particularly in inter-operable systems and frameworks that enable electronic documentation; cybersecurity, such as organizing training courses and technical programs through the ASEAN-Singapore Cybersecurity Centre of Excellence to develop and enhance skills related to cybersecurity; and digital government/e-governance, such as in the areas of digital government strategy, digital government services, and digital identity,” — Singapore Minister for Communications and Information, Josephine Teo.

This announcement mirrors the global trend of governments looking to make their digital credentials more interoperable with one another and, therefore, more useful to their citizens. For example, the Philippines has also signed deals with Belgium and China, and Singapore is exploring opportunities to collaborate with India. These partnerships are hugely important for the advancement of digital identity, offering the opportunity for interoperability, but also for nations to learn more from each other about technical approaches and best practices.

Nvidia’s AI-Based Digital Fingerprinting Addresses Identity Attacks, Nancy Liu, sdx Central

This article by Nancy Liu, Editor for sdx central, takes a look at a different approach to security just announced by Nvidia: Artificial Intelligence (AI) based Digital Fingerprinting. The AI is designed to build a digital profile of the data you typically interact with at work; if you — or someone using your credentials — do anything suspicious or out of the ordinary, the organization’s security team is alerted.

As the team looks ahead, Bartley Richardson, director of cybersecurity engineering and R&D at Nvidia, predicts “a detailed digital identity model [using AI] that knows how fast a person types, with how many typos, what services they use, and when they use them” for creating secure digital identities.

This is an interesting probabilistic approach to digital identity. At Indicio, we have been working on a decentralized digital identity approach to employee verification in an effort to provide additional security for organizations, require little maintenance, and put that data back in users’ hands. This AI-based solution is still in its early days but will certainly be one to watch.

How Government Can Keep Up with the Future, Derek Robertson, Politico Magazine

Derek Robertson, a reporter for Digital Future Daily and contributor to Politico Magazine, discusses a conversation he had with Jordan Shapiro, an economic and data analyst at the center-left Progressive Policy Institute, in which she weighs in on where the American government stands now in terms of IT, and what generally gets in the way of keeping its systems modernized and accessible.

Some of the biggest issues Shapiro calls out for the government are that there are hundreds of agencies, all with their own technologies to keep updated, and that there are so many new solutions coming out that regulations are being created often without proper research or knowledge of how they will affect the real world. She also discusses how the businesses creating these products and the government have inherently competing interests, with the businesses often focusing on income and productizing as much as they can, and the government focusing on safety and privacy.

With the government’s main priorities being security, safety, and privacy, it is often slower to adopt new technologies. Shapiro points to President Biden’s recent executive order and the “time tax” — the sheer amount of paperwork necessary to enroll in benefits such as Medicare, Medicaid, or Health and Human Services programs — which the order calls out as problematic for Americans. One solution Shapiro proposes is better digital identity systems, providing a more streamlined and secure process for accessing government programs.

The good news is that some governments are already hard at work on implementing these solutions! For more information about ongoing efforts you can watch the video of a recent Indicio Meetup where Indicio’s CEO, Heather C Dahl, sat down with representatives from several different government organizations to discuss their efforts to use decentralized digital identity and how they arrived at this solution.

The post Market Signals — Increased Interoperability, Regulations, and Adoption to Come! appeared first on Indicio.


Ocean Protocol

Data Farming DF20 Completed, DF21 Started

Stakers can claim DF20 rewards. DF21 runs Jan 19–26, 2023.

1. Overview

The Ocean Data Farming program incentivizes the growth of data consume volume in the Ocean ecosystem. It rewards OCEAN to stakers who allocate liquidity to curate data assets with high data consume volume (DCV).

To participate, users lock OCEAN to receive veOCEAN, then allocate veOCEAN to promising data assets (data NFTs) via the DF webapp.

DF Round 20 (DF20) is part of DF Beta. DF20 counting started 12:01am Jan 12, 2023 and ended 12:01am Jan 19, 2023. 75K OCEAN worth of rewards were available. LPs can now claim rewards at the DF webapp Claim Portal.

DF21 is part of DF Beta. Counting started 12:01am Jan 19, 2023.

The rest of this post describes how to claim rewards (section 2), and DF21 overview (section 3).

2. How To Claim Rewards

As an LP (staker), here’s how to claim rewards:

1. Go to the DF webapp Claim Portal
2. Connect your wallet. Rewards are distributed on Ethereum mainnet.
3. Click “Claim”, sign the tx, and collect your rewards.

Rewards will accumulate over weeks so you can claim rewards at your leisure. If you claim weekly, you can re-stake your rewards for compound gains.
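As a back-of-the-envelope illustration of that compounding effect, here is a toy calculation in Python. The 1% weekly rate is an assumption made up for the example, not an actual DF yield.

```python
def grow(principal: float, weekly_rate: float, weeks: int, restake: bool) -> float:
    """Toy model: re-staking weekly rewards vs. letting them sit unclaimed."""
    staked, idle = principal, 0.0
    for _ in range(weeks):
        reward = staked * weekly_rate
        if restake:
            staked += reward   # compound: next week's reward is earned on a larger stake
        else:
            idle += reward     # simple accrual: the stake never grows
    return staked + idle

print(grow(1000, 0.01, 20, restake=True))   # ~1220.2 OCEAN with weekly re-staking
print(grow(1000, 0.01, 20, restake=False))  # 1200.0 OCEAN without
```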


3. DF21 Overview

DF21 is part of DF Beta. DF Beta’s aim is to test the effect of larger incentives, learn, and refine the technology. DF Beta may run 10–20 weeks. In any given week of DF Beta, the total budget may be as low as 10K $OCEAN or as high as 100K $OCEAN.

Some key numbers:

Total budget is 75,000 $OCEAN (an increase from 50,000 $OCEAN in DF18).
50% of the budget goes to passive rewards (37,500 $OCEAN) — rewarding users who hold veOCEAN (locked OCEAN).
50% of the budget goes to active rewards (37,500 $OCEAN) — rewarding users who allocate their veOCEAN towards productive datasets (having DCV).
Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.
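For intuition, here is a toy sketch of how such a 50/50 split could be apportioned: passive rewards pro rata to veOCEAN held, active rewards weighted by veOCEAN allocated to each asset times that asset's DCV. This is an illustration only, not the actual DF rewards function, which the core team defines and may update (as noted below).

```python
def split_rewards(total_budget: float,
                  ve_balances: dict[str, float],
                  allocations: dict[str, dict[str, float]],
                  dcv: dict[str, float]) -> dict[str, float]:
    """Toy apportionment: half pro rata to veOCEAN held, half weighted by
    veOCEAN allocated to each data asset times that asset's consume volume."""
    passive_pool = active_pool = total_budget / 2
    rewards = {user: 0.0 for user in ve_balances}

    total_ve = sum(ve_balances.values()) or 1.0
    for user, balance in ve_balances.items():
        rewards[user] += passive_pool * balance / total_ve

    weights = {user: sum(amount * dcv.get(asset, 0.0) for asset, amount in assets.items())
               for user, assets in allocations.items()}
    total_weight = sum(weights.values()) or 1.0
    for user, weight in weights.items():
        rewards[user] = rewards.get(user, 0.0) + active_pool * weight / total_weight
    return rewards

# Two stakers and two data assets with different consume volumes (numbers made up).
print(split_rewards(
    total_budget=75_000,
    ve_balances={"alice": 100.0, "bob": 300.0},
    allocations={"alice": {"datasetA": 100.0}, "bob": {"datasetB": 300.0}},
    dcv={"datasetA": 10.0, "datasetB": 2.0},
))
```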

As usual, the Ocean core team reserves the right to update the DF rewards function and parameters, based on observing behaviour. Updates are always announced at the beginning of a round, if not sooner.

Conclusion

DF20 has completed. To claim rewards, go to DF webapp Claim Portal.

DF21 begins Jan 19, 2023 at 12:01am. It ends Jan 26, 2023 at 12:01am.

DF21 is part of DF Beta. Reward budget is 75K $OCEAN.

Further Reading

The Data Farming Series post collects key articles and related resources about DF.

Follow Ocean Protocol on Twitter, Telegram, or GitHub for project announcements. And chat directly with the Ocean community on Discord.

Data Farming DF20 Completed, DF21 Started was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Continuum Loop Inc.

Trust Registries: A Real-World Example with Bonifii’s MemberPass®

The post Trust Registries: A Real-World Example with Bonifii’s MemberPass® appeared first on Continuum Loop Inc..

Trust Registries are a powerful technology organizations can use to increase trust and streamline business operations in various industries. One of the most innovative uses of this technology can be found in the financial sector, where trust registries are being used to combat the threat of financial fraud.

Bonifii, an innovative financial technology company, has developed MemberPass®, a solution that proactively protects credit union members from becoming victims of financial fraud. MemberPass® allows member institutions (credit unions and other financial institutions) to issue credentials to their members. When a credit union verifies their members, they use cryptography, self-sovereign identity credentials and communications to prove the following key things:

The connection is the same private connection that was established with their own member (this is two-way – the member knows this too).
The member still holds the credential.
The contents of the credential haven’t been tampered with.
The credit union signed the credential using its Decentralized Identifier – its DID.

Getting cryptographically secure facts like these helps the credit union and their member be sure they aren’t talking to bad actors. 

But when another institution starts to use a credential, the cryptography can only go so far. 

Another institution can know that:

The person holds the credential (they aren’t a member of that other institution).
The contents haven’t been tampered with.
A DID signed the credential.

What we don’t know is whether the DID that signed the credential is really a trusted credit union or not.

To know that, the institution checks the MemberPass Digital Trust Registry®. The trust registry is maintained under Bonifii’s governance framework, and Bonifii has a solid process that validates the institutions listed in the MemberPass Digital Trust Registry®.
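In code, that governance check sits on top of the cryptographic checks. The sketch below is hypothetical: the DIDs are made up and a hard-coded set stands in for the registry, whereas the real MemberPass Digital Trust Registry® is a service queried under Bonifii's governance framework.

```python
# Cryptography alone proves that *a* DID signed the credential; the trust
# registry answers whether that DID belongs to a validated credit union.

TRUSTED_ISSUERS = {
    "did:example:credit-union-123",   # entries maintained by the registry operator
    "did:example:credit-union-456",
}

def accept_memberpass(credential: dict, signature_valid: bool) -> bool:
    # Step 1: cryptographic verification (signature, tamper-evidence, holder binding)
    # is assumed to have been done by the wallet/agent stack and passed in here.
    if not signature_valid:
        return False
    # Step 2: governance verification - is the signing DID listed in the trust registry?
    return credential.get("issuer") in TRUSTED_ISSUERS

print(accept_memberpass({"issuer": "did:example:credit-union-123"}, signature_valid=True))  # True
print(accept_memberpass({"issuer": "did:example:unknown-party"}, signature_valid=True))     # False
```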

Increase Trust and Streamline Operations

The use of a trust registry allows new institutions to join the MemberPass® network and quickly validate the information from any participating member institutions.  Further, it provides for the efficient sharing and verification of information among credit unions and their partners, increasing trust and streamlining business operations.

The operational ecosystem, anchored by the MemberPass Digital Trust Registry®, provides Bonifii with a comprehensive solution for managing its identity verification data and interactions. MemberPass® offers a streamlined and user-friendly interface that allows members to easily manage their digital identity, including adding and removing authorized credentials, viewing transaction history, and monitoring for suspicious activity—allowing members to easily and securely access their financial accounts and complete transactions.

The implementation of Trust Registries in the financial industry, as demonstrated by Bonifii’s MemberPass Digital Trust Registry®, is a powerful example of how this technology can increase trust and streamline business operations. Trust Registries provide a significant step forward in the efficient and secure management of identity verification data and interactions in the credit union industry.

About Bonifii: By providing a safe, secure, and convenient identity verification solution, MemberPass® is solving one of the credit union industry’s biggest problems by reducing member friction and injecting trust in digital interactions. The use of a decentralized trust registry in MemberPass® is a real-world example of how organizations can use this technology to increase trust and streamline business operations.

About Continuum Loop: As a leading consulting firm, Continuum Loop played a vital role in the development and launch of MemberPass® and the MemberPass Digital Trust Registry®. Darrell O’Donnell was embedded as the CTO, providing crucial expertise and leadership. Through their partnership, Continuum Loop and Bonifii successfully deployed the first-ever Trust Registry for MemberPass®, significantly enhancing the platform’s security and user convenience.


The post Trust Registries: A Real-World Example with Bonifii’s MemberPass® appeared first on Continuum Loop Inc..


Ocean Protocol

Trent McConaghy: Value Creation in the New Data Economy with Ocean Protocol


In the first part of this episode of Voices of the Data Economy, we welcomed Trent McConaghy, Founder of Ocean Protocol, for the second time on our show. During this two-episode series podcast interview, we spoke about the Ocean roadmap for 2023 and the project’s focus on newer use cases, including DeFi. Here are edited excerpts from the episode.

Focus of the Ocean Protocol core team in 2023

Ocean Protocol as a project was conceived in late 2016. For the first five years, it was steadily building the infrastructure needed to solve a number of tough problems, such as sovereign data ownership, reconciling private data with crypto, pricing data, and many other related things. The project launched different versions — V1, V2, V3, and V4. Trent briefly spoke about these versions in another appearance on Voices of the Data Economy as well.

“The focus is now doubling down on traction. How we measure traction is by ‘data consume volume,’ which means how much data is being bought and sold in any given week or month through the Ocean ecosystem, on the Ocean Market, and other markets powered by Ocean contracts. We have more focused projects now like data challenges (for predicting the price of Ethereum) and Data Farming (an incentive program where the objective function is data consume volume).”

Trent quickly recaps the different versions of Ocean: Ocean V1 solved data sovereignty; Ocean V2 solved data privacy through privacy-preserving AI, and Ocean V3 solved a few things at once: How do we simplify things? How do we have really crazy interoperability for the custody and management of data? How do we automatically price data?

Here you can find an overview of Ocean V4, which shipped in the spring of 2022. The three things that shipped in V4 are: Data NFTs, in addition to datatokens, to map base IP and licenses against that base IP; solving rug pulls on the pools; and, finally, better monetization for the community.

Users of Ocean Protocol. Who are they?

The users of Ocean Protocol can be divided into three segments: by vertical, by profession, and by individual or higher-level groups. By vertical, Ocean applies to multiple streams — automotive data, personal social media data, health data, and so on.

There are several projects within the Ocean ecosystem that focus on personal data, typically via DAOs. Most people don’t want to manage their data themselves, so you can have collective bargaining around that; this is handled via projects like the DataUnion app.

The relatively larger groups are NGOs, small businesses, large enterprises, and then cities, provinces, and countries — any of those slices of civilization can use Ocean in various ways. It goes up and down different levels; for example, one collaboration is actually via deltaDAO collaborating with the Berlin Library ecosystem.

Data Scientists as Ocean Protocol users and why DeFi?

The main user that Ocean serves the best right now is data scientists, because it’s data scientists who work with data and know how to create value from data, typically by building AI models or just simple machine learning models. Then making predictions, choosing actions, and executing those actions. One example is DeFi, where you can predict the price of ETH 10 minutes from now. And if you are right more often than you are wrong, you can make a buying or selling action.

“I believe that Ocean will become ubiquitous, and we are on track with that. How do you get to the ubiquity and its sustained growth over long periods? Facebook has become ubiquitous right now. It started with just college students and Harvard. But then it spread itself one university at a time. And eventually, after years and years of sustained growth, it became ubiquitous. And hopefully, it won’t be ubiquitous forever because it’s centralized. And typically, things fade. Ocean, though, I hope, becomes ubiquitous and has decentralized goals. It is an infrastructure for civilization, just like TCP/IP with web protocols on top.”

While deciding on vertical focus, one good metric is the time needed for going through the data value creation cycle. For DeFi, it can be as little as 10 minutes, and for areas like drug discovery, it can be as long as ten years for data to get monetized with all FDA approvals etc. In those terms, the lowest-hanging fruit is decentralized finance.

Build, build, build: V4, veOCEAN and Data Farming

With the launch of V4, Ocean solved these three problems: base IP NFTs, rug pulls, and much better monetization for the community. But before that, Ocean had already launched Data Farming. In Trent’s words, Data Farming incentivizes the growth of data consume volume in the Ocean ecosystem. It rewards OCEAN to pool liquidity providers (stakers) as a function of consume volume and liquidity. It’s like DeFi liquidity mining, but tuned for data consumption. DF’s aim is to achieve a minimum supply of data for network effects to kick in, and once the network flywheel is spinning, to increase the growth rate.

Questions at that point: How do we solve the problem with free assets for Data Farming, fixed price assets for Data Farming, and the pool draining attack? The solution to this was built and shipped by Ocean in the fall of 2022: veOCEAN. It stopped all the issues from before. Read Trent’s blog here on Introducing veOCEAN and how Data Farming continued with the veOCEAN launch.

“There’s a macro-level problem in crypto. How do you reconcile near-term greed with long-term incentives? You’ve got the people who want to ape into something, make their 2x in a week or a month, and then ape out–take your profits and go. If you would have asked me three years ago, four years ago, five years ago, I would have said just ignore those people. But the fact of the matter is in crypto, they exist, and you should try to make your system somewhat attractive to them, even if they’re not your main target. You still want to be interesting to them, if you can. But for the long term, I still really want to make sure that we serve the builders around data, the data scientists,” concludes Trent in the first part of this interview.

Here is a list of different topics discussed during the two-episode series interview:

The focus of the core Ocean team in 2022
Different user groups of Ocean Protocol
How Data Scientists can use Ocean
Ocean’s focus on constantly building: veOCEAN and Data Farming
Different use cases in DeFi and Ocean
Trent’s view on the future of AI in 2022
Trent’s take on 2022 as a year for the crypto industry and what to look forward to in 2023

We will release the second part of the interview soon. Stay tuned.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

Trent McConaghy: Value Creation in the New Data Economy with Ocean Protocol was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Transmute TechTalk

Using Decentralized Identifiers (DIDs) to Authenticate Your Devices (Device Arbitration)

DIDs Prove Control with Private Keys

Decentralized identifiers (DIDs) are a way to identify and authenticate devices. Unlike traditional unique identifiers, DIDs have a private key associated with them that proves control of the identifier. This is especially useful for devices, as they can hold their own private key and make assertions about their authenticity.

One way to secure devices is to use a hardware-protected private key, such as a TPM (trusted platform module) or TrustZone in an ARM processor. This ensures that the private key is not extractable and can only be used to make signatures in a trusted execution environment. The public key, on the other hand, can be stored on a verifiable data registry and associated with a DID, allowing it to be retrieved upon request.
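A rough sketch of the resulting challenge-response pattern, using an in-memory Ed25519 key from Python's cryptography package. In a real deployment the private key would stay inside the TPM or TrustZone and the verifier would resolve the public key from the device's DID on a verifiable data registry; the DID and document structure below are simplified assumptions.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Device side: generate a keypair (in practice inside the TPM/TrustZone so the
# private key is never extractable) and publish the public key alongside a DID.
device_key = Ed25519PrivateKey.generate()
did_document = {
    "id": "did:example:device-1234",   # hypothetical DID for illustration
    "publicKeyRaw": device_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    ),
}

# Verifier side: challenge the device with a fresh nonce.
nonce = os.urandom(32)

# Device signs the nonce with its hardware-protected private key.
signature = device_key.sign(nonce)

# Verifier resolves the DID (here: reads the document directly), rebuilds the
# public key, and checks the signature to confirm control of the identifier.
public_key = Ed25519PublicKey.from_public_bytes(did_document["publicKeyRaw"])
try:
    public_key.verify(signature, nonce)
    print("Device proved control of", did_document["id"])
except InvalidSignature:
    print("Device authentication failed")
```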

Keys Can Be Generated at the Beginning or Middle of Device Arbitration Process

The timing of public/private key generation on a device depends on the specific use-case. For example, if the device needs to be authenticated as a genuine product from a particular brand, the keys may need to be generated at the time of assembly. On the other hand, if the device is being provisioned for a specific use-case, key generation may take place during the provisioning stage.

Using Key Generation to Protect User Privacy

It’s important to consider privacy when using DIDs and hardware-protected keys to authenticate devices. In cases where devices are being sold to the general public, it’s best practice to generate new keys to preserve the privacy of the user.

Benjamin Collins, Transmute’s Technical Product Owner, spent 10 years working with Linux and Open Source as the creator of DashGL.com, a site which provides tutorials for writing simple games for Linux in C using OpenGL and the GTK toolkit.

Connect with Ben on Twitter and GitHub

About Transmute: Building on the security and freedom that Web3 promised, Transmute provides all the benefits of decentralization to enterprise teams seeking a cost effective, interoperable, planet-forward experience provided by experts in technology and industry.

Transmute was founded in 2017, graduated from TechStars Austin in 2018, and is based in sunny Austin, Texas. Learn more about us at: http://www.transmute.industries

Connect with Transmute on LinkedIn and Twitter

Using Decentralized Identifiers (DIDs) to Authenticate Your Devices (Device Arbitration) was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Automating the SOC


by Paul Simmonds

SOAR, Security Orchestration, Automation and Response, is the latest in a line of security buzzwords to hit the market; and while SOAR may have only emerged in the last few years, it is fast becoming an essential tool for organizations.

OWI - State of Identity

Anti-fragile Identity

On this week's State of Identity podcast, host, Cameron D’Ambrosi sits down with Aaron Goldsmid, VP of Product for Twilio Communications Platform. They discuss verified identity as a primitive of the internet and the digital “anti-fragile identity” becoming better than the IRL.



KYC Chain

Precious metals, NFTs and KYC compliance

The precious metals market has a history stretching back thousands of years. More recently, innovative businesses have been finding ways to use modern technology to optimize and improve this age-old asset class. In this article, we take a look at how some innovative projects and companies are using blockchain technology to back ownership of precious metals – and how Automated KYC is being used to

Ontology

Meet the Team: Loyal Ontology Community Member, Sasen

What’s your name and where are you from?

I’m Sasen D, one of Ontology’s Loyal Members. I’m from Sri Lanka. I joined Ontology’s community in April 2019.

Tell us a bit about yourself. What are your hobbies? Do you have any pets?

I am a passionate sales and marketing professional with over 11 years of experience in the field and I have completed some professional certifications related to the subject. I am also married and have a child. I love animals and have a cat, but other than that, my hobbies include music and playing football.

What kind of work do you do on a day-to-day basis?

I work mainly on an online basis, working freelance with the crypto and blockchain communities, outside of my professional role in marketing and sales. I was part of the Ontology member base in 2020, and I attended as many events as I could. This is really how I got so closely involved with Ontology. Because of this, Ontology named me as its first Loyal Member of the team. Now I deal each week with Loyal Member discussions, join Twitter Spaces, and continue to play a role in the wider Ontology community.

In your opinion, what makes Ontology stand out from other blockchains?

I believe that Ontology’s node staking performance is the highlight of its offerings, as well as its trusted decentralised solution, easy to manage platforms, and 24/7 online support.

What is the most exciting part of the project you’re working on for Ontology?

The weekly Loyal Member discussion is the most exciting part of Ontology for me. Seeking out new ideas, topics, and opinions to discuss each week and share with the community increases my personal knowledge. Additionally, engaging in discussions with the community allows me to develop ideas with them and contribute to Ontology’s innovations.

What has been the most important moment in your career so far?

That is quite an interesting question. I got a huge amount of value for my work in sales and marketing when I worked at a private university. Career-wise, this was probably my best personal accomplishment to date.

What are you most excited for in your future at Ontology?

I am excited to be an Ontology Harbinger and have the opportunity to share my enthusiasm for the project with others.

As we mark Ontology’s five-year anniversary, where do you see Ontology and Web3 going in the next five years?

We can expect to see more creative uses for digital currency over the coming years, which could have a meaningful impact on many people’s everyday lives. Web3 will be the main cause of this so Ontology will naturally play a large part. Furthermore, decentralized identities are the future. This is something that may be a core part of Ontology’s strategy over the next five years.

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Meet the Team: Loyal Ontology Community Member, Sasen was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Innopay

INNOPAY’s Night of Insight: Open Banking Monitor Update

INNOPAY’s Night of Insight: Open Banking Monitor Update, Thursday 23 February 2023, TechQuartier, Platz der Einheit 2, Frankfurt am Main

On Thursday 23 February we are organising INNOPAY’s Night of Insight from 18:00-20:00 at the TechQuartier in Frankfurt. This bi-monthly event at our offices is centred around sharing the latest findings from both our market research and our project experiences. We will kick off the event series with a session on the latest iteration of INNOPAY’s much-anticipated Open Banking Monitor.

During the Night of Insight, we will especially zoom in on how financial institutions in Germany have developed their product offerings and progressed on their Open Banking journey.

Where:           TechQuartier, Platz der Einheit 2, Frankfurt am Main

Date:              Thursday, 23 February 2023

Time:              18:00-20:00

Register now!


KuppingerCole

HID Global Authentication Platform


by Alejandro Leal

In recent years, investment into cybersecurity has soared but, in most cases, these efforts have not fully addressed the reliance on passwords and the challenges they introduce. Without secure access to resources, businesses and organizations are at risk of suffering from phishing and social engineering attacks.

Identosphere Identity Highlights

Identosphere 117: Incumbents fight Verifiable Credentials • Metamask extended with MFA Snap • Ripple Adopts SSI?

We curate the latest in decentralized identity, with upcoming events, use-cases and developments in open standards and the public sector. Thanks for supporting our efforts by Paypal or Patreon
Welcome to Identosphere - We Gather, You Read

Contribute to our efforts by PayPal, or Patreon, & we’ll keep aggregating industry info. Be sure your blog has an RSS feed so we can better track and feature your content! Send us a message at newsletter@identosphere.net if you have an RSS feed you want us to track for this digest.

Critical WARNING for our industry

Battle for the brand [Verifiable Credentials] by Anil John

This in turn is generating allergic reactions from incumbent players who have often played the gatekeeper role in this domain, who after ignoring and laughing at the work for the longest time, are now becoming concerned about its success, traction and global adoption.

Their reactions are manifesting themselves in some specific ways

Upcoming 

Entra Verified ID: A trustworthy way to verify remote employees 1/19

Identity, Authentication and the Road Ahead: A Cybersecurity Policy Forum 1/25

GS1 Global Forum 2/13-16

APAC Digital Identity unConference 3/1-3 Bangkok, Thailand

Thoughtful Biometrics Workshop virtual unConference 3/13-17 [Registration]

Internet Identity Workshop 4/20, Mountain View, CA

Hiring

YeshID hiring a Web Frontend Generalist.

Explainers

Intro to Self-Sovereign Identity Allison Fromm, proofspace

Decentralized Identity: Giving Consumers Power Over Their Identities Forbes

Video describing Microsoft Entra

What our Chief Architect said about Decentralized Identity to Delay Happy Hour

[employment] Open Recognition is for every type of workplace WeAreOpenCoop

[employment] Proven Works — The Future of Employment Verification Indicio

[usecase] What are the Uses of Decentralized Identities? Noname.Monster

[book excerpt] Defining Digital Identity Windley.com

[101] What is Digital Identity? HoloChain

[101] What are Decentralized Identifiers? Identity.com

[101] DID and Verifiable Credential—Let’s Talk About Something New zCloak Network

[101] DWeb 2022 Talk: Decentralized Identity Open Standards IdentityWoman

IAPP Event: An Intro for Data Privacy Pros to Self-Sovereign Identity IdentityWoman

Looking Forward Decentralized Verifiable Credentials for the future of Identity and Access Management YeshID

In Europe, thanks to EBSI, you might see the advantages of verifiable credentials when moving between borders or notarizing documents. And maybe somewhere else entirely, you will find some self-issued digital identity that misses BitchX and talks complete nonsense, anonymously and securely. All of these are powered by DIDs and VCs.

Company Stories Ocean Protocol and http://Fetch.ai will drive a revolutionary change in the automotive industry KryptoInsider

Companies participating in Catena-X are collaborating to develop a secure, open network for cross-company data and information exchange across the entire automotive value chain.

Eclipse Dataspace Components

[explainer, video] Ripple Adopting Decentralized Identity Tomiwabold Olajide

Hear from Ripple GM, [Monica Long]—in under 60 seconds!—how decentralized identity can give individuals a new power to control their personal, online data in a Web3 future.

Spruce 2022 in Review

In April, we announced a $34M Series A round led by a16z crypto. This funding enables us to lead research efforts in cutting-edge privacy and usability technology for identity, while growing our product teams and executing partnerships across the Web3 ecosystem. You can learn more in our announcement here.

Spruce Developer Update #27

Kepler is a decentralized storage network organized around data overlays called Orbits. Kepler allows users to securely share their digital credentials, private files, and sensitive media with blockchain accounts, all using their Web3 wallet.

Literature Decentralized Identity as a Meta-platform: How Cooperation Beats Aggregation Michael Shea, Samuel M. Smith Ph.D., Carsten Stöcker Ph.D., with contributions from Juan Caballero Ph.D. and Matt G. Condon

the network scaling law for meta-platforms differs from the network scaling law traditionally seen on closed platforms today, and we will examine how the cooperating members of a decentralized identity meta-platform may out-compete traditional, centralized identity platforms

Averting Cambridge Analytica in the Metaverse: Identity, Privacy… Anastasia

“What would it take to build a sufficiently robust privacy moat around every digital citizen such that the costs of surveillance capitalism become too onerous for companies to pursue?”

Business [podcast] Selling Solutions, Not SSI Technology (with Riley Hughes)

Customers need solutions, not SSI technology. We will struggle in fostering adoption if we try to sell SSI technology to end customers. We should rather focus on selling them solutions to business problems. For example, Slack sells productivity/collaboration tools to enterprises, not Internet technology or communication protocols. The same approach should be taken by us identity folks!

Organization Schellman Joins the Voilà Verified Trustmark Program DIACC

The Digital ID and Authentication Council of Canada (DIACC) is pleased to officially welcome Schellman to the Voilà Verified Trustmark Program – the first and only certification program to determine digital identity service compliance with the Pan-Canadian Trust Framework™ (PCTF)

European EU sets digital targets to empower people and organisations Innopay

Earlier this week, the European Commission published its digital targets for the future in the European Digital Decade Policy Programme 2030 (DDPP). This lays out its vision for empowering EU citizens and businesses through digital transformation from now until the end of the decade. The DDPP outlines concrete objectives in the following four domains

Standards Work A conversation with the Trusted Web podcast Content Authenticity Initiative [use DIDs + VCs]

A coming wave of content created with generative artificial intelligence and the importance of authentic storytelling enabled by digital content provenance technology to identify synthetic media and display attribution

How consumers, creators and industry may rebuild transparency online by engaging with provenance tools to advance verifiable media and digital literacy

The critical role of social media and search companies in adopting open-source tools and a standard for verifiable certificates and credentials  

Credible: Introducing Mobile Driver’s Licenses SpruceID

Federal agencies in the United States, namely TSA, have already committed to using ISO/IEC 18013-5, which means that state DMVs will need to build mDLs aligned with the same standard in order for their driver’s licenses to be usable for travel at airports. The ISO working groups, which Spruce is a part of, are still actively discussing methods for how an mDL is issued, presented online (ISO/IEC 18013-7), and refreshed.

Briefcase: Share Small Fragments of Structured Data Using DIDs Transmute

excited to participate in the Linux Foundation’s newest initiative, the Open Wallet Foundation. [...] As part of our helping evaluate open source building blocks for digital wallets, we built: Briefcase 💼

IOT - Digital Twin Identity of Things: verifiable credentials are safer for IoT systems Wider Team

Capabilities-based access control is a promising paradigm that can handle the particularities of IoT systems.

Nevertheless, existing systems are not interoperable and they have limitations, such as lack of proof of possession, inefficient revocation mechanisms, and reliance on trusted third parties.

In this paper we overcome these limitations by designing and implementing a system that leverages Verifiable Credentials (VCs) to encode the access rights. Our solution specifies protocols for requesting and using VCs that can be mapped to OAuth 2.0, includes an efficient and privacy preserving proof of possession mechanism, and it supports revocation. We implement and evaluate our solution and we show that it can be directly used even by constrained devices.

Products Passports & Trust Registries: Trust Continuum™ Methodology. Continuum Loop [Digital Twin]

Our Trust Continuum™ Methodology helps organizations integrate digital product passports into their supply chain. We help you understand how you can use digital product passports to establish trust and accountability in your supply chain and how they can streamline processes and reduce costs. We help you assess the risks and benefits of integrating digital product passports into your existing systems and help you develop a plan for implementing them.

Thoughtful How to survive outside of the state with Crypto Agorism Anarkio

Slides & transcript: https://agorism.blog/anarkio/survival-outside-the-state

Fair and free markets for food, jobs, housing, healthcare, mail, sim cards & more – no government ID or state permit required.

Beyond the fetish of open OpenFuture, Balázs Bodó

The future of the internet is not what it used to be. Again. With web3, it is now the third time the promise of an exciting, novel, maybe revolutionary social, political, economic, cultural experiment founded upon open, freely accessible, distributed, decentralized techno-social infrastructures degrades into a dystopian nightmare. 

[tweet] I just realized there’s a term for the problem I see arising with VCs. KyleDH

[Jevons Paradox] As VCs make it easier to share certified information, more use cases will arise to demand this information. I’m hesitant that this is useful in society though.

DWeb So many years we fought with the Twitter API @shanselman

I love that if you want an RSS of someone's Mastodon feed, you just add .rss at the end of their URL. Open Web, my friends. https://hachyderm.io/@shanselman.rss
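For instance, a minimal standard-library fetch of such a feed (the URL is the one from the quoted post; any RSS reader works just as well):

```python
# Fetch a Mastodon account's RSS feed and list the linked posts.
from urllib.request import urlopen
from xml.etree import ElementTree

with urlopen("https://hachyderm.io/@shanselman.rss") as response:
    feed = ElementTree.parse(response)

print(feed.findtext("channel/title"))   # feed title, if present
for item in feed.iter("item"):
    print(item.findtext("link"))        # one link per post
```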

Web 3 Web3Auth MPC Snap: Integrating Multi-Factor Authentication Into MetaMask Web3Auth

Infominer: The big story here is that Snaps allow you to securely extend the features of Metamask (hint bring SSI).

Snaps is the roadmap to making MetaMask the most extensible wallet in the world. As a developer, you can bring your features and APIs to MetaMask in totally new ways. Web3 developers are the core of this growth and this series aims to showcase the novel Snaps being built today. Web3Auth Snap

[tweet thread] New technology brings massive opportunity. But 99% will miss the boat @mishadavinci

Here's why you MUST finally learn web3 (in 2023) […]

[announce] Welcome to Disco District

The goal of Disco District is to allow someone to prove something about themselves off-chain to get access to do something on-chain. To do this, Disco makes use of our two favorite crypto primitives: Decentralized Identifiers and Verifiable Credentials

[partnership] Unstoppable Domains partners with Ready Player Me to improve interoperability for decentralized identities NFT News

Unstoppable Domains joined the Open Metaverse Alliance, a group of blockchain-based metaverses and Web3 platforms working to overcome interoperability challenges in the NFT industry, specifically on digital identity and avatar standards across metaverses. The latest collaboration with Ready Player Me is a clear effort towards furthering the Open Metaverse Alliance’s agenda. [using DIDs]

Twitter @by_caballero shares
Thanks for Reading

Read more \ Subscribe: newsletter.identosphere.net

Please support our efforts by Patreon or Paypal

Contact \ Submission: newsletter [at] identosphere [dot] net

Wednesday, 18. January 2023

KuppingerCole

KC Open Select: Your #1 Shortlisting Tool


Discover and Compare Cybersecurity Solutions for Free

Optimize your decision-making process with the most comprehensive and up-to-date market data available.

Configure your individual requirements to find the right vendor for your business or follow the best practice recommendation of an unbiased research analyst.

Passwordless Authentication coming in Q1 2023!

Learn more: https://go.kuppingercole.com/open-select




Shyft Network

India’s CBDC Project: What is it Upto?

The central bank of India, RBI, started with India’s CBDC pilot project in the last phase of 2022. India’s CBDC pilot has two segments: Digital Rupee Wholesale and Digital Rupee Retail. The pilot is active within a closed user group with eight banks entrusted with the initial responsibilities.

Reserve Bank of India (RBI), India’s central banking authority, defines CBDC as “the legal tender issued by a central bank in digital form.” However, it differs from the existing digital money in the country — a CBDC is the liability of the Reserve Bank, not of a commercial bank.

According to a brief press note made available by the Ministry of Finance on December 12th, 2022, the RBI has launched CBDC pilots in both the Wholesale and Retail segments.

Digital Rupee-Wholesale

Launched on November 1st, 2022, this segment of the CBDC pilot aims to facilitate settlements of secondary market transactions in government securities. The goal is to increase the efficiency of the interbank market.


The authorities believe that settlement in central bank money would reduce transaction costs. It would be better positioned to pre-empt the need for settlement guarantee infrastructure or collateral to mitigate settlement risk.

Digital Rupee — Retail

Launched a month later, the retail version comes in the form of a digital token representing legal tender. It is issued in the same denominations as paper currency and coins, and financial intermediaries, such as banks, can distribute it. The user must have a digital wallet issued by the respective bank.

In the Digital Rupee-Retail pilot project, transactions can happen the same way as the legal tender: person to person (P2P) and Person to Merchant (P2M). It will come with other legacy tender qualities/advantages relating to trust, safety, and settlement finality. Although it will not be eligible to earn interest, one can convert it to bank deposits or other forms of money.

The Pilot Project

The Digital Rupee — Retail project started with a closed user group (CUG) as its first set of participating customers and merchants. RBI has named eight banks for phase-wise participation in the retail pilot project.


These banks include the State Bank of India, the ICICI Bank, the Yes Bank, and the IDFC bank in the first phase and the Bank of Baroda, Union Bank of India, the HDFC Bank, and the Kotak Mahindra Bank in the second one.

Relevant Article: China Leads the Digital Currency Future

According to mid-January reports compiled by the Indian State-owned banks, trades worth INR7,140 Crores (more than US$850 million) were settled in November 2022 alone.

The Vision Behind India’s CBDC Project

The Reserve Bank of India, while preparing the concept note on its CBDC project, looked carefully at use cases available globally to understand the benefits of a digital currency.

For instance, the RBI considered CBDCs as a means to restrict the excessive physical movement of cash and as a better option than private virtual currencies.

The RBI believes the CBDC could be a preferred financial instrument for many in the country. It has an Index for measuring people’s inclination towards digital payments, active since 2018, known as the RBI-DPI index.

The RBI-DPI index, which had a value of 100 in March 2018, increased to 349.30 in March 2022, a growth of almost 3.5 times in four years. This indicates that people’s growing interest in making digital payments was a reason for the RBI to launch its CBDC pilot.

The Future of India’s CBDC Program

While experts see user behavior as the key to the CBDC program’s success, there is a lot that the country can do to ramp up adoption.

In a country like India, where a significant share of the population lives in rural areas, creating awareness about the benefits of the CBDC will be crucial. Rural payment infrastructure must also be backed by reliable, fast internet connectivity.

The CBDC, wholesale or retail, must fulfill some unique user needs that legacy currency cannot. Those who have already adopted it must be equipped with offline payment facilities and should not have to worry about their data being breached.

Only time will tell whether these efforts will lead to a paced-up adoption.

India & Cryptocurrencies

On the crypto front, at one point, India was among the top five countries in terms of adoption. However, several anti-crypto measures, including a 30% tax on crypto profits, no way to offset losses from one cryptocurrency against another, and a 1% TDS on each crypto transaction, dealt a massive blow to the local crypto ecosystem.

Recently, the RBI governor likened cryptocurrencies to gambling, which shows that the central bank is against the proliferation of private digital assets in the country. It also believes the growing use of cryptocurrencies can lead to the dollarization of the economy and has urged the Indian government to ban them.

So far, the government hasn’t banned crypto or regulated them despite introducing a bill for a potential ban on private virtual currencies in 2021. However, the bill was never taken up for debate.

What could be the reason behind this change of heart? After studying cryptocurrencies and their underlying blockchain technology, the government realized that the borderless nature of digital assets requires global cooperation to enforce a ban or even regulate them under any specific framework.

Finally, with India taking over the G20 presidency, the country saw an opportunity to initiate a global discussion on crypto regulations and build a consensus.


Entrust

Decentralized Identity – Know Your Customer (Kyc)


It is a well-established fact that credential theft is the most common attack vector cybercriminals use to defraud consumers. In fact, financial institutions globally spend hundreds of billions of dollars a year on anti-money laundering compliance and technology solutions that help protect and authenticate consumer identity (i.e., identity proofing, biometrics, and multi-factor authentication, to name a few).

Creating and implementing effective customer identity verification processes is essential to reducing the risk of fraudulent transactions and ensuring regulatory compliance.

One such approach is to share information among banks and governments through a multi-step model called Know Your Customer (KYC), which may involve a central database maintained by a consortium of banks, like Swift, whereby consumer data is used to verify identity and flag any potential risks of doing business with a customer. Self-sovereign identity (SSI) is a new decentralized identity model that can help banks solve the existing challenges of sharing KYC information securely and cost-effectively.

With decentralized KYC, the customer can be issued a KYC verifiable credential that can be cryptographically verified and stored in the customer’s mobile wallet. Verifiable credentials are trustworthy, tamper-proof, and machine-verifiable digital identity documents. They can be issued by KYC providers and verified by other service providers for authenticity and ownership. The customer may proactively trigger issuance of such credentials, or issuance may happen as part of an existing KYC process. KYC verifiable credentials provide a strong benefit to all involved – a seamless customer experience, as well as simplified and more cost-effective processes for banks and service providers.

A lot of progress has been made in recent years in the field of self-sovereign identity:

Ratification of standards like W3C VC and DID
European Blockchain Services Infrastructure (EBSI) framework matured, with POCs and pilot projects under way
eIDAS 2.0 close to being released
British Columbia Government launches VON (Verifiable Organizations Network)
Increased number of SSI networks all around the world

eIDAS 2.0 will mandate business and government organizations within the EU to accept decentralized identity verifiable credentials stored in digital wallets. KYC providers must ensure they are ready to integrate with the EBSI framework.

Benefits for all when KYC verified credentials are implemented

Incremental deployment of AI/ML technology has enabled online KYC checks that provide a significant improvement over the manual KYC process. With the manual KYC processes, a considerable amount of time and resources are spent on reviewing checks for the same user repeatedly.

Imagine a user onboarding experience using a verifiable KYC credential as follows:

1. Customer visits service provider website
2. Customer scans a QR code with their mobile wallet
3. A secure encrypted channel is established between the mobile wallet and website
4. Service provider requests KYC credentials
5. Customer consents to sharing their verifiable KYC credential
6. Service provider website lets them in

And this process only takes a few seconds!
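On the service provider's side, the decision behind the final step can be thought of roughly as below. This is a simplified sketch with made-up field names and a hard-coded issuer list; a production verifier would use a full SSI agent stack, the W3C VC data model, and a revocation check.

```python
from datetime import datetime, timezone

TRUSTED_KYC_ISSUERS = {"did:example:kyc-provider"}   # assumed list of accepted issuers

def accept_kyc_credential(credential: dict, signature_valid: bool) -> bool:
    """Decide whether a presented KYC verifiable credential lets the customer in."""
    if not signature_valid:                       # tamper/ownership checks done by the wallet stack
        return False
    if credential.get("issuer") not in TRUSTED_KYC_ISSUERS:
        return False
    expiry = datetime.fromisoformat(credential["expirationDate"])
    return expiry > datetime.now(timezone.utc)    # credential must still be valid

sample = {
    "issuer": "did:example:kyc-provider",
    "type": ["VerifiableCredential", "KYCCredential"],
    "expirationDate": "2024-01-01T00:00:00+00:00",
}
print(accept_kyc_credential(sample, signature_valid=True))
```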

The reusable nature of KYC verifiable credentials and user control of their own data lends itself nicely to an exceptional user experience and builds confidence around privacy. Additionally, the reusable KYC credentials enhance business process efficiency and reduce costs.

Repeated cost of passport and driver license scans, facial recognition, and liveness detection for KYC can add up and impact the P&L significantly. With KYC verifiable credentials, the cost is reduced significantly, and that financial benefit can be passed on to the banks by KYC providers while maintaining higher margins.

As decentralized verifiable credentials are stored in mobile wallets, there is significantly reduced risk of data being stolen. Mobile wallets enable user consent to ensure no information leakage, as well as maintaining a strong security and privacy posture.

With rapidly increasing adoption and acceptance of  decentralized identity (~27% CAGR), KYC providers can benefit from issuing KYC verifiable credentials.

Contact our experts to discuss your organization’s needs.

The post Decentralized Identity – Know Your Customer (Kyc) appeared first on Entrust Blog.


KuppingerCole

Mar 21, 2023: Urgent: Find and Block Identity-Centric Security Threats Today

The inability to deal with identity-centric cyber threats is one of the most critical issues facing modern enterprises. Attackers are increasingly targeting digital identities to gain unauthorized access to systems and data. Action is essential, but detecting unauthorized access is challenging.

Ontology

Ontology Weekly Report (January 10–16, 2023)

Highlights

The Ontology community has reached 10,000 followers on CoinMarketCap. Let’s get to the next 10k!

Latest Developments

Development Progress

We are 100% done with the Rollup VM design. The White Paper will be published soon.
We are 98% done with the Rollup L1<->L2 cross-layer communication.
We are 98% done with the Rollup L1<->L2 Token Bridge.
We are 99% done with the L1 data synchronization server.
We are 99% done with the L2 Rollup Node.
We are 92% done with the L2 blockchain browser.
We are 10% done with the EVM bloom bit index optimization.
We are 10% done with the high ledger memory usage optimization.

Product Development

ONTO App v4.4.6 integrated KCC and Canto Chains, added asset swap on Cronos and Aurora Chains, and enabled Vision Chain assets in Red Packet.
ONTO published the December monthly report, summarizing a series of functional optimizations, including integration of the Ripple Ledger, Litecoin, Celo and ENULS Chains, and added asset swap on Harmony and Boba Network.
ONTO partnered with IoTeX to host a Telegram Quiz campaign with $100 USDC in rewards. Follow the @ONTO Wallet Official Announcement on Telegram for more details.

On-Chain Activity

160 total dApps on MainNet as of January 16th, 2023.
7,273,509 total dApp-related transactions on MainNet, an increase of 7,245 from last week.
18,223,198 total transactions on MainNet, an increase of 14,623 from last week.

Community Growth

We held our Weekly Community Call with the theme of “The Dimensions of Reputation”. The construction of Web3 reputation is an important issue for Web3 trust and security, and it is also an element of project construction such as DeFi and DAO. How to build reputation and how to choose reasonable scoring dimensions are issues that reputation providers need to take seriously.
We held our weekly Telegram Community Discussion led by Ontology Loyal Members, discussing the theme of “CEX vs DEX” and comparing the advantages and disadvantages of each. Participants also got the chance to win Loyal Member NFTs.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

Ontology published the “Meet the Team” series and interviewed Ontology’s Head of Community, Humpty Caldero. He shared his thoughts about Web3 in the next five years:
What we’re going to see is more people owning their identity and data whilst creating monetization opportunities. Both privacy and self-sovereignty are important to people
In the Media

Cointelegraph — Using blockchain technology to combat retail theft

Blockchain technology could be the world’s solution to ending retail theft! By registering products on the blockchain with unique IDs, items such as power tools would then only be activated once purchased and could be programmed to be unusable if stolen.

“Through Project Unlock, a unique ID is registered and assigned to each of our power tools. When that product is purchased, the RFID system activates the power tool for use. At the same time, the transaction can be viewed by anyone, since that information gets recorded to a public blockchain network.”

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (January 10–16, 2023) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Feb 08, 2023: Unlock the Potential of Passwordless Authentication

The idea of passwords becoming obsolete has been discussed by the IT industry for years, if not decades. In recent years, Passwordless Authentication has become a popular and catchy term. It is used to describe a set of identity verification solutions that remove the password from all aspects of the authentication flow, and from the recovery process as well. Some passwordless options have been around for a while but are starting to be implemented more by enterprises and even consumer-facing businesses.

UbiSecure

Open Metaverse and the Importance of Self-Sovereign Identity, with Dr Mark van Rijmenam, The Digital Futures Institute – Podcast Episode 83

The post Open Metaverse and the Importance of Self-Sovereign Identity, with Dr Mark van Rijmenam, The Digital Futures Institute – Podcast Episode 83 appeared first on Ubisecure Customer Identity Management.
Let’s talk about digital identity with Dr Mark van Rijmenam, Founder and Future Tech Strategist at The Digital Futures Institute.

Dr Mark van Rijmenam joins Oscar to discuss the importance of Self-Sovereign Identity in the Open Metaverse – including his definition of metaverse, derived from his interviews with entrepreneurs for his latest book, the motivations for entrepreneurs to be building assets in the metaverse, the role of identity and its importance in the open metaverse.

[Transcript below]

“I think it’s crucial that we own and control our own data, that we control our own digital assets, and that we control our own identity and reputation.”

Dr Mark van Rijmenam is The Digital Speaker. He is a leading strategic futurist who thinks about how emerging technologies change organizations, society and the metaverse. He is the founder of the Digital Futures Institute, with a mission to ensure a thriving digital future for business and society. Van Rijmenam is an international keynote speaker, and 5x author. His latest book is Future Visions, which was written in five days in collaboration with AI.

Find his articles and books at The Digital Speaker.

Connect with Mark on LinkedIn or Twitter.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!

Go to our YouTube to watch the video transcript for this episode.

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Come and meet the Ubisecure team at the Gartner Identity and Access Management Summit, in London, on the 6th and 7th of March. To find out more, take a look at the Ubisecure events page, www.ubisecure.com/events/.

Oscar Santolalla: Hello, and thank you for joining us for this first episode of Let's Talk About Digital Identity in the new year, 2023. We want to start by hearing some very futuristic things about the future, and we have a really amazing guest to start the year. Let me introduce Dr Mark van Rijmenam. He is The Digital Speaker, a leading strategic futurist who thinks about how emerging technologies change organisations, society and the metaverse.

He is the founder of the Digital Futures Institute with a mission to ensure a thriving digital future for businesses and society. Van Rijmenam is an international keynote speaker. He is five times author, and his latest book is Future Visions, which was written in five days in collaboration with artificial intelligence. I definitely want to hear more about that. Hey, Mark, welcome.

Dr Mark van Rijmenam: Thank you very much Oscar for having me on the show. It’s great to be here.

Oscar: Yes, definitely our pleasure. Well, happy New Year.

Mark: Happy New Year to you, too.

Oscar: Yes, we are still in the beginning of 2023.

Please tell us about yourself and how – what was your journey to this world of identity, metaverse and everything that you are doing today.

Mark: Sure, sounds good. Well, obviously you already gave a very nice introduction, but I'll add some things to it. So, I've been a keynote speaker for over a decade. I am a strategic futurist, which means I really think about emerging technologies, and I try to understand what these cutting-edge technologies mean for you and me, for organisations, for society, and how we can benefit from them.

Because these technologies are constantly evolving. So, I've been doing this for over a decade, I've been speaking all around the world about it, and, as you said, I've written five books. And I always try to practise what I preach, which means that when the pandemic hit, I created an avatar of myself, and created myself as a hologram, to deliver keynotes.

I'm currently working on building a digital twin of myself to understand the consequences of creating a digital twin of yourself – a synthetic human, so to say – and how it influences whatever we do. And I am very much involved in big data, blockchain, artificial intelligence and the convergence of these technologies, which are all coming together in the metaverse, the subject of my fourth book, Step into the Metaverse, where a big part is also focused on identity. Because I believe that the metaverse will unleash a sort of Cambrian explosion of identity, and it's very important how we deal with that.

I've also been involved in a start-up, which unfortunately failed, but that's start-up life. It focused on identity, on fighting misinformation with a reputation-based system. It's very challenging to do anything in this space because we are very much used to a certain identity system in our society, and shifting that is quite challenging. But I'm sure we'll get to that during this episode.

So yes, that’s basically what I do. And yeah, indeed, my latest book, Future Visions, written, edited and designed by AI, I’m sure some of you have heard of ChatGPT, which is taking internet by storm. And the moment it arrived, I thought, I’m going to grasp this opportunity to write a book with it.

So, I literally wrote it in five days, and I didn’t change a word. I didn’t – I maybe like five or ten words that I changed myself, but the rest is exactly written by AI. And it was an experiment for me to understand what is possible with off the shelf technology, and it’s quite surprising how good it is, but also how not good it is. It’s not the Holy Grail. It’s fantastic technology, but there are definitely some caveats. And it was a fantastic experience to do.

Oscar: That sounds very interesting. So, you wrote a full book just using ChatGPT, which many people have been talking about quite a lot these days, at least over the last two months, I would say. And yes, a super interesting journey you have had.

One of the last things you mentioned is misinformation. Every time I hear that word I think we really have to do more about it – and it's not easy, right? It's definitely not easy.

That will also come up in the metaverse, which is actually the main thing we'd like to discuss with you, skimming through the pages of your book Step into the Metaverse: How the Immersive Internet Will Unlock a Trillion-Dollar Social Economy. I read part of your book and it is very interesting, so let's go into that. To start with a common understanding, could you please give me your definition of the metaverse?

Mark: Yes. That’s a very good point to start because the metaverse is a very, very abstract concept which many people have different perspective of what it actually is. And for the book I did about 100 in-depth interviews with the stakeholders who are building the metaverse. I did about 150 surveys, and interestingly enough, I got like almost 250 different definitions of what the metaverse is, which sort of shows you how difficult of a concept it is.

I sort of derived my own definition from this, and to me, the metaverse is the next iteration of the internet: it's where the physical and the digital world are converging, where the physical moves into the digital and the digital moves into the physical. Now, there's a lot of information there, so we can briefly unpack it a little bit.

So, if we start with the first one, the physical moving into the digital: basically, this conversation that we are having, you could argue, is part of a very, very early phase of the metaverse, because you are physically in Finland, I'm physically in Australia, and we are digitally connected through our computers, having this conversation. It's a 2D connection – our screens are 2D, they are not immersive.

But you could argue this is part of the metaverse. Another part of the metaverse, which I think is very, very important, is, for example, digital twins, where we create a digital replica of a physical asset. We can just monitor it, or we can actually interact with it in the digital world, and then any changes that we make in the digital world will have an effect in the physical world. That's also part of the metaverse. And often people think that virtual reality is the most important part of the metaverse, but to me it's only one channel to access the metaverse in an immersive way.

The other part, the other channel so to say, is augmented reality, which means that we bring the digital into the physical world. I think that part is going to be much more important and much bigger, because it basically allows us to create infinite layers on top of reality. A layer can be for entertainment – so you can have a flying purple dragon above the Opera House here in Australia – or you can use it when you're driving, with augmented reality showing where there's a parking space available, or whatever you can come up with. And I think that's also a very, very important part of the metaverse.

I think in the next decade or so we will see that computers will disappear, smartphones will disappear, tablets will disappear. They will all be replaced by headsets at first – augmented reality headsets – and I think they will become miniaturised, very sleek glasses that you can wear. And you won't need a laptop anymore, you won't need a smartphone anymore, because you'll have it all in front of your eyes.

So, the metaverse is the immersive internet, and this internet will become as pervasive as the air we breathe. It will mean we move from making a conscious decision to go on the internet – today, if you want to go on the internet, you have to grab your phone and start doing something – to being "in" the internet: fully immersed in it and part of it, with the internet as pervasive as the air we breathe or the energy we use. This internet will be 3D, and that's much more in line with what we humans are used to, because we are 3D humans.

So, we thrive in a 3D environment much more so than a 2D environment. So that’s sort of what’s going to happen. There’s a lot of information, but in short, it’s where the physical and the digital world are converging, creating this immersive 3D internet that we can connect with and can be part of.

Oscar: Yes, you said that for writing this book on the metaverse you interviewed at least 100 entrepreneurs who are building their own versions of the metaverse or products related to it. From those conversations you have had, I'd like to know: what has been their main motivation? Why are they spending their time building these things and not something else? What, let's say, were the main motivations you found in common amongst these entrepreneurs?

Mark: Well, I think it’s a very good question. And I think what I noticed is that everyone that I spoke to, understands that the metaverse is the next iteration of the internet. It is the future. Whether we want it or not, whether we believe in it or not, it will define the next ten, 20, 30 years, if not more.

And so, any smart entrepreneur should dive headfirst into that, because if you had done that in the 1990s, you would have had a good chance to be the next Amazon. And that's what I think is happening here, because first we had the web, then we had sort of the mobile web with the launch of the iPhone, then the social web with the launch of all the social media platforms, and now we move to the immersive web.

So, there’s a ton of work to be done. There’s a ton of money to be made because, you know, several banks and a major strategy consultants say that by 2030, the metaverse will drive between 5 and 13 trillion dollars for the global economy.

Personally, I think it’s going to be a lot more, simply by looking at the impact that the internet had already on our society. So, it makes just good business sense to dive into the metaverse to see what you can contribute to this next iteration of the internet.

Besides, firstly, I think that the metaverse is a fascinating environment to work in, because it’s all novel, it’s all magical, it’s all – all the things that can become true in the metaverse. There are no laws of physics in the metaverse, so you’re not – we don’t have any restrictions on what we can build in the metaverse. And I think we can create this magical world, this magical virtual world, with these magical augmented digital experiences that are not possible in the physical world. And I personally find it fascinating.

So, I really enjoy being part of that. And I think over time when – the more we step into the metaverse, because mind you, the metaverse is still a few years out. The more a society steps into the metaverse, the more people will experience this magic as well.

Oscar: So, different motivations. It sounds to me like they know there will be this new paradigm, that the technology is coming anyway, and that entrepreneurs have to be there. It sounds like those are the main motivations.

Mark: Yeah, I want to add one to that, because I meant my book as a blueprint for an open metaverse. And an open metaverse is really focused on how we can create a metaverse that's there for us – for you and me, for consumers – and that's owned and controlled by us, and not necessarily controlled by big tech or a very, very tiny elite who control whatever we do online, which is how the current internet works.

We don't control our own data; we don't control our own digital identity. The internet is basically controlled by a handful of very, very powerful, very big technology companies. Now, with the metaverse, with the amount of data that you create in this immersive internet – which will be 100 times more than we create today, if not even more – I think it's crucial that we own and control our own data, that we control our own digital assets, and that we control our own identity and reputation. Because we don't want to live in a world where the Zuckerbergs of this world can decide whether or not you have access to this immersive internet. And I think that's something really, really important.

Of course, we have to build it in the correct way, because you know, with building something decentralised also come a lot of challenges. But that’s what I did for the book and most people that I spoke to, they tried to do that as well. So, for a lot of people that I spoke to, they’re driven by this quest of building an open metaverse that’s there for us. And to change the paradigm from a centralised internet to a decentralised.

Oscar: Yes, that's something I read in your book, the concept of the Open Metaverse. So, it's great that many of these entrepreneurs have that in mind. Something else you just mentioned is, of course, identity. Again, thinking of the companies who are now building the metaverse: how top of mind is digital identity? Is it a component they are thinking about every day – yes, this is part of the metaverse – or is it something that is neglected? What would you say?

Mark: Well, I think that digital identity is a very, very important part of the metaverse. And that was also confirmed by the very people that I talked to. Simply because, as I mentioned at the start, in the metaverse we can be whomever we want to be, whether that is, I don't know, a flying dragon, whether that is a walking piano, whether that's a talking mushroom – it really doesn't matter.

You can literally be whoever you want to be. And identity in the metaverse is really, really important – much more important than we think today. If we ask Generation Z, those born after 1995, or Generation Alpha, those born after 2010 – and this research has been done – to them, their digital identity is as important as, or even more important than, their physical identity.

Let that sink in a bit, because that's the paradigm shift: your digital identity being more important than your physical identity, a complete shift of mind and mindset. And therefore, we see that in the metaverse digital fashion is really important, because just like in the physical world, you want to dress a certain way to showcase who you are, to display your identity. You also want to do that in the metaverse. So digital fashion is a multibillion-dollar industry that lets people do that.

Now, what research has also shown is that the moment people can be whoever they want to be in the metaverse, they start experimenting with their identity. There's research showing that people switch gender just to understand what that means. There's also research showing that if you are an introverted person in the physical world and you use an extroverted character in the metaverse or in virtual reality, and you play with that character for a couple of hours, then you will continue to display those extroverted characteristics in the physical world afterwards. Fascinating, I think, how that works – our digital identity can affect our physical identity.

So now, of course, when we talk about identity in the metaverse, we also have to think about the challenges that come with it. Because if you can be a walking piano, for that matter, how do I know that that walking piano is Oscar? How can I be certain that I'm not dealing with someone else who has stolen your identity?

So digital identity – or in this case, I would argue, self-sovereign identity – is very, very crucial for the metaverse, especially an open metaverse. Less so for a closed metaverse, which is controlled by companies, because they can do an identity check and verify that you are a real person. Your identity can still be hacked and stolen, but it's easier to control.

That’s also has problems to it. In an open metaverse self-sovereign identity is really, really important because it allows us to control who has access to our data, for how long, to which data, and have full control over assets and our identity. So, I think if we think that identity is important on the current web, we have to think twice because it will be a lot more important in the metaverse. And for many millions of kids and teenagers that digital identity is already more important than the physical identity.

Oscar: Yes, that thing you just said, for the second time now – it's very, very important to think about, because we need to protect those identities. The big bulk of the people who are going to be in the metaverse when it becomes more ubiquitous, in the next 10 to 20 years, will be using it heavily, and we have to protect those identities.

Another thing you mentioned – take one of these examples, like a flying dragon. Say Oscar is a flying dragon in some metaverse, right? People inside that metaverse will see the flying dragon, and maybe my name. But how do I enter this metaverse? That's a point many people don't think about, right? I should have been, let's call it, logged in or authenticated properly in order to enter that metaverse.

Mark: Well, that's a major technical and cultural challenge that you just mentioned, because what we don't want is that if I go to Fortnite, into Roblox, into Decentraland, into The Sandbox, and into whatever other virtual world, I have to recreate my flying dragon every time, I have to create a new account every time – just like we do in the real world, actually.

So that’s not what we want to happen. Now in order to achieve that, we need interoperability. So, you need to be able to have an identity that you can take to a place just like you take your identity to a pub or a restaurant or club or whatever, in the physical world. So, we need to have that same approach.

But there are some companies working on this. Ready Player Me is a company building an avatar tool so that you can create your avatar once and then use that avatar on over a few thousand platforms already. So that's a start. It's a centralised company – there's no self-sovereign identity with it, nothing blockchain, nothing decentralised – so you don't actually control your identity, but at least it's a first step, that you create one account to do this.

But we already have that in the 2D world, which is called a Facebook login or a Google login, you know – log in with your Google account, log in with your Facebook account – which, by the way, I would recommend not doing. Yes, it is easy, but it also means that your data goes to Facebook, goes to Google, and they get even more insight into what you are doing. So please don't do that. I know it's easy, but just don't do it.

And so, from that perspective, your identity is really important, and we need to be able to build this interoperability so that you can take your avatar, your identity that you create – your flying purple dragon – to all these different platforms. And all these different platforms have different graphical requirements, different computational requirements, which makes it really, really challenging to do.

You know, if you go to platform A, it might be hyper realistic and your dragon looks really, really hyper realistic, but then you go to a platform like Minecraft or Roblox, which is very, very blocky, and how do you adapt to that? How do you have that one identity work in both worlds? That's a massive technical challenge, it's definitely not solved yet, and it probably requires quite a bit of work to achieve. But yes, what we need is interoperability, so that you can take your avatar, your flying dragon, and fly from one world to the other.

Oscar: Yes, exactly. You mentioned flying from one world to the other. How open are these companies – Fortnite, say, Disney, Minecraft, or whichever others? Are they open to that interoperability? Do you feel they are open to having that, or would they prefer to keep it closed?

Mark: Well, most likely they will prefer to have it closed, which I think is a very short-sighted approach. Yes, having a closed network offers you a lot of value. We only have to look at mobile messaging: WhatsApp was sold for $19 billion in 2014 for a reason, because it's a closed network, and you can't send a WhatsApp message to Signal or to Telegram.

In Europe, that's going to change with the new laws; in the rest of the world, probably not. So, we are very much used to not having this interoperability, because for such large companies it offers a lot of value. If we do have that interoperability, it brings a lot of value for society.

We only have to look at email, we are able to send an email from a Gmail account to a Hotmail account. Imagine that would not be possible or imagine that we, we don’t have interoperability for websites that you can only build a website for, I don’t know, Chrome and then you have to build a completely new website for Internet Explorer, and you can’t just switch between. Imagine what that would mean for the world, it would just ruin the internet.

And email is so successful because I can use Gmail, you can use Hotmail and we can communicate. So, I think it’s very short-sighted for these companies – I understand why they think like that, but I think it’s very short sighted and very selfish almost, to work on value extraction instead of value creation for society.

So, interoperability will add a lot more value to all these platforms. If you really make it nice and easy for people to come and also to leave, you will see that if you offer the best product, the best service, then people will still come and you will still make money. It's a different approach – a shift from a short-term shareholder approach to a long-term societal stakeholder approach.

And I think as a society, we need to make that shift from a short term to long term. And I argue and I call every organisation to, to make that shift. However, I’m also a realist and I know that that’s not very likely and that most likely regulation will have to step in to force these people because they probably will not do it by themselves.

Oscar: Yes, I couldn't agree more with this point, and I hope they are listening – listening to Mark and to everyone else who is saying this.

Mark: I hope so too.

Oscar: You already mentioned self-sovereign identity. Would you say that this is going to be the dominant paradigm in the metaverse?

Mark: Well, I hope so, and I think it should be, because it's a way for us to control who has access to our data. The best example here is, of course: if I go to a pub and I need to show that I'm over 18, currently I have to show my driver's licence. On my driver's licence there's a ton of information that's not relevant to the question "are you over 18, yes or no?", which is just a very simple question. Self-sovereign identity would allow us to answer that question, in a way we can trust, without providing all that other information. And I think as a society, we should want that.

We should be able to live in a world where we are not controlled by a centralised entity, because generally centralised entities corrupt, or they do if they become too powerful – in terms of countries, democracies change into non-democracies. So, I don't think that's the right direction. For me, from a humanity perspective, I think self-sovereign identity is the best approach.

Now, obviously there are also a lot of challenges to it, because if you own and control your own digital identity, it works with a private key and a public key, and your private key is 128 bits, or ideally even more. And you're going to lose this long string of numbers, because people lose passports and smartphones all the time. How are we going to deal with that?

That question hasn’t been answered yet and people will lose their private key. And if your self-sovereign identity is everything that you do and you lose it, then you are in really, really deep trouble. So, we need to solve that.

It hasn't been solved, and we need to solve it – because it's almost an oxymoron. You know, am I going to store my private key with a centralised entity? Then your self-sovereign identity is no longer self-sovereign.

And we saw that with the collapse of the various crypto exchanges: if you don't own your private keys, the money is not really yours, because it can just disappear. So self-sovereign identity is very, very important, but it hasn't been cracked yet, and there are still quite a few technical challenges we need to resolve here.

Oscar: Yeah, I believe so. It's super important to solve that problem, absolutely. You illustrated very nicely all these scenarios, mostly for individuals, I would say. But if we now focus our attention a bit more on businesses, even governments – organisations in general – what are the opportunities, or some scenarios you can see in the metaverse for organisations and businesses?

Mark: Well as I mentioned earlier, you know, the metaverse will contribute trillions and trillions of dollars to the global economy. So, there are enormous amount of possibilities.

There are possibilities for consumers, B2C: digital fashion, a multibillion-dollar industry; entertainment; immersive sports, watching sports or using augmented reality to bring a TV show into your living room – Disney recently released a sample of that, which looks amazing. Education: if you can learn in an immersive world, if you can walk around Rome for your history classes and pause whatever is happening to have a discussion with your teacher, that, of course, is a lot more powerful. But also from an enterprise perspective: if I am able to collaborate in a virtual world, in a 3D world, that's a lot more intuitive and much, much more logical for us humans to operate in.

And that will have a big, big impact. Early last year in 2022, I was part of a training done by almost a dozen police forces around the world. And they were doing an exercise in the metaverse, in virtual reality and working with, you know, physical evidence and digital evidence. Everyone was in their own location in Singapore, in UAE, Bahrain, Senegal, France and several other countries.

And they were able to solve this scenario, which was a terrorist attack in a hypothetical country. They all said afterwards that being able to collaborate in a virtual world was really nice and really easy to get along with, also because there was no hierarchy, since all the avatars looked the same – police forces are very hierarchical, of course, and that really helped as well.

So, there are a lot of benefits to this. You also see it, for example, in design companies. Car companies – Volvo is doing a lot here – are using virtual reality or even mixed reality to design cars with remote teams. So instead of building a clay model in a physical location, you build a digital model with your design team living or working anywhere in the world.

It doesn't matter where they are. All these things will have a big impact, and that will also have a big impact on society. Because, you know, if we think that the pandemic changed working from home, the metaverse will enable working from anywhere, where you can be literally anywhere you want in the world. And eventually, early next decade, I think you will feel as if you are physically present in the office while you are actually on a tropical paradise in the Pacific.

And that's something we are still heading towards – it's still far, far away.

Oscar: Yes, that sounds nice. Final question, Mark: for all the business leaders listening to us now, what is the one actionable idea they should write on their agendas today?

Mark: Educate yourself, because the world is changing so fast at the moment that if you blink your eyes, you've missed the train. We have seen that with AI, with all the generative AI stuff happening at the moment – even for me, and it's my job to know what's going on.

Even for me, it's sometimes difficult to understand, to follow, and to stay up to date with what's going on, because the developments are moving so fast. Now, if this is not your core job – which for 99.99% of people it isn't – it often ends up at the bottom of a very long to-do list.

But you need to understand what's happening, and ideally, as an organisation, I would also start experimenting with this stuff – small experiments, just to understand what's happening. And then you can take it from there.

Oscar: Yes, excellent. And as you say, you do what you preach – like your last book, doing your due diligence, doing this kind of stuff. Yeah, I think I have to do some experiments like that myself.

Mark: Well 100%, and for me doing these experiments, they help me to understand, to better understand these technologies. And so, if you want to understand what technology, X, Y, Z means for your business, start experimenting with it.

Oscar: Excellent. Well, thanks a lot, Mark. It's been a really fascinating conversation, going at moments very deep into digital identity, which is something we are very passionate about. You gave us really good ideas and updates on what's going on. If someone would like to continue the conversation with you or learn more about what you're doing, what are the best ways?

Mark: So, I'm pretty visible online. The easiest way is to find me on my website, thedigitalspeaker.com, where you'll find my books, my academic papers, my videos and my articles – I have almost a thousand articles on these topics, all available to consume. Feel free to email me, or connect with me on LinkedIn or Twitter. I'm happy to connect with anyone.

Oscar: Fantastic and again, it was a pleasure talking with you, Mark, and all the best.

Mark: Thank you very much for having me, Oscar. It’s been a great conversation.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episode at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

The post Open Metaverse and the Importance of Self-Sovereign Identity, with Dr Mark van Rijmenam, The Digital Futures Institute – Podcast Episode 83 appeared first on Ubisecure Customer Identity Management.


SelfKey

Certifier's Platform

The Platform gives Identity Owners the ability to validate and certify their personal information or documents using the services of real notaries.



Dock

Calling Dock Validators to Vote on the Mainnet Upgrade for the ETH Bridge Launch

Later this year, Dock will be launching an ETH bridge and mainnet upgrades on the Dock Blockchain are required to facilitate this while also enhancing security.


The Dock-Ethereum bridge will enable the following:

Users will be able to easily move Ethereum assets to Dock's low-cost, highly scalable platform and vice versa, and bridge any ERC-20 token to Dock.

Issuers and DApp builders on Dock will be able to create DIDs and anchor verifiable credentials on Dock and verify them on Ethereum.

It will open a new gateway for the enormous number of organizations and DAOs built on Ethereum to issue VCs and DIDs on Dock that will be verifiable on Ethereum.

The bridge won’t be possible without the upgrades.

What Do Validators Need to Do?

For the upgrade proposal to pass, at least ⅔ of validators must approve the new code before it is added to the mainnet, in order to avoid consensus issues. If it passes, it will take a few days for the upgrade execution to complete.

View the proposal here and the voting period will be from Feb. 4-18, 2023. Go here for detailed instructions and scripts to upgrade nodes as soon as possible.

Voting will be democratic and each network participant will be able to vote.

Connect With Us

If you have any questions about these upgrades, please connect with us on Discord.


Tuesday, 17. January 2023

KuppingerCole

Evolving Identity and Access Management for the Digital Era


Join Identity & Access Management experts from KuppingerCole Analysts and Broadcom as they discuss how business IT is changing, and the implications for IAM. They will define modern IAM and explain why and how IAM needs to change to support modern app development, regulatory compliance, and user satisfaction.

Martin Kuppinger, Principal Analyst at KuppingerCole Analysts, will look at the evolution of the concept of Identity Fabrics, its guiding principles, how to approach IAM investments, and how KuppingerCole expects Identity Fabrics to evolve over the next few years. Vadim Lander, Identity Security CTO & Distinguished Engineer in the Symantec Identity Security Group will explain how organizations can transition their IAM capabilities to support modern business IT environments without a radical rip and replace approach. He will also provide insights into how to make your IAM performant, scalable, extensible, manageable, and interoperable.




1Kosmos BlockID

What Is a Time-Based One-Time Password (TOTP)?


Modern identity management relies on multi-factor authentication to maintain account security above and beyond simple passwords. One-Time Passwords (OTPs) are a vital part of this effort.

What are Time-Based OTPs? TOTPs are one-time passwords that use synced clocks to generate and cycle through authentication tokens for added system security.

What Is a One-Time Password?

A One-Time Password (OTP) is a form of authentication in which the authentication system generates pseudo-randomized tokens, or strings of alphanumeric characters, to verify user identity. A common form of multi-factor authentication (MFA), one-time passwords fill the role of “ownership,” or proof that the user has access to a particular device or delivery method.

In an OTP scheme, the authentication system generates a secure one-time authentication token, often in the form of a string of numbers, to which the user has access. This token is unique to the transaction at hand and will only be known through the ownership of the delivery mechanism of the password (which can be both hardware- and software-based).

OTPs provide users with a secure form of MFA because they are generally easy to disseminate and use as a second authentication factor. Currently, several common forms of OTP delivery exist in enterprise and consumer markets.

These forms of token delivery include:

Email/SMS: By far the most common OTP method is the delivery of a token via SMS text or email. The authentication system uses the user's credentials to send an OTP to one of these services, requiring the user to input the OTP once received. This approach is generally more secure than single-factor authentication, with the caveat that it becomes useless if the user's device or email account is stolen.

Hardware Tokens: Some strict security schemes require users to carry physical tokens. These devices may resemble a badge, keychain fob, or USB key and generate keys synced to the conditions of the authentication server. However, they do not require Internet access and can only be compromised if the device is lost or stolen.

Software Authenticators: A common alternative to hardware keys are software authenticators that generate tokens. Common mobile apps like Microsoft Authenticator or Google Authenticator provide added security by eliminating the need to send OTPs over SMS or email.

Push Authentication: Modern OTP systems can use push notifications on a mobile phone to send a one-time password, or even bypass the need for an OTP entirely.

With all the possible ways to send an OTP, it’s still critical that these passwords are random and secure–if a hacker could guess an OTP, they are next to worthless. To avoid that issue, there are several ways that OTPs are generated:

Hash-Based: Hash-Based OTPs (HOTPs) are generated using a hashing function that typically combines a secret key with a "moving factor" to create pseudorandom tokens. This moving factor is pulled from the environment to make guessing OTPs, or their generation mechanism, impractical.

Generally, HOTPs can draw from information gathered during an initial transaction (i.e., the initial password-based authentication in an MFA verification request). Additionally, some hash-based systems will utilize "hash chains," or sequences of OTPs based on the values of previous one-time passwords.

Challenge-Response: Challenge-response systems use user input as a moving factor in OTP generation. This moving factor could be the password the user provides or the answer to a specific security question.

Time-Based: Time-Based OTPs (TOTPs) use system time and other factors to create and cycle through one-time passwords for additional security.

How Do Time-Based One-Time Passwords Work?

Time-Based OTPs are created using a synchronized timestamp as the moving factor: the current time is divided into fixed intervals (time steps), and each step is combined with the shared secret to derive a new OTP. The code is regenerated once per interval, so the actual OTP required for multi-factor authentication changes every interval.

The result is a continuing cycle of OTPs that does not rely on any specific user input. This generation cycle can continue unimpeded such that, at any given time, a user can read off a pseudorandom OTP for MFA purposes.
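To make the mechanism concrete, here is a minimal Python sketch of the standard HOTP/TOTP construction (RFC 4226 and RFC 6238). The base32 secret shown is only an illustrative placeholder, and a production system should use a vetted library rather than hand-rolled code; this is a sketch of the idea, not a reference implementation.

import base64
import hashlib
import hmac
import struct
import time

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    """RFC 4226-style HOTP: HMAC the moving counter with the shared secret,
    then apply dynamic truncation to produce a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation offset
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % (10 ** digits)).zfill(digits)

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238-style TOTP: the counter is simply the current time step,
    so a fresh code appears every `interval` seconds with no user input."""
    time_step = int(time.time()) // interval
    return hotp(secret_b32, time_step, digits)

if __name__ == "__main__":
    SECRET = "JBSWY3DPEHPK3PXP"   # illustrative base32 secret, not a real credential
    print("Current TOTP:", totp(SECRET))

Because the client and the server derive the same code from the same shared secret and the same clock, no token ever has to travel over SMS or email.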

The fluid nature of TOTPs provides several significant security benefits to enterprises and users by protecting against specific attacks, namely:

Brute-Force Password Attacks: Brute-force attacks are those that involve, as the name suggests, hammering authentication systems with password lists (usually dictionaries or lists of common or default passwords).

This is a general feature of most MFA systems, but OTPs circumvent even the possibility of password guessing by requiring access to a service account or device.

Phishing: Phishing attacks count on users giving up passwords via email or SMS so that hackers can gain access to important accounts. OTPs mitigate this possibility by requiring that hackers also have access to an OTP to authenticate.
In the case of TOTPs, this challenge is amplified by the fact that the hacker would need to own either the token-generation device or an end user's token authenticator (such as a software app) at the exact moment of verification to get the right token.

Replay Attacks: Replay attacks are when a third party intercepts credentials and attempts to insert them into an authentication process to spoof an identity, usually soon after the original transaction. Because TOTPs are time-sensitive, an intercepted token expires too quickly to be replayed successfully.

While TOTPs prevent several attacks, they are still vulnerable if a token generator (hardware or software) is stolen. For example, if the user has an OTP application on their phone and the phone is stolen, then the benefits of TOTPs are largely irrelevant.
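As a server-side counterpart, a sketch of verification illustrates both the replay and clock-skew points above: only codes for the current time step, plus a small drift window, are accepted, so an intercepted code becomes useless almost immediately. The hotp() helper is the one from the earlier sketch, and the one-step drift window is a common convention rather than part of the standard.

import hmac
import time

def verify_totp(secret_b32: str, submitted_code: str, interval: int = 30,
                digits: int = 6, drift_steps: int = 1) -> bool:
    """Accept the code for the current time step plus/minus `drift_steps`
    neighbouring steps to tolerate small clock skew; anything older is
    rejected, which is what blunts replay of intercepted codes."""
    current_step = int(time.time()) // interval
    candidates = range(current_step - drift_steps, current_step + drift_steps + 1)
    return any(
        hmac.compare_digest(hotp(secret_b32, step, digits), submitted_code)
        for step in candidates
    )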

What Are the Benefits of TOTP?

TOTP, and MFA writ large, provide significant business and security advantages over single-factor authentication. Additionally, a TOTP system can address significant issues with authentication security–namely, implementation, onboarding, and adoption.

These benefits include:

Cost Reduction: OTP systems are common and relatively easy to implement. Using TOTPs involves very little investment above and beyond typical MFA systems, and even third-party authentication and identity providers can offer robust MFA and OTP support for a straightforward monthly fee. All of these factors combine to reduce costs (in terms of both time and money) for advanced security.

User Friendly: OTP systems integrate well with mobile and networked technologies. Software TOTPs are readily available for tablets and smartphones, so your employees can easily install and use the technology without changing their normal computing habits. This reduces the complexity of onboarding and adoption – often serious issues in adopting proper security methods.

Scalable: OTPs, and TOTPs, are easy to scale. You only need an authentication server and delivery methods (apps, hardware tokens, etc.) to distribute. That's it.

Secure: TOTPs provide a critical layer of time-sensitive security that can thwart some of the most common and challenging threats facing enterprises today, specifically phishing and other social engineering attacks.

Trust 1Kosmos Authenticators in Your Identity Verification System

Identity management has gone almost entirely unchanged for 60 years. Even new forms of authentication have relied on slowly-evolving technologies that need to catch up with the times.

1Kosmos changes the authentication landscape by combining MFA (including using time-based OTPs) with passwordless authentication and decentralized identity management. The result? Ultra-secure enterprise systems that also reduce user roadblocks for adoption with intuitive UX design for quick and easy onboarding.

With 1Kosmos BlockID, you get the following benefits:

Identity-Based Authentication: We push biometrics and authentication into a new "who you are" paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.

Identity Proofing: BlockID verifies identity anywhere, anytime and on any device with over 99% accuracy.

Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.

Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, encrypts digital identities, and makes them accessible only to the user. The distributed properties ensure there are no databases to breach or honeypots for hackers to target.

Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.

Learn more about 1Kosmos Authenticators with our documentation.

The post What Is a Time-Based One-Time Password (TOTP)? appeared first on 1Kosmos.


KuppingerCole

Data Quality and Integration Solutions


by Martin Kuppinger

This report provides an overview of the market for Data Quality and Data Integration solutions that help in gathering, integrating, cleansing, improving, and enriching data across the complete range of data sources in your organization, for enabling use of that data, as well as enabling data governance and supporting data security initiatives. It provides you with a compass to help you to find the solution that best meets your needs. We examine the market segment, vendor service functionality, relative market share, and innovative approaches for Data Quality and Integration solutions.

Wider Team

What if your identity ecosystem caused pollution? 

A new paper describes harms arising from digital identity, and accountable responses for 2023.

Video of Nicky Hickman and Phil Wolff introducing the paper at the Trust Over IP All Hands meeting on 25 January 2023: 20 minutes of presentation and 20 minutes of Q&A.

Trust Over IP announced today their “Overcoming Human Harm Challenges in Digital Identity Ecosystems” paper for review. Wider Team’s Phil Wolff and Come To The Edge’s Nicky Hickman wrote it with a large team of contributors including Pyrou Chung of the East-West Management Institute.

Six cases track ways people have been hurt when digital identity ecosystems were abused, done poorly, or attacked.

People went hungry, broke, and even died because of digital identity failures.

Organizations wrote to the World Bank last year with 'grave concerns' about the Bank's ID4D program and its "potential for abuse and exploitation."

Internet Safety Labs lists more than seventy specific ways harms happen in their Digital Harms Dictionary. The sheer diversity of harms makes them harder to identify.

Many of these harms are preventable by standards bodies, developers, and makers of identity systems. Preventable by customers, the enterprises who configure, deploy, and use identity ecosystems. Preventable by public policy makers and the regulators who enforce them.

Despite human harms being preventable, few are doing anything about it. 

So what?

The coming backlash may devastate open standards efforts. The identity community risks fallout as public cases of outrageous harm pile up. We all rely on vigorous open standards bodies.

Backlash hurts identity businesses. When biometrics and AI hit their trust and ethics crises a few years ago, they faced regulatory interventions, stalled sales, and investor skepticism.

Blamestorms hurt identity ecosystem cohesion. Media outrage blames the closest business or agency. A hospital gets blamed for a vendor's patient data breach, for example. So shaming and blaming of any ecosystem member will tarnish the rest, and drive abandonment and defection from that ecosystem.

However…

Accountability mitigates backlash. The more we build accountability into ecosystem governance and technology, the smaller the backlash, hopefully. Accountability By Design could be our watchword.

Identity can use proven accountability tools and practices. Other technologies and industries dealt with their negative externalities. Drug tampering led to safety caps. Toxic food led to traceability through food supply chains. Racist AI led to ethical AI standards for training ML systems. Environmental pollution led to… well, we're still working on that. So, let's adapt effective ideas and practices.

Next? What can identity technologists, industry, regulators, and standards groups do now? The paper lists low-hanging fruit and suggests first steps.

Preventable side effects offend me.

My head hurts over these six identity harms case studies, and the many we didn’t include in the paper. I can’t count the number of aspirational manifestos and value statements by folks who work on self-sovereign identity. We frame identity as a human right, a civil right, a duty of good government, a way to empower the excluded, the poor, and the dispossessed. We champion our digital identity technologies and governance in the name of serving humanity, and business progress, and enabling society, and healing our planet. 

Despite our pure motives and hard work… 

These new, richer identity tools can be abused just as much as old ones. It is human nature that every identification technology will be abused. My cynical take on human nature is grounded in history.

Today’s widely used digital identity tools suffer from cultural blindness and dissonance. ToIP’s Asia Pacific Human Experience working group shared stories where Western models of personhood, encoded in nearly all digital identity systems, exclude and distort how identity works for millions of people. The paper tells more about this. 

We looked at identity systems that enabled predators to hound a child to suicide. To facilitate gambling addiction. To leave a refugee in limbo. All without a responsible person to blame. Without even a single company or policy or technical artifact to blame. 

These harms don’t arrive with accountability. 

Digital identity systems are fundamentally “ecosystems.” 

They are made from many parties, in a complex, diverse, fluid, multijurisdictional web. 

Like cities, identity ecosystems are built up over time, with legacies of cultural assumption and prejudices, of old choices and narrow contexts. 

And those of us who have worked with trust frameworks, who bring identity ecosystems into being or refine them, we have not held ourselves morally at fault when they do harm. No more than those who make cars or airplanes or firearms are responsible when those goods result in injury and death. 

You won’t trace those harms back to an open standard committee meeting or to an identity ecosystem rollout any more than you’d trace a spent bullet back to the factory operator that machined it. 

When chains of causality are hard to map, or are untraceable, the root causes of these harms are unaccountable. 

Inciting action, building in accountability, will be difficult.

The identity technology community is in denial about downstream risks. Few in the identity professions and industries believe these human problems exist.

Untested business cases. Nobody believes this matters enough, or matters to them personally, or matters urgently.

Anecdotes, not statistics. These threats are mostly uncounted and unmeasured; a weakness in quantified policy and management circles.

No ownership. Nobody owns these problems outright. Neither the toolmakers nor any of an ecosystem's parties own the side effects of a system's broken trust.

Scopes of responsibility are too narrow. Our definitions of "ecosystem" leave out many kinds of people affected by the ecosystem, and harms inflicted on them. Although governance practices include responsibility to other members of an ecosystem, they rarely name negative externalities or speak to accountability for them.

Ethics are unevenly distributed. The abilities to discover harms swiftly, make sense of them, and respond well are unequal within an identity ecosystem. They will continue to be. Those who are most ready and able to respond are unlikely to be the only parties generating those harms. And some ecosystem parties will trigger harms emitted by others. "Ethics are unevenly distributed" seems like a universal law that begs for corollaries.

IAM industry consolidation reduces individual companies' fear of side effects. Thoma Bravo buying ForgeRock, Ping, and SailPoint cuts the odds these issues make it to any of their executive OKRs.

This is blue ocean territory: I haven’t found anyone in our space investing to manage these business risks. No companies, government agencies, or NGOs are actively working on human harm reduction, response, or regulation. After my year on the Trust Over IP Human Experience Working Group’s Harms Task Force, I hope this paper helps folks understand the range of risks digital identity brings. And that we act now, in concert. 

https://trustoverip.org/wp-content/uploads/Overcoming-Human-Harm-Challenges-in-Digital-Identity-Ecosystems-V1.0-2022-11-16.pdf

Paper: Trust Over IP – Overcoming Human Harm Challenges in Digital Identity Ecosystems v1.0, 2022-11-16 (Download)

Wider Team are experts in decentralised identity, helping clients assess risks, identify opportunities and map a path to digital trust. For more information please connect on LinkedIn or drop us a line at hello@wider.team.


Finicity

Mastercard partners with upSWOT


Mastercard has partnered with upSWOT, a U.S.-based white-label embedded financial platform, to add data for small businesses on upSWOT’s platform.

With the addition of owner-permissioned data from Mastercard’s open banking platform, upSWOT now gives small and medium-sized businesses (SMBs) the ability to link financial data to 200 API-enabled apps. These include accounting, enterprise resource planning (ERP), payroll, ecommerce, Customer Relationship Management (CRM), marketing, and POS business applications.

With this partnership, Mastercard and upSWOT will be able to provide SMBs with a smooth and effective approach to run their operations.

Read more about this innovative partnership here.

The post Mastercard partners with upSWOT appeared first on Finicity.


Shyft Network

China Leads the Digital Currency Future

The People's Republic of China has been working on its CBDC project since 2014, with trials starting in April 2020. Within two years of trials, the country witnessed CBDC transactions of more than 100 billion Yuan. Experts believe the digital Yuan needs to grow in usability to become more popular.

In a discussion paper released by the Asian Development Bank in February 2022, it was stated in unambiguous terms that the People's Bank of China is developing a Central Bank Digital Currency, also called the digital Yuan (e-CNY).

By that time, the country had already witnessed the emergence of significant fintech players like the Ant Group and Tencent, which started with digital payments and eventually expanded into investment products and loans.

CBDC is one of the central topics of discussion during this year’s World Economic Forum in Davos. With more than 100 countries exploring CBDCs, it is vital to understand the motives and scope of their progress. Could CBDCs increase resilience towards global risks of high inflation, low growth, and the emergence of a high-debt economy, and what are the dynamics with regard to the crypto economy?

In this context, we take a closer look at China’s Digital Currency Project.

What is the genesis of the CBDC in China? How far has it come? What are some of the features that make China’s CBDC unique or special?

The Genesis of China’s CBDC

Four driving factors led the People’s Bank of China to consider introducing the digital Yuan (e-CNY), the digital version of the Yuan, also known as the digital RMB or digital Renminbi:

(i) Equip the population with a form of digital cash.

(ii) Make retail payment services efficient and safe while encouraging fair competition.

(iii) Make cross-border payments efficient.

(iv) Finally, improve financial inclusion, which the PBC believed a digital currency could advance and which is vital in a country as populous as China.

Whether the digital Yuan has lived up to these promises needs further scrutiny. Before that, let’s look at the timeline.

The Evolution of the Digital Yuan

The People’s Republic of China started researching CBDCs in 2014 under the Digital Currency/Electronic Payment (DC/EP) project. Development work began in 2017, the same year the Digital Currency Research Institute started operating with Yao Qian as its director.

Multiple distribution and circulation decisions followed in the subsequent years. In 2018, PBC Deputy Governor Fan Yifei announced that the digital Yuan would be centrally managed and distributed through a two-tier model.

PBC Governor Yi Gang announced that the digital Yuan would replace a portion of the country’s cash without affecting other components of the money supply, such as bank deposits and balances on private payment platforms.

The pilot program began in October 2020 in the four regions of Shenzhen, Suzhou, Chengdu, and Xi’an.

The success of the Digital Currency Pilot Program

The PBC gifted 10 million digital Yuan to 50,000 Shenzhen residents, who could spend it at more than 3,300 businesses. Within a week, those 50,000 people had spent 8.8 million digital Yuan across more than 62,000 transactions.


Did all these efforts lead people to adopt the CBDC? According to figures gathered up to 30 June 2021, Chinese users had opened nearly 21 million personal wallets and more than 3.5 million corporate wallets, and digital Yuan transactions had reached nearly 71 million, worth almost US$5 billion.

The Latest Numbers

It is still too early to judge whether the CBDC has taken off in China the way it should have. However, growth has been noticeable: by 31 August 2022, total digital Yuan transactions had reached 360 million and total transaction value had crossed 100 billion Yuan, close to US$15 billion. In other words, digital Yuan transaction value roughly tripled in a little more than a year.


Fifteen provinces and cities are now part of the digital Yuan pilot program, and around 6 million stores across the country accept digital Yuan payments. According to the government, the pilots ran without setbacks, covering application scenarios in wholesale, retail, catering, travel, education, medical care, and public services.

Bottlenecks and Concerns

A People’s Bank of China official recently described the growth in digital Yuan usage as "not ideal." Xie Ping, a former director-general of research at the bank, offered a few reasons why the CBDC has not been as successful as it could have been.

Xie believed that for digital currency to become successful, it must expand its usability. Having the digital Yuan only as a substitute for cash does not make it compelling enough for users to go beyond the traditional payment mechanisms of cash, bank cards, and third-party payment apps like Alipay.

Whether the Digital Yuan will go beyond being just a means of consumption is yet to be seen.

The Current Situation

Presently, cryptocurrencies, which are considered hard to control, are banned in China, while the country doubles down on its own digital currency.

Interestingly, NFTs, called digital collectibles within China, are seeing a surge in sales, recording almost $4.8 billion in 2022. The country is also launching a national NFT marketplace, which may contribute to further growth. This is a strong indicator that the Government of China prefers digital assets it can control.

Recent reports also indicate that the Asian powerhouse may be using Hong Kong as a testing ground, given the territory’s recent affinity for digital assets. If true, China may eventually roll back its crypto ban, but that is unlikely to happen anytime soon, as the country is pushing for higher usage of its digital Yuan.

______________________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more: https://www.shyft.network/veriscope, and contact our team for a discussion: https://www.shyft.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations, and sign up for our newsletter to keep up to date on all things crypto regulation.

China Leads the Digital Currency Future was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Actions Integrations Are Now GA

Drag-and-drop extensibility for your identity flow

Fission

Building Edge Apps with Webnative SDK


We're kicking the New Year off with the release of Webnative SDK version 0.35!

An Introduction to Webnative

Here at Fission, we build identity, data, and compute protocols that empower developers to build decentralized edge apps without a complex back-end. We're actively working on a decentralized database and virtual machine, and our solutions for decentralized identity and storage are available today in Webnative SDK. Our SDKs and protocols have been integrated into many edge apps, including Capyloon, web3.storage, and Noosphere.

The Webnative library provides:

- User accounts using the browser's Web Crypto API, or a blockchain wallet via the Webnative WalletAuth plugin
- Authorization using UCAN
- Encrypted file storage using the Webnative File System, backed by IPLD
- Key management using the AWAKE protocol to link devices
- Platform APIs for publishing apps from the browser

Let's break each of these down.

User Accounts

Developers can choose whether they would like the browser or a blockchain wallet to store a user's encryption and decryption keys. The best choice depends on the kind of user experience the developer wants their users to have. For example, if they are building an app for a blockchain audience, the blockchain wallet account option is likely the best choice.

Our team is also currently working on a third user account option: passkey support. Passkeys are secured by the device's operating system.

Authorization

UCAN stands for User Controlled Authorization Network. UCANs are a way of doing authorization (“what you can do”) where users are fully in control.
There’s no all-powerful authorization server, or server of any kind required! Everything that a user is allowed to do is captured directly in a key or token, and can be sent to anyone that knows how to interpret this format.
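To make this concrete, here is a rough TypeScript sketch of what a decoded UCAN payload contains. The field names (iss, aud, exp, att, prf) follow the UCAN specification, but the resource and capability strings are invented for illustration, and the exact capability encoding varies between UCAN versions.

```typescript
// Sketch of a decoded UCAN payload. A real UCAN travels as a signed JWT;
// field names follow the UCAN spec, values below are illustrative only.
interface UcanPayload {
  iss: string;                               // DID of the issuer (who delegates)
  aud: string;                               // DID of the audience (who receives)
  exp: number;                               // expiry, as a Unix timestamp
  att: Array<{ with: string; can: string }>; // capabilities: resource + action
  prf: string[];                             // proofs: parent UCANs in the chain
}

const delegation: UcanPayload = {
  iss: "did:key:zAliceExample",
  aud: "did:key:zAppExample",
  exp: 1767225600,
  att: [{ with: "wnfs://alice.example/private/photos/", can: "fs/write" }],
  prf: [],
};
```

Because everything needed to check the delegation is inside the token and its proof chain, any service that understands the format can verify it without calling back to a central authorization server.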

(Image: the UCAN mascot)

Encrypted File Storage

The Webnative library gives developers the tools to make their edge apps work seamlessly offline while keeping data encrypted at rest. We created an encrypted file storage system called WNFS, which is built on top of the IPFS protocol and backed by IPLD, making it interoperable across the Web.
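As a rough illustration of the developer experience, the sketch below writes and then reads back an encrypted note through WNFS. It assumes the patterns described in the Webnative 0.35 guide (wn.program with a namespace, session.fs, wn.path.file, and fs.write / read / publish); the namespace values and file path are invented, so check the guide for exact signatures before relying on it.

```typescript
import * as wn from "webnative";

// Hedged sketch, assuming the Webnative 0.35 API surface described in the guide.
// The namespace values and file path below are illustrative.
const program = await wn.program({
  namespace: { creator: "example-org", name: "notes-app" },
});

const fs = program.session?.fs;

if (fs !== undefined) {
  const notePath = wn.path.file("private", "notes", "hello.txt");

  // Content written under the private branch is encrypted at rest in WNFS.
  await fs.write(notePath, new TextEncoder().encode("Hello from WNFS"));
  await fs.publish(); // persist the update

  const bytes = await fs.read(notePath);
  console.log(new TextDecoder().decode(bytes)); // "Hello from WNFS"
}
```

Because the encrypted data is addressed over IPFS, the same content can be synced to and decrypted on the user's other linked devices.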

(Image: Winnie, the WNFS mascot)

Key Management

Device-linking is a breeze with the AWAKE protocol. AWAKE is an authenticated key exchange that provides a way to know something provable about what the other party can do, even if you have no sure way of knowing who that party is. To do this, the protocol leverages UCAN capabilities that don't require a root source of truth.

(Image: the AWAKE mascot)

Platform APIs

As well as using the Fission CLI to register and publish apps, developers can build web apps that publish apps directly from the browser.

What's New

Webnative version 0.35 is a major rewrite that improves the library's reliability and extensibility. Here are a few of the big changes:

Namespaced Apps

Webnative can now have many apps run on the same domain without any conflicts. Each app is namespaced by default and has its own accounts and WNFS. This should help developers when building apps on the same localhost port.
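A minimal sketch of what that looks like, again assuming the wn.program entry point and namespace option from the 0.35 guide (the creator and app names are made up): apps on the same origin declare different namespaces and therefore keep separate accounts and file systems.

```typescript
import * as wn from "webnative";

// Sketch: each app declares its own namespace, so apps sharing a domain
// (for example, the same localhost port) keep separate accounts and WNFS
// state. The creator/name values here are illustrative.
const program = await wn.program({
  namespace: { creator: "example-org", name: "notes" },
});

// Another app on the same origin would pass a different name, for example
// { creator: "example-org", name: "photos" }, and would not see this app's
// accounts or file system.
console.log(program.session ? "existing session restored" : "no session yet");
```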

Customizable + Extensible

A new component system makes it possible for Webnative to run independently of Fission infrastructure. For example, you can use the web3.storage IPFS platform to host your encrypted data. This system also enables the webnative-walletauth plugin.

Increased Reliability

We've made improvements to the device-linking feature so it works smoothly across multiple browsers.

Temporary Filesystems

Multiple filesystems can be loaded at the same time now that conflicts with identifiers have been resolved in the latest version. This allows users to access a temporary filesystem until they are ready to create their own account. When the user ends their session, the developer can give them the option to either store their files locally or create a new account.

Next Steps

Learn all about the latest version of Webnative SDK by reviewing the changelog and guide.

Developers with existing Webnative apps will need to make some changes in order for their apps to be compatible with the latest version. Details on what needs to be updated are available on the migration page.

Webnative App Template is up to date and ready to use with version 0.35, so now is a great time to take Webnative SDK out for a test drive.


Vancouver DWeb Social


We'll be discussing the Fediverse, dismantling the disinformation economy, and hanging out with our friends at Internet Archive Canada!

Join us and the Internet Archive Canada in Vancouver!

Event Information

​Did you know that the Internet Archive now has a Canadian edition? And that they have an amazing space in downtown Vancouver that supports everything from servers to socials?

​Join the Fission team & Internet Archive Canada at The Permanent for an evening of presentations, lightning talks, and the first of hopefully many sociotechnical conversations connecting the Decentralized Web in Vancouver and beyond.

​If you’re interested in tech, open source, decentralization, policy, or any similar sociotechnical topics, come join us!

Register for the event on Luma

Agenda

6pm

​Doors open. Presentations will start promptly, followed by Q&A.

6:30pm - An overview of Check My Ads, Claire Atkin

(Image: Check My Ads website banner)

Claire is the cofounder of Check My Ads, the adtech watchdog. Check My Ads is dismantling the disinformation economy via the 400 billion dollar digital advertising industry. They have taken millions of advertising dollars from Steve Bannon, Dan Bongino, Charlie Kirk, and other insurrectionists. This summer they launched a campaign to defund Fox News’s website for brand safety violations regarding election disinformation.

6:45pm - The Big World of the Fediverse, Blaine Cook

(Image: Mastodon, a decentralized social network)

Blaine was a founding engineer at Twitter, and at one time had code running that made it interoperable with other networks. Fast forward 15 years, and we're witnessing an explosion of interoperable, standards-based social networks and messaging platforms. Blaine Cook will give a tour of the wide world of the Fediverse as it is now, as well as some of the work he's doing at Fission.

7:30pm - Social

​We’ll wrap up with community announcements and then adjourn for refreshments and conversation.

Further Reading

Why A Twitter Founding Engineer is Now All-In On Mastodon by Richard MacManus (The New Stack)

Nelson’s Blaine Cook helped build Twitter – and he has a few ideas on what should come next by Tyler Harper (Nelson Star)

​Thanks to the Internet Archive Canada for being our venue sponsor.


Tokeny Solutions

Key Real-World Asset Tokenization Milestones and Predictions

January 2023

Key Real-World Asset Tokenization Milestones and Predictions by Institutions

Welcome to the first Tokeny Insight newsletter of 2023.

We predicted that 2022 would be a pivotal year for institutional adoption of tokenization, and it appears that our prediction has come to fruition. The past year has seen a significant increase in the number of institutions embracing tokenization, and we have compiled a non-exhaustive list of significant projects we saw on the market below.

The trend towards tokenization as a means of cost-efficient asset management is undeniable, as evidenced by a number of statements from prominent organizations, such as the following.

With tokenization becoming a must-have strategy for organizations, we understand that many of you are already in the process of planning and implementing your tokenization strategy. We’re here to support you every step of the way.

Launch Your Tokenization Project

Tokeny Spotlight

MENTION

Security Token Advisors featured us in their latest article

Read More

SPEAK

Tokeny to speak with Copper on "The Use of DLT in Financial Market Infrastructure"

Read More

Tokeny Events

GBBC’s Blockchain Central Davos

January 15-19, 2023 | 🇨🇭 Davos