Last Update 12:56 PM February 20, 2024 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Tuesday, 20. February 2024

SC Media - Identity and Access

Hundreds of credit unions exposed to CUSG CMS flaws

Up to 275 credit unions across the U.S. could have been compromised in account takeover and credential theft attacks due to critical vulnerabilities in the CU Solutions Group content management system that could be leveraged for "ultra admin" privileges, reports SecurityWeek.


Novel MMS Fingerprint attack used by NSO Group against WhatsApp

WhatsApp users have been targeted by Israeli spyware firm NSO Group through the new MMS Fingerprint attack, which involved the exploitation of a vulnerability in the widely used messaging app, Hackread reports.


CISA: State agency hacked via ex-employee's credentials

The Cybersecurity and Infrastructure Security Agency revealed that a U.S. state government agency had its network compromised due to a former employee's administrative credentials that had been obtained from previous data breaches, SecurityWeek reports.


Elliptic

Crypto regulatory affairs: US Treasury releases national financial crime risk assessments

On February 7, the US Department of the Treasury published three reports that offer a view into what the US government perceives as the biggest illicit finance risks facing the financial system.


IDnow

UK Fraud Awareness Report 2024

IDnow fraud survey reveals: 33% of Brits share sensitive ID documents unprotected online, with youth most at risk. 33% of Brits have shared scans or photos of an ID card, driving licence or passport via digital channels, such as social media or email, despite knowing that these ID documents could land in the wrong hands. The survey also revealed that three-quarters of Brits are most concerned about banking fraud, and that over half (54%) do not know what social engineering is, or the role it plays in fraud, with almost half not knowing what deepfakes are.

London, February 20, 2024 – A major survey into attitudes and knowledge around fraud in the UK has been unveiled, with findings indicating a lack of knowledge around key tactics used by fraudsters, leaving Brits vulnerable to this crime, which is seeing exponential growth.

Commissioned by IDnow, a leading identity verification platform provider, the YouGov survey of 2,264 people has uncovered that almost half (45%) of UK adults were aware that scans or photos of their ID documents could be obtained by criminals to be used to commit fraud – yet sent the documents via digital channels, such as email, social media and messenger apps anyway.

Such activity could lead to identity theft, which IDnow believes should be a concern to the UK public, especially given the rise in deepfake technology. Developments in generative artificial intelligence (AI) mean deepfake technology can now be used to create hyper-realistic fake documents, as well as videos. However, the survey found that less than a third (31%) of Britons know what deepfake documents are and are aware of the potential risks posed by digitally generated images of physical documents.

Lovro Persen, Director Document and Fraud at IDnow, commented: “Many of us have seen the uncanny deepfake videos of celebrities that spread like wildfire across the internet, showing how easy it is to emulate the likeness of someone using AI. But worryingly, this research suggests that the UK public is not as concerned, or aware as they should be, of the risks associated with such digitally generated images or videos.

“The extraordinary leaps in AI technology mean it’s now almost too easy for a fraudster to carry out financial crimes. Consumers shouldn’t make it even easier for fraudsters though. Our advice is always to think twice before sending a scan or photo of your driving licence or passport into the digital ether via unencrypted channels, such as social media or email.”

Interestingly, 48% of 18- to 24-year-olds surveyed have shared ID documents via such risky channels, compared with just 21% of over-55s, highlighting the potential need to better educate the younger generation on digital fraud threats.

Is fraud front of mind?

The survey also revealed that three-quarters of Brits are most concerned about banking fraud, when asked about the different areas of life where fraud could occur. An additional 37% of Brits are most concerned about fraud via social media channels.

With 54% of Brits unfamiliar with social engineering, encompassing deceptive tactics such as phishing or smishing, the majority of the population remains vulnerable to potential fraud attempts. Social engineering, one of the most prevalent and hard to catch fraud typologies, sees fraudsters manipulating trust or fear, putting consumers at risk of divulging sensitive information or falling prey to malicious links disguised as trustworthy messages.

In terms of the likelihood of being a victim of crime, a fifth (21%) of Brits believe they are most at risk of someone hacking their social media profile. In fact, social media was the primary security concern for those aged 18 to 24, with every other age group citing their main worry as someone accessing their bank account through identity fraud.

Hence, for accounts connected to larger sums or investments, three quarters of Brits (75%) would be willing to go through a lengthier online onboarding process, if this made it safer. Doug Pollock, Vice President Customer Success at IDnow, explained: “Our findings show that banks in the UK do not always go far enough to make their customers feel safe and secure. They need to go further in terms of fraud prevention technology to meet their customers’ risk appetite, especially when their money is at stake. Because, and our research confirms this, if banks get it wrong, the majority of people (54%) would consider moving banks were they to become a victim of fraud.

“We hope these findings highlight the massive impact online fraud continues to have on British people. Because fraudsters work across industries, regions and use cases, it’s vital we all work together – financial services, technology providers, government, law enforcement and the public – to identify and stop fraudsters before it’s too late.”

About this study

This release is based on data from an online survey conducted by YouGov Germany, in which 2,264 people in the UK took part between 6 and 7 December 2023. The results were weighted and are representative of the UK population aged 18 and over.

UK Fraud Awareness Report 2024: Learn more about the British public's awareness of fraud based on age and gender, what they think about fraud in the UK, and what types of fraud they believe pose the biggest threat.

KuppingerCole

Generative AI in Cybersecurity – It's a Matter of Trust

by Alexei Balaganski

The Era of Generative AI is upon us – there is no doubt about that. After over 50 years of academic research in artificial intelligence culminating in the development of neural networks and machine learning algorithms, and then another decade of building cybersecurity products based on those methods and trying to persuade customers to finally put their trust into them… The seemingly out-of-nowhere emergence of ChatGPT has managed to achieve that overnight.

Instead of a grumpy, suspicious crowd not willing to let AI automate their jobs, we now see a public craze that makes the Tulip Mania of the 17th century pale in comparison. Businesses are rushing to implement GenAI capabilities into all their processes and workflows. Vendors are struggling to come up with even more potential use cases to add generative capabilities to their products. But most importantly, literally everyone is happy to consume those capabilities again and again, disregarding potential risks and challenges (including being automated out of their jobs completely).

Even I couldn't resist asking DALL-E to create an illustration for the previous paragraph, and the result is quite impressive...

Discussing those risks for business applications of GenAI is a huge topic on its own. We already hear experts talking about lack of transparency, potential biases in training, compliance issues with leaking sensitive information to third parties, to say nothing about the massive costs of running LLM infrastructures. Still, the attempts of some organizations to outright ban ChatGPT usage on their premises have already been proven futile. Just like BYOD, this usage should be controlled by a combination of government-level regulation and organizations’ own acceptable use policies.

Generative AI for cybersecurity practitioners

Still, today we want to focus on a more specific question: does the introduction of GenAI capabilities fundamentally change cybersecurity? What practical new functionality do they offer to security experts? And what are the challenges of developing and using security tools that rely on modern GenAI models?

Oddly enough, the most obvious use case doesn’t even require any additional interfaces between you and ChatGPT: it’s continuous education, an absolute must for every cybersecurity practitioner. Having a virtual assistant that can instantly answer any question and free you from doing your own research is a very alluring prospect indeed. You just have to remember that ChatGPT is not smarter than you. In fact, it works much more like a room full of hyperactive monkeys with typewriters, and not everything they produce is necessarily on par with Shakespeare. When an LLM does not know something, it will happily make up a completely wrong but still plausible-looking answer, and it is entirely your responsibility to check its validity. Trust but verify!

Another related area is using LLMs to create materials for safe simulations of cybersecurity incidents. The idea itself is not at all new – tabletop exercises, incident response bootcamps and other kinds of training are a big part of every business continuity program – but Generative AI does not just reduce the cost of such exercises; it dramatically increases their realism, because real threat actors are already using very similar tools for their nefarious purposes. Again, keep in mind that the cost of an error here is not in your favor: a poorly crafted spear phishing mail will only leave a hacker with a bit of lost revenue, yet a poorly designed phishing awareness exercise might leave your entire company unprotected against multiple future attacks.

With the tremendous improvements LLMs are now making in generating not just natural language, but application source code as well, the idea of making new software “secure by design” by replacing human developers (that make so many coding mistakes and are impossible to teach to follow security practices!) with code generated directly by an AI model is getting more and more traction. In a similar way, LLMs can be used to create more sophisticated authorization policies, generate synthetic data for security testing, etc. In a slightly less radical approach, LLMs would not outright replace humans but serve as an additional filter in an existing CI/CD pipeline to improve its coverage and efficiency.

To be honest, looking at the results produced by current-generation tools, I’m somewhat skeptical about AI completely replacing human developers anytime soon, but the situation might change quickly. In any case, however, there is absolutely no reason to treat the code generated by AI as inherently error-free. You’ll still need to keep unit tests, static and dynamic code analysis solutions, and a lot of other security tools in your development pipeline. Can you delegate all these activities to LLMs? Perhaps, but should you? In the end, it is still someone’s liability, and you cannot put that on AI…

But wait, someone reading this might say, what about arguably the most interesting application of Generative AI – specialized security-trained LLMs built into solutions like Microsoft Security Copilot? Aren’t they the real future of cybersecurity?

Well, I cannot argue with that… to a degree. SIEMs, XDRs, and other security analytics solutions have been relying on various AI technologies for years, and the addition of GenAI does make them better in many ways. Providing better insights by cutting through the noise, helping to make critical incident response decisions faster, improving security analysts’ productivity – all these capabilities are great improvements, but they are not what makes a security tool reliable and scalable. Microsoft’s Copilot would never be so useful without the company’s existing vast telemetry network and threat intelligence database or their own cloud infrastructure.

Even Charles Babbage, the inventor of the first programmable computer, already understood the principle that later became known as “Garbage in, garbage out” – not even the most sophisticated machine can make right decisions based on incomplete, flawed, or biased input data. For Generative AI, this applies to a much larger extent than anywhere else in IT. Perhaps, when you’re choosing the next best cybersecurity tool for your organization, looking at its GenAI-powered bells and whistles should not be on the top of your list of priorities.

Tapping into the Wisdom of the Crowd

In the end, we cannot deny the fact that Generative AI is indeed a game-changer in almost every industry, including cybersecurity. And yet it is critical for everyone to understand that GenAI does not do magic. It is just a tool - an extremely sophisticated, sometimes quite delicate, and very expensive one. A crucial part of integrating these tools into your security strategy is to clearly understand their capabilities and limitations (which tend to change literally on a weekly basis). Even more important is to be aware of how both the trailblazers and ordinary peers within your industry are using them to capitalize on their experience and avoid their mistakes.

And of course, there is no better platform to meet those people than the upcoming EIC 2024, Europe’s prime conference on Digital ID, Security, Privacy and Governance in an AI-driven world, which will take place this June in Berlin, Germany. I hope to see you there as well!


Cybersecurity & IAM: 2023 in Numbers

by Marina Iantorno

Significant advancements happened in identity and access management (IAM) and cybersecurity in 2023. For most organizations, it has become more and more important to strengthen cybersecurity and optimize IAM procedures as they continue to traverse the challenges of digital transformation. This report explores the major figures and trends of 2023 compiled from our polling data and provides insights into the present state and potential futures of IAM and cybersecurity.

Monday, 19. February 2024

IdRamp

Ping Identity Partner Profile

Ping Identity delivers intelligent identity solutions for the enterprise. We enable companies to achieve Zero Trust identity-defined security and more personalized, streamlined user experiences.

IBM Blockchain

Unveiling the transformative AI technology behind watsonx Orders

You’re headed to your favorite drive-thru to grab fries and a cheeseburger. It’s a simple order and as you pull in you notice there isn’t much of a line. What could possibly go wrong? Plenty.

The restaurant is near a busy freeway with roaring traffic noise and airplanes fly low overhead as they approach the nearby airport. It’s windy. The stereo is blasting in the car behind you and the customer in the next lane is trying to order at the same time as you. The cacophony would challenge even the most experienced human order taker.

With IBM® watsonx™ Orders, we have created an AI-powered voice agent to take drive-thru orders without human intervention. The product uses bleeding edge technology to isolate and understand the human voice in noisy conditions while simultaneously supporting a natural, free-flowing conversation between the customer placing the order and the voice agent.

Watsonx Orders understands speech and delivers orders

IBM watsonx Orders begins the process when it detects a vehicle pulling up to the speaker post. It greets customers and asks what they’d like to order. It then listens to process incoming audio and isolates the human voice. From that, it detects the order and the items, then shows the customer what it heard on the digital menu board. If the customer says everything looks right, watsonx Orders sends the order to the point of sale and the kitchen. Finally, the kitchen prepares the food. The full ordering process is shown in the figure below:

There are three parts to understanding a customer order. The first part is isolating the human voice and ignoring conflicting environmental sounds. The second part is then understanding speech, including the complexity of accents, colloquialisms, emotions and misstatements. Finally, the third part is translating speech data into an action that reflects customer intent.

Isolating the human voice

When you call your bank or utilities company, a voice agent chatbot probably answers the call first to ask why you’re calling. That chatbot is expecting relatively quiet audio from a phone with little to no background noise.

In the drive-thru, there will always be background noise. No matter how good the audio hardware is, human voices can be drowned out by loud noises, such as a passing train horn.

As watsonx Orders captures audio in real time, it uses machine-learning techniques to perform digital noise and echo cancellation. It ignores noises from wind, rain, highway traffic and airports. Other noise challenges include unexpected background noise and cross-talk, where people are talking in the background during an order.  Watsonx Orders uses advanced techniques to minimize these disruptions.
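IBM does not publish the internals of these models, but a classical baseline for this kind of noise suppression is spectral subtraction. Here is a minimal Python sketch, assuming mono PCM audio and that the first few frames are speech-free background noise; it is an illustration of the general idea, not the watsonx Orders pipeline:

```python
import numpy as np

def spectral_subtract(audio: np.ndarray, frame_len: int = 512,
                      noise_frames: int = 10) -> np.ndarray:
    """Crude spectral subtraction: estimate the noise floor from the first
    few frames (assumed speech-free) and subtract it from every frame."""
    hop = frame_len // 2
    window = np.hanning(frame_len)
    frames = [audio[i:i + frame_len] * window
              for i in range(0, len(audio) - frame_len, hop)]
    spectra = np.array([np.fft.rfft(f) for f in frames])
    mags, phases = np.abs(spectra), np.angle(spectra)

    noise_floor = mags[:noise_frames].mean(axis=0)  # average noise magnitude
    cleaned = np.maximum(mags - noise_floor, 0.0)   # subtract, clip at zero

    # Resynthesize with overlap-add, reusing the original phase.
    out = np.zeros(len(audio))
    for i, frame in enumerate(np.fft.irfft(cleaned * np.exp(1j * phases))):
        out[i * hop:i * hop + frame_len] += frame
    return out
```

Production systems replace the fixed noise-floor estimate with learned, adaptive models, which is what lets them cope with sudden sounds like a passing train horn.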

Understanding speech

Most voice chatbots began as text chatbots. Traditional voice agents first turn spoken words into written text, then they analyze the written sentence to figure out what the speaker wants.

This is computationally slow and wasteful. Instead of first trying to transcribe sounds into words and sentences, watsonx Orders turns speech into phonemes (the smallest units of sound in speech that convey a distinct meaning). For example, when you say “shake,” watsonx Orders parses that word into “sh,” “ay” and hard “k.” Converting speech into phonemes, instead of full English text, also increases accuracy over different accents and actively supports a real-time conversation flow by reducing intra-dialog latency.
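As a toy illustration of that phoneme representation – not IBM's actual model, which learns an acoustic-to-phoneme mapping rather than looking words up – a hand-written ARPAbet-style lexicon might look like this:

```python
# Hypothetical phoneme lexicon for illustration only; a real system maps
# audio features directly to phoneme sequences with a neural network.
LEXICON = {
    "shake":  ["SH", "EY", "K"],
    "burger": ["B", "ER", "G", "ER"],
    "fries":  ["F", "R", "AY", "Z"],
}

def to_phonemes(utterance: str) -> list[str]:
    """Map each known word to its phoneme sequence."""
    phonemes = []
    for word in utterance.lower().split():
        phonemes.extend(LEXICON.get(word, ["<UNK>"]))
    return phonemes

print(to_phonemes("shake"))  # ['SH', 'EY', 'K']
```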

Translating understanding into action

Next, watsonx Orders identifies intent, such as “I want” or “cancel that.” It then identifies the items that pertain to the commands like “cheeseburger” or “apple pie.”

There are several machine learning techniques for intent recognition. The latest technique uses foundation and large language models, which theoretically can understand any question and respond with an appropriate answer. This is too slow and computationally expensive for hardware-constrained use cases. While it might be impressive for a drive-thru voice agent to answer, “Why is the sky blue?”, it would slow the drive-thru, frustrating the people in line and decreasing revenue.

Watsonx Orders uses a highly specific model that is optimized to understand the hundreds of millions of ways that you can order a cheeseburger, such as “No onions, light on the special sauce, or extra tomatoes.” The model also allows customers to modify the menu mid-order: “Actually, no tomatoes on that burger.”
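A minimal sketch of the intent-plus-items idea, with hypothetical hand-written rules standing in for the trained model watsonx Orders actually uses:

```python
import re

# Toy rule-based stand-in for a trained intent/entity model.
INTENTS = {r"\b(i want|i'd like|add)\b": "ADD_ITEM",
           r"\b(no|cancel|remove|actually)\b": "MODIFY_ITEM"}
ITEMS = ["cheeseburger", "apple pie", "shake"]
MODIFIERS = ["no onions", "light on the special sauce",
             "extra tomatoes", "no tomatoes"]

def parse(utterance: str) -> dict:
    """Classify the intent, then pull out menu items and modifiers."""
    text = utterance.lower()
    intent = next((tag for pat, tag in INTENTS.items()
                   if re.search(pat, text)), "UNKNOWN")
    return {"intent": intent,
            "items": [i for i in ITEMS if i in text],
            "modifiers": [m for m in MODIFIERS if m in text]}

print(parse("I want a cheeseburger, no onions"))
# {'intent': 'ADD_ITEM', 'items': ['cheeseburger'], 'modifiers': ['no onions']}
```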

In production, watsonx Orders can complete more than 90% of orders by itself without any human intervention. It’s worth noting that other vendors in this space use contact centers with human operators to take over when the AI agent gets stuck and they count the interaction as “automated.” By our IBM watsonx Orders standards, “automated” means handling an order end-to-end without any humans involved.

Real-world implementation drives profits

During peak times, watsonx Orders can handle more than 150 cars per hour in a dual-lane restaurant, which is better than most human order takers. More cars per hour means more revenue and profit, so our engineering and modeling approaches are constantly optimizing for this metric.

Watsonx Orders has taken 60 million real-world orders in dozens of restaurants, even with challenging noise, cross-talk and order complexity. We built the platform to easily adapt to new menus, restaurant technology stacks and centralized menu management systems in hopes that we can work with every quick-serve restaurant chain across the globe.

Keep your restaurant running smoothly with AI that handles the toughest orders


What are breach and attack simulations?

Breach and attack simulation (BAS) provides organizations with continuous offensive security testing. See how it works and why it's important.

Breach and Attack Simulation (BAS) is an automated and continuous software-based approach to offensive security. Similar to other forms of security validation such as red teaming and penetration testing, BAS complements more traditional security tools by simulating cyberattacks to test security controls and provide actionable insights.

Like a red team exercise, breach and attack simulations use the real-world attack tactics, techniques, and procedures (TTPs) employed by hackers to proactively identify and mitigate security vulnerabilities before they can be exploited by actual threat actors. However, unlike red teaming and pen testing, BAS tools are fully automated and can provide more comprehensive results with fewer resources in the time between more hands-on security tests. Providers such as SafeBreach, XM Cyber, and Cymulate offer cloud-based solutions that allow for the easy integration of BAS tools without implementing any new hardware.

As a security control validation tool, BAS solutions help organizations gain a better understanding of their security gaps, as well as provide valuable guidance for prioritized remediation.

Breach and attack simulation helps security teams to:

Mitigate potential cyber risk: Provides early warning for possible internal or external threats, empowering security teams to prioritize remediation efforts before experiencing any critical data exfiltration, loss of access, or similar adverse outcomes.

Minimize the likelihood of successful cyberattacks: In a constantly shifting threat landscape, automation increases resiliency through continuous testing.

How does breach and attack simulation work?

BAS solutions replicate many different types of attack paths, attack vectors and attack scenarios. Based on the real-world TTPs used by threat actors, as outlined in the threat intelligence found in the MITRE ATT&CK and Cyber Kill Chain frameworks, BAS solutions can simulate:

Network and infiltration attacks
Lateral movement
Phishing
Endpoint and gateway attacks
Malware attacks
Ransomware attacks

Regardless of the type of attack, BAS platforms simulate, assess and validate the most current attack techniques used by advanced persistent threats (APTs) and other malicious entities along the entire attack path. Once an attack is completed, a BAS platform will then provide a detailed report including a prioritized list of remediation steps should any critical vulnerabilities be discovered.

The BAS process begins with the selection of a specific attack scenario from a customizable dashboard. Besides running many types of known attack patterns derived from emerging threats or custom-defined situations, BAS tools can also perform attack simulations based on the strategies of known APT groups, whose methods may vary depending on an organization’s given industry.

After an attack scenario is initiated, BAS tools deploy virtual agents within an organization’s network. These agents attempt to breach protected systems and move laterally to access critical assets or sensitive data. Unlike traditional penetration testing or red teaming, BAS programs can use credentials and internal system knowledge that attackers may not have. In this way, BAS software can simulate both outsider and insider attacks in a process that is similar to purple teaming.
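Scenario formats differ across commercial BAS platforms and none are reproduced here, so the following Python sketch only models the loop described above – pick a scenario, run simulated steps against controls, report the gaps – with hypothetical step descriptions keyed to real MITRE ATT&CK technique IDs:

```python
from dataclasses import dataclass, field

@dataclass
class SimulatedStep:
    technique_id: str        # MITRE ATT&CK ID, e.g. "T1566" (Phishing)
    description: str
    blocked: bool = False    # set by the security control being validated

@dataclass
class AttackScenario:
    name: str
    steps: list[SimulatedStep] = field(default_factory=list)

    def run(self) -> dict:
        """Tally which simulated steps the controls failed to block."""
        gaps = [s for s in self.steps if not s.blocked]
        return {"scenario": self.name,
                "total": len(self.steps),
                "blocked": len(self.steps) - len(gaps),
                "remediate_first": [s.technique_id for s in gaps]}

scenario = AttackScenario("APT-style intrusion", [
    SimulatedStep("T1566", "Spear-phishing email", blocked=True),
    SimulatedStep("T1021", "Lateral movement via remote services"),
    SimulatedStep("T1041", "Exfiltration over C2 channel"),
])
print(scenario.run())
# {'scenario': 'APT-style intrusion', 'total': 3, 'blocked': 1,
#  'remediate_first': ['T1021', 'T1041']}
```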

After completing a simulation, the BAS platform generates a comprehensive vulnerability report validating the efficacy of various security controls from firewalls to endpoint security, including:

Network security controls
Endpoint detection and response (EDR)
Email security controls
Access control measures
Vulnerability management policies
Data security controls
Incident response controls

What are the benefits of breach and attack simulation?

While not intended to replace other cybersecurity protocols, BAS solutions can significantly improve an organization’s security posture. According to a Gartner research report, BAS can help security teams uncover up to 30-50% more vulnerabilities compared to traditional vulnerability assessment tools. The main benefits of breach and attack simulation are:

Automation: As the persistent threat of cyberattacks grows year over year, security teams are under constant pressure to operate at increased levels of efficiency. BAS solutions can run continuous testing 24 hours a day, 7 days a week, 365 days a year, without the need for any additional staff either on premises or offsite. BAS can also be used to run on-demand tests, as well as provide feedback in real time.

Accuracy: For any security team, especially ones with limited resources, accurate reporting is crucial for efficient resource allocation: time spent investigating non-critical or falsely identified security incidents is wasted time. According to a study by the Ponemon Institute, organizations using advanced threat detection tools such as BAS experienced a 37% reduction in false positive alerts.

Actionable insights: As a security control validation tool, BAS solutions can produce valuable insights highlighting specific vulnerabilities and misconfigurations, as well as contextual mitigation recommendations tailored to an organization’s existing infrastructure. Additionally, data-driven prioritization helps SOC teams address their most critical vulnerabilities first.

Improved detection and response: Built on APT knowledge bases like MITRE ATT&CK and the Cyber Kill Chain, and integrating well with other security technologies (e.g., SIEM, SOAR), BAS tools can contribute to significantly improved detection and response rates for cybersecurity incidents. A study by the Enterprise Strategy Group (ESG) found that 68% of organizations using BAS and SOAR together experienced improved incident response times. Gartner predicts that by 2025, organizations using SOAR and BAS together will experience a 50% reduction in the time it takes to detect and respond to incidents.

Breach and attack simulation and attack surface management

While integrating well with many different types of security tools, industry data indicates a growing trend toward integrating breach and attack simulation and attack surface management (ASM) tools in the near future. As Security and Trust Research Director of the International Data Corporation, Michelle Abraham said, “Attack surface management and breach and attack simulation allow security defenders to be more proactive in managing risk.”

Whereas vulnerability management and vulnerability scanning tools assess an organization from within, attack surface management is the continuous discovery, analysis, remediation and monitoring of the cybersecurity vulnerabilities and potential attack vectors that make up an organization’s attack surface. Similar to other attack simulation tools, ASM assumes the perspective of an outside attacker and assesses an organization’s outward-facing presence.

Accelerating trends toward increased cloud computing, IoT devices, and shadow IT (i.e., the unsanctioned use of unsecured devices) all increase an organization’s potential cyber exposure. ASM solutions scan these attack vectors for potential vulnerabilities, while BAS solutions incorporate that data to better perform attack simulations and security testing to determine the effectiveness of security controls in place.

The overall result is a much clearer understanding of an organization’s defenses, from internal employee awareness to sophisticated cloud security concerns. When knowing is more than half the battle, this critical insight is invaluable for organizations seeking to fortify their security.

Explore the IBM QRadar Suite


Reducing defects and downtime with AI-enabled automated inspections

IBM harnesses the power of data and AI to drive real-time, predictive business insights to help clients make intelligent decisions.

A large, multinational automobile manufacturer responsible for producing millions of vehicles annually, engaged with IBM to streamline their manufacturing processes with seamless, automated inspections driven by real-time data and artificial intelligence (AI).

As an automobile manufacturer, our client has an inherent duty to provide high-quality products. Ideally, they need to discover and fix any defects well before the automobile reaches the consumer. These defects are often expensive, difficult to identify and present a myriad of significant risks to customer satisfaction.

Quality control and early defect detection are paramount to uphold standards, enhance operational efficiency, reduce costs and deliver vehicles that meet or exceed customer expectations while safeguarding the reputation of the manufacturer.

How IBM helped the client better detect and correct defects during manufacturing

IBM worked with the client’s technical experts to deploy IBM® Inspection Suite solutions to help them reduce defects and downtime while enabling quick action and issue resolution. The solutions deployed include fixed-mounted inspections (IBM® Maximo® Visual Inspection Mobile) and handheld inspections (IBM® Inspector Portable). Hands-free wearable inspections (IBM® Inspector Wearable) were also made available for situations that required a head-mounted display.

While computer vision for quality has existed in more primitive states for the last 30 years, the lightweight and portable nature of IBM’s solution, which is based on a standard iPhone and uses readily available hardware, really got our client’s attention. The client loved the fact that the solution can be used anywhere, at any time, by any of their employees—even while objects are in motion.

Scaling to 30 million inspections for an immediate and significant reduction in defects

The IBM Inspection Suite improved the client’s quality inspection process without requiring coding. The client found the system to be simple to train and deploy, without the need for data scientists. The system learned quickly from images of acceptable and defective work products, which enabled the solution to be up and running within a matter of weeks. The implementation costs were also lower than those of viable alternatives.

The ability to deliver AI-enabled automation by using an intuitive process in their plants allowed this client to scale this user-friendly technology rapidly across numerous other facilities where it aided in over 30 million inspections. The customer almost immediately realized measurable success due to the significant reduction in defects.

Voted on by the leaders of the client’s technical community, the client awarded their annual IT Innovation award to IBM for the technology they believed delivered the greatest value-driving innovation to their company. In presenting the award, the client’s executives declared that a discussion with IBM about transformation led to a focus on improving manufacturing quality with AI automation.

The Inspection Suite supported the client’s quality initiatives with in-station process control and quality remediation at the point of assembly or installation. The solution also provided continuous process improvement that is helping the client lower repair and warranty costs, while improving their customer satisfaction.

Transparency and trust in AI

By bringing the power of IBM’s deep AI capabilities, deployable on cost-effective edge infrastructure, across the client’s plants, IBM Inspection Suite enabled the client to deliver higher quality vehicles to their customers. The client is now expanding to additional plants and use cases thanks to their collaboration and innovation with IBM.

All the team members at IBM were honored that this client recognized them for their business and technical achievements. We believe that this recognition reflects the IBM values of client dedication and innovation that matters. It is a direct acknowledgment of the value IBM Inspection Suite provides to clients.

IBM’s mission is to harness the power of data and AI to drive real-time, predictive business insights that help clients make intelligent decisions.

Empower your subject matter experts with automated visual inspection


Unlocking financial benefits through data monetization

Data monetization empowers organizations to use their data assets and artificial intelligence (AI) capabilities to create tangible economic value. This value exchange system uses data products to enhance business performance, gain a competitive advantage, and address industry challenges in response to market demand.

Financial benefits include increased revenue through the creation of adjacent industry business models, accessing new markets to establish more revenue streams, and growing existing revenue. Cost optimization can be achieved through a combination of productivity enhancements, infrastructure savings and reductions in operating expenses.

In 2023, the global data monetization market was valued at USD 3.5 billion, and experts project it to reach USD 14.4 billion by 2032, demonstrating a compound annual growth rate of 16.6% from 2024 to 2032.

Treating data as a strategic asset

Data is one of the most valuable intangible assets for organizations. Therefore, adopting a holistic approach that prioritizes data-driven business transformation helps optimize value extraction. This transformation harnesses the power of data within the organization, enabling enterprise-wide cost optimization and unlocking net new direct revenue opportunities.

When it comes to data optimization, most organizations focus solely on infrastructure cost reduction. However, those that embrace data-driven business transformation strategies can multiply the benefits by considering revenue growth potential, optimizing costs across infrastructure, development, and maintenance, and enhancing data security and compliance.

Figure 1: Data-driven business transformation

Critical aspects of data-driven business transformation are the overall data monetization strategy and how data products are used. Data insight and AI automation drive cost optimization with predictive maintenance, process automation and workforce optimization. AI automation substantially reduces data security and compliance risks by proactively identifying and analyzing the severity, scope and root cause of threats before they impact the business.

The net effect of data-driven business transformation is increased compliance, productivity and effectiveness via automation across different business units, such as sales, marketing and services. This leads to revenue uplift through opportunities to create new services and channels.

Identifying data products

Industries across the board are experiencing a surge in enterprise data volume, presenting both challenges and opportunities. These challenges, along with specific industry needs and use cases, influence the types of data products organizations or markets require.

Data products are assets developed from a company’s internal data sources or by combining internal and public data, augmented with AI to extract unique insights that help drive business decisions. Managed as products, these data assets come with defined service contracts, repeatable delivery methods and a clear value proposition.

Figure 2: The data product lifecycle

The banking industry, for example, faces the following challenges:

Competition from agile and innovative financial technology and challenger banks.
High degree of regulatory control.
Need to protect sensitive information.
Organizational data silos that impede a unified customer experience.
Pressure to increase margins and identify new revenue streams.

To address these challenges, organizations create relevant use cases that address their specific needs, as well as the needs of the market at large. The following sample use cases show associated data products and corresponding financial benefits.

Use case 1: Improve lending decision-making to reduce risk.
Data product: Economic climate risk analysis.
Financial benefits: Improved market share predictability and revenue growth; reduced costs through risk mitigation.

Use case 2: Drive behavior-based recommendations and personalization.
Data product: Customer behavior insights.
Financial benefits: Enhanced understanding of customer preferences; increased revenue growth through personalized product offerings; improved user experience.

Use case 3: Develop customer service strategies based on comprehensive customer data.
Data product: Unified view of customer economic data.
Financial benefits: Increased customer lifetime value through tailored services; reusable, integrated data across organizational silos.

Data products can be created for internal use across various functions or business units. When an organization shares its data internally and consistently to improve efficiency and achieve qualitative or quantitative benefits, it is referred to as internal data monetization.

Data products can also be created for wider external consumption across multiple organizations and ecosystems. When data is shared externally to achieve strategic and financial benefits, it is referred to as external data monetization.

AI-driven data platform economics

An AI-driven organization is one where AI technology is fundamental to both value creation and value capture within the business model. A data monetization capability built on platform economics can reach its maximum potential when data is recognized as a product that is either built or powered by AI.

Figure 3: Data platform economics

In the collection-led model, data from external and internal sources, such as data warehouses and data stores, is fed into analytical tools for enterprise-wide consumption. At the enterprise level, business units identify the data they need from source systems and create data sets tailored exclusively to their specific solutions. This leads to a proliferation of organizational data and added pipeline complexity, which can pose challenges in upkeep and use for new solutions, directly affecting costs and timeliness.

As enterprises shift from collection-led to product-led models, data products are created by using external and internal data sources, along with analytical tools. Once developed, these data products can be made available to business units within the organization for real-time data sharing and analytics. Also, these data products offer opportunities for monetization through ecosystem partnerships.

In a platform-driven approach, business units build solutions by using standardized data products and combining technologies to reduce work, simplify the enterprise data architecture and decrease time to value.

The data platform offers data-enriched data products that use machine learning, deep learning and generative AI. Those AI-driven data products can virtualize and integrate disparate data sources to create domain-specific AI models using proprietary enterprise data. Data platform services enable data products to be provided as SaaS services, deployed as a single data mesh across the hybrid cloud, and delivered in an authenticated, secure and audited way.

When organizations connect their valuable data and AI assets to wider user groups, they can use the multiplier effect from the consumption and evolution of data products, as well as the market reach from scalable cloud distribution.

The economic impact of data monetization

Organizations usually develop a business case spanning 3 to 5 years to gain a comprehensive view of short-, mid- and long-term economic benefits. Successful cases address market demands to remain competitive, foster scalability, and constantly pursue cost optimization and revenue enhancement opportunities.

Figure 4: Economic impact of data monetization

The graph above shows the incremental revenue potential from data monetization over a 5-year period. In an example organization with USD 2 billion in revenue, the baseline revenue from data is USD 5 million (0.25% of the overall revenue). If the organization follows the traditional approach, revenue from data might grow by 10% year-on-year, from USD 5 million to USD 6.7 million in three years, just 1.34 times the baseline revenue.

In contrast, data monetization can act as a force multiplier and contribute to upwards of a 1% increase in a company’s revenue. With data monetization capabilities, revenue from data could potentially grow from USD 5 million to USD 20 million in 3 years, representing a fourfold increase compared to the baseline revenue.
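As a quick check of the arithmetic in this example (figures in USD millions):

```python
baseline = 5.0  # USD millions: 0.25% of a USD 2B business

# Traditional approach: 10% year-on-year growth for three years.
traditional = baseline * 1.10 ** 3
print(f"{traditional:.2f}M, {traditional / baseline:.2f}x baseline")
# 6.66M, 1.33x baseline (the article's 6.7M and 1.34x after rounding)

# Data-monetization approach: USD 20M after three years, per the example.
monetized = 20.0
print(f"{monetized / baseline:.0f}x baseline")  # 4x baseline
```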

According to recent economic impact reports, the cost of building a data monetization capability is less than the baseline revenue from data. Therefore, an organization might allocate a portion of its existing data revenue in the first year to build a data monetization capability.

Getting started with data monetization

Organizations can start by defining their data monetization strategy and identifying the data products. Then, they can create their data monetization capability by developing an integrated AI-driven data platform. IBM Cloud Pak® for Data, IBM Cloud Pak® for Integration, IBM® watsonx.data™ and IBM® watsonx.ai™ provide them with that holistic platform.

We recommend a discovery workshop where you’ll explore your data and AI ambitions to determine your first data product. In a 4 to 6-week sprint, we’ll collaborate to craft a vision for your platform architecture and develop a proof of concept for the first data product design. This comprehensive process includes the development of the initial data product, the creation of a roadmap for future products, and the establishment of a supporting business case.

Explore AI-driven data platform architecture


Elliptic

Inside The Crypto Launderers: How the private and public sectors have innovated to combat money laundering in crypto

Over the past decade, illicit actors have devised numerous techniques in the attempt to abuse cryptoassets.


Shyft Network

User Signing: A Step-by-Step Guide on Verifying Ethereum Non-Custodial Wallet Ownership

Last Friday, we published a blog post that explained the user signing process on the Bitcoin network using Trezor and Ledger hardware wallets. Today, we’re going to delve into the user signing process on the Ethereum network with the Trezor and Ledger hardware wallets.

User Signing on the Ethereum Network With Ledger

Step 1: Select the Type of Address You Want to Sign

First, we have to choose the type of cryptocurrency address to sign with, which is the Ethereum network in our case.

Step 2: Connect Your Wallet

Next, choose one of the four wallet options: MetaMask, WalletConnect, Ledger, and Trezor.

If you don’t have a wallet account, tap on the “I don’t have a wallet” option on the left side of the screen. This will take you to the Ethereum Foundation’s list of supported wallets.

For this section, we chose Ledger.

Step 3: Connect Your Ledger Live

Now, you have to connect with Ledger Live, which you can do either through Ledger Live Mobile or the Ledger Live Desktop.

For this guide, we will do it through the desktop.

Step 4: Connect to User Signing

You’ll then be prompted to connect to User Signing (dev.usersigning.shyft.network). Right click on both Ethereum and Polygon “Test on User Signing” options and click on the “Connect” button.

Step 5: Sign Address

On this page, you will also see your selected Ethereum wallet address. Click on the “Sign Address” button, which will take you to a pop-up screen asking for your public key to continue the User Signing process.

Step 6: Open Your Ledger Device and Complete the Signing Process

Now open the Ethereum app on your Ledger device.

Next, on your Ledger device, you will be asked to confirm that you are the trust anchor of this public key and you own the given address.

That’s it. You have now finished signing the message.

Step 7: The Verification is Complete

After you sign the message, you’ll get a signature proof. This confirms that you own your Ethereum address.
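Under the hood, a signature proof like this is ordinary signed-message cryptography. As an illustration only (this uses the public eth_account Python library, not Shyft's implementation, with a throwaway key standing in for the one your hardware wallet never reveals), the sign-and-verify round trip looks like this:

```python
from eth_account import Account
from eth_account.messages import encode_defunct

# Throwaway key for illustration; a hardware wallet keeps its key internal
# and only returns the signature.
acct = Account.create()
message = encode_defunct(text="I am the trust anchor of this public key")

signed = Account.sign_message(message, private_key=acct.key)

# Anyone holding the signature can recover the signer's address and
# compare it with the address whose ownership is being claimed.
recovered = Account.recover_message(message, signature=signed.signature)
assert recovered == acct.address
print("Ownership proof valid for", recovered)
```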

We will now see a similar process on Trezor.

User Signing on the Ethereum Network With Trezor

Step 1: Select the Type of Address to Sign

First, select the Ethereum Network from the two available options — Bitcoin and Ethereum.

Step 2: Choose Your Hardware Wallet

Next, select from the four available wallet options: MetaMask, WalletConnect, Ledger, and Trezor. For this segment, we will choose Trezor.

Step 3: Scan Accounts

Now, click on the “Scan Accounts” button to get a list of your Ethereum wallet addresses. Doing so will take you to the next screen, where Trezor will request permission to export your public key to the signing service.

On this page, you have to click on the “Export” button. This helps you verify the address you own without exposing your private key.

Step 4: Choose an Ethereum Wallet Address

Now, you will see a list of all your Ethereum addresses. From this list, choose the one address that you want to use for the User Signing process and click on the “Connect” button.

Once you do that, your Trezor wallet will be connected to Shyft User Signing.

Step 5: Signing the Message

Here, you have to click on the “Sign Address” button, which will take you to the pop-up screen asking you to allow signing with your Ethereum address.

To sign the Ethereum message, click on the “Allow Once for This Session” button.

Step 6: Open Your Trezor Device and Complete the Signing Process

Follow the instructions on your Trezor device and confirm that you are the trust anchor for the public key and that you own the given Ethereum wallet address.

Step 7: The Verification is Complete

The verification is now complete, generating a signature proof. This confirms your Ethereum address ownership.

📽️ If you want to watch a video of the entire process, you can find it here: https://www.youtube.com/watch?v=glEXgOWSz0c

______________________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.


auth0

Get Started with the Auth0 Terraform Provider

Learn how to get started with the Auth0 Terraform Provider to automate your Auth0 configuration.

Shyft Network

User Signing: A Step-by-Step Guide on Verifying Bitcoin Non-Custodial Wallet Ownership

For Virtual Asset Service Providers (VASPs), ensuring the legitimacy of transactions to comply with regulatory standards such as the FATF Travel Rule while maintaining user privacy is crucial. However, this was easier said than done until Shyft launched User Signing in late December last year.

With Shyft User Signing, VASPs now have a user-friendly and secure way for users to prove ownership of their non-custodial wallets.

In this guide, we outline the user signing process on the Bitcoin network with the Trezor and Ledger hardware wallets.

User Signing on the Bitcoin Network With Trezor

Step 1: Select the Type of Address to Sign

The first step in the user signing process is to select the type of cryptocurrency address you wish to sign. In this case, we are focusing on the Bitcoin network.

Step 2: Choose Your Hardware Wallet

After selecting Bitcoin, the next step is to choose the hardware device that you will use to connect to your Bitcoin wallet. For this section, we will proceed with Trezor.

Step 3: Connect Your Trezor Wallet

Connecting your Trezor device is straightforward. Once you’ve selected Trezor as your hardware wallet, you’ll be prompted to connect the device to your computer and enter your PIN.

The service will then communicate with your Trezor to prepare it for signing.

Step 4: Export the Public Key

Trezor will request permission to export your public key to the signing service. This is a crucial step as it allows the service to identify which address you’re proving ownership of without compromising your private key.

Step 5: Signing the Message

With your public key exported, the next step is to sign a message. This is done through the Trezor interface, where you will see a prompt to sign a Bitcoin message. Follow the instructions on your Trezor device’s screen to complete this step.

Step 6: Verification and Proof

Upon successful signing, you will receive a signature proof. This is a cryptographic string that proves you have signed a message with the private key corresponding to your public Bitcoin address.
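Cryptographically, this is standard Bitcoin signed-message verification. Here is a minimal sketch using the python-bitcoinlib library – an assumption for illustration, not the stack Shyft or Trezor actually uses – with a throwaway key in place of the one your hardware wallet keeps internal:

```python
# pip install python-bitcoinlib
from bitcoin.wallet import CBitcoinSecret, P2PKHBitcoinAddress
from bitcoin.signmessage import BitcoinMessage, SignMessage, VerifyMessage

# Throwaway key for illustration; a Trezor never exposes its private key,
# it only returns the signature.
secret = CBitcoinSecret.from_secret_bytes(b"\x01" * 32)
address = P2PKHBitcoinAddress.from_pubkey(secret.pub)

message = BitcoinMessage("I own this Bitcoin address")
signature = SignMessage(secret, message)

# The verifier needs only the address, the message, and the signature.
print(VerifyMessage(address, message, signature))  # True
```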

Now, we will see a similar process with Ledger, another popular hardware crypto wallet.

User Signing on the Bitcoin Network With Ledger

Step 1: Select the Type of Address to Sign

First, select the type of cryptocurrency address you wish to sign. We selected the Bitcoin network.

Step 2: Choose Your Hardware Wallet

After selecting Bitcoin, the next step involves selecting your hardware device, which was Ledger in our case.

Step 3: Connect Your Ledger Wallet

Connect your Ledger device to your computer, enter your PIN, and open the Bitcoin application on the device. The interface will then establish a connection with your Ledger and prepare it for the user signing process.

Step 4: Export the Public Key

Your Ledger will now ask permission to export your public key to the user signing service. This step is essential for the service to verify your address ownership securely.

Step 5: Sign the Message

Once your public key is exported, sign the message using your Ledger. Here, you will see the details of the message to sign. Now, confirm the message on your Ledger device to complete the signing process.

Step 6: Verification and Proof

After signing the message, you’ll receive a signature proof — a cryptographic verification that you have used your private key to sign the message without exposing it. This proof is used to confirm ownership of your Bitcoin address securely.

Conclusion

User signing is a must-have for VASPs to keep transactions secure and compliant. With this guide, VASPs can help their users confidently verify their non-custodial Bitcoin wallets, making sure everyone’s on the same page with clear and secure processes.

If you wish to check the video of this entire process, you can head over here: https://www.youtube.com/watch?v=iuE6_IafxOs

______________________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

User Signing: A Step-by-Step Guide on Verifying Bitcoin Non-Custodial Wallet Ownership was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Identity Fabrics

by Martin Kuppinger

This report provides an overview of the market of providers of technology for building Identity Fabrics, which are comprehensive IAM solutions built on a modern, modular architecture. It provides a compass to help organizations find the solution that best meets their needs. We examine the market segments, product functionality, the market position of vendors, and the innovative approaches to providing solutions that serve customers best in building their Identity Fabrics.

Sunday, 18. February 2024

KuppingerCole

Analyst Chat #202: Beyond Traditional Boundaries - Intelligent SIEM solutions

In this Analyst Chat episode, Matthias and guest Warwick Ashford explore the shift from traditional to next-gen Security Information and Event Management (SIEM) solutions. Highlighting the limitations of traditional SIEM in the face of evolving cyber threats and complex data landscapes, the discussion emphasizes the need for intelligent, automated, and integrated SIEM solutions.

The conversation focuses on crucial features for modern Security Operations Centers (SOCs) dealing with high costs, skills shortages, and a surge in security alerts, providing insights into navigating today's intricate digital security landscape.



Monday, 19. February 2024

IBM Blockchain

Streamlining supply chain management: Strategies for the future

In today’s complex global business environment, effective supply chain management (SCM) is crucial for maintaining a competitive advantage. The pandemic and its aftermath highlighted the importance of having a robust supply chain strategy, with many companies facing disruptions due to shortages in raw materials and fluctuations in customer demand. The challenges continue: one 2023 survey found 44% of companies had to make changes in the past year due to issues with their supply chain footprint, and 49% said supply chain disruptions had caused planning problems.

But with the right tools and priorities, each step in the process can run smoothly. Here’s how companies are using different strategies to address supply chain management and meet their business goals.

Why supply chain management matters

Supply chain management involves coordinating and managing all the activities involved in sourcing, procurement, conversion and logistics. It includes everything from product development and strategic decision-making to information systems and new technologies. But perhaps most relevant today are the capabilities within SCM to help mitigate disruption.

Supply chain disruptions are caused by a variety of factors, from pandemics, natural disasters and political instability to supplier bankruptcy and IT failures. To mitigate these risks, companies need the resources and technology to develop robust contingency plans.

On top of disruption, companies with global supply chains must also deal with different regulatory environments, cultural norms and market conditions. They need strong SCM practices to help work out the logistics of transporting goods across long distances and through multiple countries without creating longer lead times or delays.

Effective SCM initiatives offer several benefits:

- Lower operational costs: By optimizing inventory levels, improving warehousing efficiency and streamlining order fulfillment processes, companies can save on storage, labor and transportation expenses.
- Better customer satisfaction: An optimized supply chain allows businesses to make sure products are available when and where people want them—which can improve relationships and loyalty with customers.
- Fewer disruptions: A healthy supply chain mitigates risks and protects against inevitable disruption. By developing contingency plans and resilient supply chains, companies can continue to operate even when unexpected events occur.

Key strategies for effective supply chain management

There are a number of ways that companies can better optimize and manage their supply chains. Here are the areas some businesses are focusing on as they refine their overall approach:

Forecasting and demand planning

Companies can reduce storage costs and avoid stockouts or overstock with help from accurate demand forecasting, which uses historical sales data, market research and economic trends to predict future market demand for products or services.

Big data and predictive analytics are increasingly being used to improve forecasting accuracy, allowing businesses to respond more effectively to changes in customer needs. Advanced software tools can automate some parts of forecasting, providing real-time updates and alerts when inventory levels are too high or low.
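
As a toy illustration of the forecasting idea above, the sketch below applies simple exponential smoothing to hypothetical monthly sales and raises the kind of too-high/too-low inventory alert just described; all figures and thresholds are invented.

```python
def exponential_smoothing_forecast(history: list[float], alpha: float = 0.3) -> float:
    """One-step-ahead forecast: each new observation pulls the estimate toward itself."""
    forecast = history[0]
    for demand in history[1:]:
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecast

monthly_sales = [120, 135, 128, 150, 160, 155]   # hypothetical unit sales
on_hand = 90                                      # hypothetical current inventory

forecast = exponential_smoothing_forecast(monthly_sales)
if on_hand < 0.8 * forecast:
    print(f"Alert: stockout risk (forecast {forecast:.0f}, on hand {on_hand})")
elif on_hand > 1.5 * forecast:
    print(f"Alert: overstock (forecast {forecast:.0f}, on hand {on_hand})")
```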

Automation

Automation can streamline supply chain operations, from order fulfillment to inventory tracking. It can take many forms, from automated warehouse systems that pick and pack orders, to blockchain-based smart contracts to software that automates purchasing and invoicing processes. These technologies can significantly reduce manual labor, minimize errors and speed up processes, leading to increased efficiency and cost savings.

Technology-driven visibility

AI and machine learning can be used to analyze large amounts of data quickly and accurately, providing insights that can improve forecasting, inventory management, and customer service. A 2023 survey found that a third of businesses are using artificial intelligence (AI) to improve resource and supply chain planning, and more than a third said using digital tools for inventory management was the most effective strategy in cutting overall supply chain costs.

Real-time tracking systems, often enabled by Internet of Things (IoT) devices, help companies monitor their supply chain accurately and immediately. Having visibility into the status and location of goods as they move through the supply chain helps companies monitor supplier performance, identify bottlenecks and respond quickly to disruptions. A supply chain control tower can connect many sources of data-driven information and improve end-to-end visibility.

Moreover, enterprise resource planning (ERP) software can integrate different aspects of a business into one system, providing a holistic view of operations and the metrics needed to streamline supply chain processes. Integrated supply chain analytics software, such as IBM Planning Analytics, connect complex data sets for more insightful analytics and scenario analysis that can help avoid demand-and-supply mismatches or fulfillment delays.

Sustainable and ethical sourcing

Companies are trying to ensure that every stage of their supply chains adheres to responsible practices. This not only helps satisfy customer desire for sustainable and ethical products but also helps mitigate risks, such as regulatory penalties or reputational damage.

Ethical sourcing involves ensuring that the products and services a company procures are produced under fair, safe and environmentally friendly conditions. This may involve conducting audits of suppliers, implementing codes of conduct and using certification schemes to verify compliance. Sustainable sourcing, meanwhile, focuses on minimizing the environmental impact of the supply chain. This might involve partnering with suppliers who use renewable energy, reducing packaging or implementing recycling programs.

Blockchain technologies can increase transparency and traceability in the supply chain, helping to prevent fraud and ensure product authenticity. For example, IBM Food Trust®, a collaborative network of growers, processors, wholesalers, distributors, manufacturers, retailers and others, uses blockchain technology to improve visibility and accountability across the food supply chain.

Building strong partnerships

Strong relationships are a key part of any business strategy. When it comes to supply chains, strong connections with providers, distributors and other key stakeholders in the supply network can help companies improve resilience. Partnerships can facilitate better communication, collaboration and responsiveness, particularly in times of disruption. They can also provide access to new markets, technologies, and expertise, offering a competitive advantage.

Because finding the right suppliers can be challenging, some businesses turn to technology to help. For example, IBM® Trust Your Supplier, a supply chain risk management software solution, helps ensure suppliers and other partners in the supply chain meet global and industry standards, provides continuous monitoring and risk assessment and uses blockchain-based information management to ensure transparency and traceability.

Effective supply chain strategy for the future

Whether it’s through forecasting, automation, sustainable sourcing, strong partnerships or new technologies, there are numerous strategies companies can employ to optimize supply chain planning and execution. By doing so, they can improve profitability, meet customer demand and ensure their business is resilient in the face of disruptions.

To learn more about how a technology-enabled supply chain can support your business goals, explore the IBM Sterling® Supply Chain Intelligence Suite.

Transform your supply chain operations

The post Streamlining supply chain management: Strategies for the future appeared first on IBM Blog.

Friday, 16. February 2024

FindBiometrics

Samsung Seeks Appeal for BIPA Arbitration Order

Samsung has asked the Chicago-based 7th U.S. Circuit Court of Appeals to reverse a lower court’s decision to make the company pay millions in arbitration fees for claims that it has violated […]

IBM Blockchain

Customer service vs. customer experience: Key differentiators

In many organizations, but not all, customer service is treated as part of the customer experience. Both are interested in driving customer satisfaction, but they focus on different parts of the customer journey to achieve it. So what are the key differences in customer service vs. customer experience? And why do both matter for your business?

Customer experience, or CX, is a holistic accounting of customers’ perceptions resulting from all their interactions with a business or brand, whether online or in-store. Customer experience involves customer experience management (CXM), which refers to strategies, technologies and practices for improving business results by creating an ideal experience for anyone interacting with a company. The overall customer experience focuses on meeting customer expectations and influencing the customer’s overall perception of products and solutions wherever they take place on the customer journey.

Alternatively, customer service refers to the actions that an organization takes to ensure that customers are satisfied with their products post-purchase. Customer service, which can also be called customer support or customer care, is much more customer-facing than many parts of customer experience. Customer experience, by contrast, also spans decisions further from the customer, such as pricing, branding, positioning, and use cases.

Customer-centric organizations should aim to excel at both customer experience and customer service. Therefore, it’s worthwhile to explore more deeply where the two are similar and where they differ.

Customer service vs. customer experience across the customer journey

The simplest key difference between CX and customer service is that CX is concerned with meeting customer needs during the entire customer journey. Customer service is focused on post-purchase. As such, CS is considered a subset of CX.

CX teams are concerned with both short-term tactics and long-term strategy. They are thinking about the holistic picture of the entire customer journey from awareness to consideration to purchase and post-purchase.

Customer journey mapping involves defining the touchpoints throughout the lifecycle of engagements with prospects and customers. A customer journey involves many touchpoints over the entire lifecycle of customer engagement. The assumptions behind customer journey mapping are that prospects or customers are being purposeful at each touchpoint—trying to solve a problem, answer a question, compare options, or cross something off a to-do list.

One way to think about the intersection of customer experience and customer service is to map out the marketing funnel. Doing so demonstrates how CX oversees the entire process, whereas customer service is activated for specific functions.

- Awareness: This starts with the customer learning about the organization and its solutions, and potentially exploring competitors’ solutions. They might sign up for email messages or follow the organization on social media.
- Consideration: After they understand the value propositions, they may ask questions or do further research.
- Purchase: When a customer is ready to make a purchase, customer service activates. The function helps customers with any questions as they finalize purchases and can facilitate the purchase if a customer cannot buy online or in-store.
- Loyalty: The moments immediately after a purchase are incredibly important for generating customer loyalty. The customer service function helps ensure that customers know how to use the product they purchased, and remains available to answer further questions or solve problems afterward. Companies often create customer success teams, which can be part of customer service or the sales team, to provide tutorials and best practices on maximizing the use of a product. The goal is to help those customers use the product as quickly, simply and satisfactorily as possible.
- Advocacy: Creating loyal customers unlocks the possibility that some of them will tell people in their network about an organization’s products or even laud the value of the customer experience it provides. Creating customer advocates helps the customer experience function perform better, because new prospects come into the funnel already ‘warmed’ by the positive sentiment of previous customers.

CX and CS tools

Both customer experience and customer service disciplines rely on valuable tools to maximize their value.

Key customer experience tools:

CX teams use tools that help them see and take strategic actions across the entire customer journey.

- Customer relationship management (CRM) tools, which enable organizations to collect, track, and analyze data resulting from customer interactions across channels.
- A/B test software, which can serve different messaging to website visitors to identify which resonates most. CX teams, working directly with UX teams, can use such software to create variations of a message and track which one leads to the most purchases or time spent on the site.
- Dynamic recommendations for other products or accessories based on previous purchases.

Key customer service tools:

While customer service teams will likely use the previously mentioned tools, some others are much more aligned with CS team roles and responsibilities.

- Self-service chatbots that interact with customers to answer their questions. Customer service interactions are increasingly powered by automation and generative artificial intelligence (AI).
- Web-based knowledge bases where users can find articles, FAQs and videos that walk them through solving issues and using their products or services correctly.
- A webpage that gives customers multiple ways to reach the organization and talk to customer support representatives.
- Proactive email or text messages that ask customers how the product is performing and provide instructions and tips on how to use it.

CX and CS metrics are different

Both customer experience and customer service involve measurement of their activities to ensure that they are successful in meeting customer needs. Many revolve around capturing customer feedback and measuring real-time responses. And while some common KPIs relate to both disciplines, others are more closely aligned with one than another.

Key customer experience metrics:

- Customer satisfaction score (CSAT): CSAT is the percentage of respondents who claim to be satisfied (4) or very satisfied (5) in surveys that are offered after a touchpoint experience.
- Net Promoter Score (NPS): NPS gauges how likely a person is to recommend a company or its products to others. People are asked, on a scale of 0 to 10, how likely they are to recommend it; the percentage of detractors (scores of 6 or less) is then subtracted from the percentage of promoters (9s and 10s). It is best considered a customer experience metric because it can be measured during any part of the customer journey. (A worked example follows this list.)
- Customer Effort Score (CES): After a touchpoint, the customer is asked how easy or difficult it was to accomplish their goal, rating the difficulty from 1 (easy) to 5 or 7 (difficult).
- Customer retention rate: Maintaining high customer retention rates demonstrates a successful customer experience function and enhances the bottom line by increasing customer lifetime value. Increasing customer loyalty and limiting churn means that customers are either satisfied with the product or solution or have yet to find a good replacement.

Key customer service metrics:

- First Response Time (FRT): How long it takes for customer support teams to respond to a customer problem or request. It is a sign of good customer service for an organization to respond immediately to a customer issue, whether on social media, email, chat, or phone.
- Average Resolution Time (ART): How long it takes from the beginning of a customer service interaction until the issue is resolved.
- Issue resolution rate: How many customer service issues are successfully addressed and resolved. While a customer service team cannot expect to resolve every issue, failure to solve nearly all of them is a sign of a problem.
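
To make the NPS arithmetic concrete, here is a small worked example with made-up survey responses:

```python
# Worked NPS example with hypothetical survey data.
def net_promoter_score(ratings: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale;
    passives (7-8) count toward the total but toward neither group."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

survey = [10, 9, 9, 8, 7, 6, 10, 4, 9, 7]  # hypothetical responses
print(f"NPS: {net_promoter_score(survey):+.0f}")  # 5 promoters, 2 detractors -> +30
```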

CS and CX together ensure that organizations are caring for customers

Today’s consumers are more discerning and have more options than ever. To delight your customers and remain competitive, you should personalize every touchpoint across the entire customer experience (CX). True personalization at scale involves all aspects of your business, from marketing and messaging to supply chain, sales, and service.

IBM puts customer experience strategy at the center of your business. Our deep expertise in customer journey mapping and design, platform implementation, and data and AI consulting can help you harness best-in-class technologies to drive transformation across the customer experience.

Get the report: “The 5 pillars of personalization” Explore customer experience consulting services

The post Customer service vs. customer experience: Key differentiators appeared first on IBM Blog.


FindBiometrics

Should AI Companies Be Liable for Deepfake Scams? – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: FTC Wants Input on Deepfake Scam Liability […]

auth0

An Open Letter to Women in Tech

Empowering women in technology roles

SC Media - Identity and Access

Are users ready to go passwordless? Why it's better to move slowly

The most promising passwordless technology isn't enterprise-ready. Focus on feasible IAM upgrades that will strengthen your security posture until passwordless solutions mature.


Accumulate Network (was Factom)

Accumulate Protocol Year in Review: A 2023 Odyssey

Presented by the Accumulate Core Committee

A long time ago in a blockchain far, far away…

EPISODE 2023: THE DAWN OF INNOVATION

In the vast expanse of the digital cosmos, the ACCUMULATE protocol and its infrastructure embarked on an epic journey of transformation and growth. The stalwart team, guardians of code and innovation, delved deep into the heart of technology to enhance the very fabric of the ACCUMULATE universe.

With a vision to streamline the galaxy of code, they embarked on a quest to refine the simulator, a tool of immense power, to conduct testing more effectively than ever before. In the intricate dance of programming, they honed the Command Line Interface (CLI), forging it into a tool of unparalleled precision, enhancing the experience of all who wielded it in the vast transactional cosmos.

Amidst the network of networks, ACCUMULATE witnessed a surge of internal progress. The use of synthetic transactions, the lifeblood of healing and anchoring within the digital realm, reached new heights of mastery. This era also heralded the rise of API v3, a beacon of customizable messaging solutions, empowering developers and users alike with its adaptable might.

The integration of libp2p, a force of connectivity, weaved its way into the P2P network, binding the galaxy together in a web of seamless interaction. 

In the realm of the Wallet CLI, a new dawn arose with the support for multiple key vaults and the alliance with 1Password, merging security with ease. The introduction of an interactive mode transformed the art of transaction signing, simplifying the orchestration of multi-signature transactions across the vast stretches of the ACCUMULATE chain.

As the chronicles of 2023 unfurled, it was a time of both towering achievements and formidable challenges. The journey was marked by relentless pursuit to elevate system reliability, optimize celestial operations, and uplift the user experience to new dimensions. From the enhancement of testing methodologies to the streamlining of CI processes, from the meticulous overhaul of documentation to the introduction of advanced features like synthetic message types and API v3, each step was a leap towards the future.

The integration of libp2p and the advancements in the Wallet CLI Tool were not mere milestones; they were the beacons of our technological evolution. As the year drew to a close, the foundations laid and the lessons learned illuminated the path for future endeavors, highlighting realms waiting to be explored and refined.

Thus begins the next chapter in the grand saga of ACCUMULATE, where innovation meets legacy, and the future becomes now.

The journey continues…

Protocol Core Development

Reflecting on the past year, we have seen a steady progression in various aspects of the Accumulate protocol’s core development. While there have been advancements in enhancing core functionalities and refining operational processes, we also encountered challenges that required our attention and effort. This review aims to provide a balanced overview of our efforts in testing, continuous integration (CI), documentation, and other key areas.

Throughout the year, we achieved several advancements, albeit with accompanying challenges:

1. Testing Coverage Improvements: We placed a strong emphasis on enhancing our testing coverage, which has contributed to improving the reliability and robustness of our platform. However, the complexity and breadth of the protocol meant that this was an ongoing process requiring continual attention.

2. CI and Deployment Efficiencies: We made strides in optimizing our continuous integration process, streamlining our development and deployment pipelines. This effort helped in quicker delivery of new features and fixes but also highlighted areas needing further refinement for efficiency.

3. Documentation Refinements: Recognizing the importance of clear and accurate documentation, we made concerted efforts to improve it. This remains an area of ongoing development to ensure it keeps pace with our evolving platform.

4. Feature Introductions and Enhancements: Introducing new features and enhancing existing ones was a focus this year. Each development aimed to meet user needs and adapt to the changing technological landscape, though some features required additional refinement post-deployment.

5. System Refinements and Bug Fixes: Continuous system refinements and addressing bugs were a significant part of our work. Restructuring some core code segments was undertaken to support better maintainability and scalability.

6. Embracing New Technologies: We continued to incorporate new technologies and methodologies, striving to keep our platform innovative. This journey brought both advancements and learning experiences, underscoring the need for balanced integration of new technologies.

7. Challenges for 2024: One of our challenges has been integrating and stabilizing the various innovations. The protocol experienced significant outages, prompting a renewed focus on stability and an expansion of our testing processes.

Reflecting on this year, we acknowledge both our technical achievements and the areas where we faced difficulties. Our commitment remains steadfast to our users and to improving our platform. As we enter the new year, our focus will be on addressing these challenges, continuing our pursuit of innovation, and further developing our platform. We are cautiously optimistic about the future and remain dedicated to making the coming year one of steady progress and growth.

Feature Developments

This past year, our focus has been on enhancing the Accumulate platform through the addition of new features, each designed to contribute to the platform’s functionality and user experience. While these updates mark progress in our technology, they also brought with them a set of challenges that we have worked to address. Here’s a look at some of the key developments from 2023:

Advanced Synthetic Message Type for Healing

We introduced an advanced synthetic message type to improve our system’s healing capabilities. This development aimed to establish more efficient and reliable self-repair mechanisms within our system. Moving forward, a primary goal will be to enhance the reliability and stability of this messaging.

Block Hold Feature for Enhanced Transaction Security

Recognizing the importance of transaction security, we implemented a new feature that allows for holding transactions until a specified network height is reached. This feature is designed to provide a minimum review period for signers, thereby aiming to enhance transaction security.
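
The post does not show Accumulate’s implementation, but the concept can be pictured as a gate on execution height. Everything in the sketch below is hypothetical:

```python
# Conceptual sketch only, not Accumulate's actual code: a held transaction
# carries a hold-until height and is not executable before the chain reaches it.
from dataclasses import dataclass

@dataclass
class HeldTransaction:
    payload: bytes
    hold_until_height: int  # minimum network height before execution

def executable(tx: HeldTransaction, current_height: int) -> bool:
    """The hold gives signers a guaranteed review window: the transaction
    only becomes executable once the network reaches the hold-until height."""
    return current_height >= tx.hold_until_height

tx = HeldTransaction(payload=b"...", hold_until_height=1_250_000)
print(executable(tx, current_height=1_249_990))  # False: still in the review window
```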

Efficient Collation of Transaction Responses

In response to user feedback, we optimized the process of collating transaction responses. This improvement targeted resolving previous issues and enhancing the system’s overall responsiveness and efficiency.

API v3: A Gateway to Customized Messaging Solutions

We revamped our API to provide more flexibility and adaptability for developers and third-party applications. API v3 introduces a modular messaging framework, transforming it into a platform that enables the integration of custom messaging solutions. This feature aims to offer a seamless and customized experience for developers, allowing them to create unique messaging functionalities that are directly integrated into the protocol.

Looking Ahead

The introduction of these features marks steps in our ongoing journey toward creating a versatile and developer-friendly ecosystem. Each development reflects our effort to adapt to the evolving needs of the blockchain and application development community. As we move forward, we are focusing on stabilizing these innovations and ensuring that they effectively meet the needs of our users and developers.

Enhanced System Performance Through Key Fixes

Over the past year, we undertook a series of fixes and improvements to address various issues within our platform. These efforts were part of our ongoing initiative to enhance security and reliability, even as we navigated the complexities inherent in such a comprehensive undertaking.

Enhanced Dispatcher Functionality

We identified and resolved several issues in the dispatcher to ensure more accurate processing of transactions and signature fields in envelopes. These improvements aimed to enhance the dispatcher’s reliability, contributing to a more consistent user experience in request processing.

Sequencer System Refinement

A significant update was implemented in the sequencer, particularly targeting the hash signing process for synthetic messages in executor v2. This refinement was crucial in bolstering the reliability of our messaging system and in enhancing the integrity of our transaction processing mechanism.

Optimized Docker Build Process

In our efforts to improve deployment efficiency, we reduced the docker build context size, notably by excluding `.git` files. This change led to a decrease in build times, facilitating faster and more efficient deployment. Additionally, updates to our Makefile were made to better incorporate git data into the build process.
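
For reference, excluding version-control data from a Docker build context is typically done with a `.dockerignore` file; the entries below are illustrative, not the project’s actual file:

```
# .dockerignore -- keep version-control data and other non-build files
# out of the Docker build context (illustrative entries only)
.git
.gitignore
```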

Improved API Error Management

We revised the error handling mechanism in API v3 to improve error diagnosis and provide clearer, more informative feedback to users. This enhancement is expected to lead to smoother interactions with our platform and a better overall user experience.

Transition to Modern Faucet Services

In line with our objective to adopt modern service standards, the old faucet system in our API was phased out. We shifted our focus to a more sophisticated, faucet-as-a-service model, aiming to provide users with a more advanced and efficient way to access our services.

Critical Snapshot Bug Resolution

Addressing a key issue, our team resolved a bug in the snapshot system that had led to data inconsistencies. This fix was a significant step towards enhancing the reliability and stability of our data management, aiming to safeguard user data integrity.

Comprehensive Database Debugging Improvements

We introduced a series of enhancements to our database debugging capabilities. These included improved error messaging for light client interactions, the addition of a tool for efficient local database querying, and adjustments to API v2 for better handling of missing snapshots. These improvements were designed to collectively enhance our platform’s diagnostic capabilities.

While these updates and fixes represent our commitment to maintaining a high-quality platform, they also highlight the continuous challenge of balancing innovation with stability. We recognize the importance of these improvements in enhancing our users’ experience and the overall performance of our system, and we remain focused on addressing any challenges that arise as we continue to develop and refine our platform.

System Refinements and Enhancements Through Strategic Refactoring

Throughout the past year, our team has undertaken a series of system refinements and enhancements, focusing on optimizing the performance and usability of our platform. While these efforts reflect our commitment to ongoing development, they also address the practical needs of managing a complex and evolving platform.

Streamlining API Configuration

We took steps to streamline the API configuration by removing some unused properties. This effort aimed to simplify configuration management, making it more intuitive and accessible for users and developers.

Enhanced Snapshot Performance

Significant optimization was achieved in database performance, especially in how we handle snapshots. By adopting a more efficient method for tracking hashes, we’ve enhanced the speed and reliability of our snapshot processes, although we recognize the need for ongoing monitoring to ensure continued effectiveness.

Refinement of Merkle-Related Components

We undertook a significant overhaul of Merkle files and methods, aiming to improve clarity and efficiency in this critical area of our platform. Merkle trees were also moved back into their dedicated package, as part of our ongoing effort to maintain clean and organized code practices.

Increased Reusability of the Errors Package

We enhanced the errors package to increase its reusability with different sets of status codes. This change was designed to improve the versatility of the package and streamline error handling across various system components.

Removal of the Connection Manager

Aligned with our goal to optimize system components, we removed the connection manager, a component that had become redundant. This step is part of our broader efforts to declutter and streamline the platform, helping to reduce complexity where possible.

P2P Network Improvements

This past year included significant updates to our Peer-to-Peer (P2P) network, primarily through the integration of libp2p, a modular network stack. These updates represent an effort to enhance network performance and reliability, addressing both long-standing and emerging challenges.

Enhanced Network Dynamics with libp2p Integration

The integration of libp2p into our P2P network was a notable step in our network development strategy. Known for its modular design, libp2p offers customizable network architecture, which we leveraged to improve network scalability and enhance security protocols. This integration aims to make our network more adaptable and resilient, though it also introduced new complexities that required careful management.

Strategic Peer Management Improvements

An important aspect of our P2P network refinement was optimizing peer interactions. Utilizing the libp2p framework, we developed a system to more effectively manage peer connections, including limiting reconnection attempts with unreliable peers. This approach is intended to stabilize the network and enhance overall efficiency, but it continues to be an area of active development and monitoring.
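
The mechanics of limiting reconnection attempts are not detailed in the post; as a rough illustration of the idea, a per-peer failure counter with exponential backoff might look like this (all names hypothetical):

```python
# Conceptual sketch, not Accumulate's actual implementation: cap dial attempts
# to peers that keep failing, with exponential backoff between retries.
import time

MAX_ATTEMPTS = 5

class PeerBackoff:
    """Track failed dials per peer and decide when a retry is allowed."""
    def __init__(self):
        self.failures: dict[str, int] = {}
        self.next_try: dict[str, float] = {}

    def record_failure(self, peer_id: str) -> None:
        n = self.failures.get(peer_id, 0) + 1
        self.failures[peer_id] = n
        self.next_try[peer_id] = time.monotonic() + 2 ** n  # 2, 4, 8, ... seconds

    def may_dial(self, peer_id: str) -> bool:
        if self.failures.get(peer_id, 0) >= MAX_ATTEMPTS:
            return False  # give up on persistently unreliable peers
        return time.monotonic() >= self.next_try.get(peer_id, 0.0)
```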

Refinement in Error Handling and Serialization

Along with improving peer management, we refined our approach to error handling and serialization within the P2P network. By employing libp2p’s advanced features, we aimed to streamline error identification and resolution processes, contributing to a more stable network experience.

Impact on User Experience and System Performance

The enhancements to the P2P network, particularly the libp2p integration, are expected to positively impact user experience. Users should see more stable connections and faster data transfer rates. For developers, these changes are intended to provide a more reliable foundation for building and operating applications, though the full impact of these changes will continue to be evaluated.

The integration of libp2p into our network is part of our ongoing effort to adopt advanced technologies that enhance our platform’s capabilities. While this move represents a significant step towards a more dynamic and secure network, it also underscores the need for continuous adaptation and improvement in our technology stack. 

These updates and refinements represent our ongoing efforts to improve and evolve our platform. While these changes have brought improvements in several areas, we continue to focus on balancing the introduction of new functionalities with the stability and reliability of the system as a whole.

Enhancements in Testing, Continuous Integration (CI), and Documentation

Our team has been diligently working on enhancing various aspects of our platform, focusing on testing, continuous integration (CI), and documentation. Here’s a summary of key improvements in these areas:

Overhaul of Test Harness’s Condition Support

We’ve significantly upgraded the test harness’s condition support, adding capabilities for conditions on signatures and messages produced by them. This includes an update in the ID method of the Credit Payment message type to align with routing principles, ensuring more accurate and efficient testing of our messaging and transaction systems.

Making the Validation CI Jobs More Reliable 

We’ve made various improvements to the validation CI job to enhance its reliability. These improvements are expected to provide more consistent and reliable CI outcomes, ensuring smoother development and deployment processes.

Enhanced CI/CD Pipeline

Continuous improvements have been made to our CI/CD pipeline, focusing on streamlining and automating our development and deployment processes. These enhancements not only improve the efficiency of our software development lifecycle but also ensure that new features and fixes are deployed with high quality and minimal delay.

Comprehensive Testing Enhancements 

Our team has implemented extensive enhancements to our testing frameworks, ensuring thorough coverage and rigorous validation of all new features and updates. This commitment to comprehensive testing is key to maintaining the high reliability and performance standards of our platform.

Documentation Updates for User Clarity 

We’ve updated and expanded our documentation to provide clearer, more detailed guidance for users and developers. This includes elaborating on new features and usage instructions, as well as correcting a series of typos, making it easier for users to navigate and leverage our platform’s capabilities.

Refining the Accumulate Wallet CLI Tool

The past year saw the Accumulate Wallet CLI Tool undergo several enhancements aimed at improving its functionality and user experience. While these developments marked progress, they also brought to light the challenges of evolving a complex tool to meet diverse user needs.

Keybook Delegation and Page Resolution for Transaction Signing

One of the key updates was the enhancement of the CLI tool’s ability to handle keybook delegation and page resolution for transaction signing. This feature aimed to simplify the transaction signing process, offering users a more streamlined approach to managing signing authorities. It was a step towards providing users with greater control over transaction authorizations, though it also highlighted the need for ongoing user education and support.

Wallet Daemon and JSON-RPC API Innovations

Significant improvements were made to the wallet daemon, particularly with the introduction of the timed wallet unlock feature, enhancing operational flexibility. The expansion of the daemon’s capabilities, including third-party integration support through the robust JSON-RPC API, marked a notable development. The JSON-RPC API itself evolved into a more versatile tool, facilitating a range of wallet functionalities and complex interactions.

Multivault Support and 1Password Integration

The introduction of multivault support represented a significant development in wallet management. Integration with 1Password was implemented, aiming to blend high security with user convenience in managing encryption passphrases. These developments were crucial steps towards enhancing user experience and security, although they also underscored the importance of balancing feature complexity with usability.

User Interface and Experience Enhancements

The user interface received considerable updates, including the introduction of a more intuitive UI. These improvements were part of our effort to make the wallet not only functional but also user-friendly. However, we recognize that continuous feedback and iteration are essential to meet the diverse preferences and needs of our users.

Security and User Experience Focus

Maintaining security was a primary focus, with the implementation of features like client-side interactive password prompts and multi-recipient token issuance functionalities. These developments were targeted at creating a secure yet user-friendly experience, but they also required us to continually assess and address any emerging security challenges.

Testing, Documentation, and Looking Ahead

Our commitment to delivering a quality product was reflected in our rigorous testing processes and comprehensive documentation updates. As we look forward to the future, we aim to build on the successes of 2023, with a continued focus on innovation and technological enhancement of the wallet CLI tool. However, we also acknowledge the importance of stability and the need to integrate user feedback into our ongoing development process.

Conclusion

As we close the books on the past year, we take a moment to consider the developments within the Accumulate Protocol and Wallet CLI Tool. 2023 has been one of both progress and learning, marked by efforts to enhance our technology and address the challenges that have arisen.

Achievements and Challenges

Our work in 2023 involved significant developments in testing, continuous integration, and documentation, alongside feature development and system refinements. While these efforts have moved us forward, they have also brought to light areas needing further attention and improvement.

The advancements in the core protocol and the Wallet CLI Tool demonstrate our commitment to enhancing the platform’s capabilities. We have focused on strengthening the foundation of our technology and expanding its features to better serve our user base. However, with these enhancements came the necessity to manage complexities and integrate new technologies effectively.

Looking Ahead with a Focus on Stability and Improvement

As we move into the future, we do so with a balanced perspective, recognizing both our achievements and the areas where we need to grow. The experiences of 2023 have laid important groundwork, and we look forward to building on this. Our focus will be on continuing to innovate while ensuring the stability and usability of our platform.

Our approach will include deeper engagement with our user community and a commitment to refining the user experience. We understand that the journey ahead requires not only a pursuit of technological advancements but also a dedication to reliability, user-centricity, and thoughtful growth.

Final Thoughts

In sum, 2023 was a year that combined technological strides with valuable insights. It was a period that highlighted the importance of balancing innovation with practicality and user needs. As we step into the next year, we remain committed to evolving our platform in a way that upholds our standards of quality and aligns with the dynamic needs of the blockchain space.


SC Media - Identity and Access

New GenAI ‘upload file’ options spur data risk fears

File upload features added recently to generative AI platforms represent a new and bigger data pipe for sensitive information to wander outside corporate networks.


liminal (was OWI)

Location Intelligence and Its Impact on Authentication and Security Practices

As mobile transaction volumes rise, location intelligence is increasingly becoming a critical tool that enables companies to better authenticate customers and detect fraud. The increase in location-sensitive transactions, such as payments, content streaming, and online sports betting, necessitates verifying users’ physical locations at the point of transaction. Location intelligence vendors respond by offering solutions that authenticate legitimate users and escalate high-risk ones to traditional authentication methods. According to Meticulous Research, the Location Intelligence market is projected to reach $67.5 billion by 2030, at a CAGR of 12.9% from 2023 to 2030.

Location intelligence, or geolocation, involves collecting, analyzing, and applying spatial data to derive insights and make strategic decisions. This enables businesses to understand patterns and trends, improve operations, target audiences effectively, and enhance security. As the need for secure, seamless user experiences increases, geolocation is essential for various applications, including account security, customer authentication, and identity verification. To stay competitive, some vendors in this market seek to integrate with larger platforms or broaden their product offerings in response to the trend toward Integrated Identity Platforms (IIPs).

Liminal Members and Link Users can access the full report by logging into Link. Don’t have an account yet? Sign up for free. Here are some key takeaways from a recent Outside-in Report: Location Intelligence: Applications and Insights in an Evolving Market.

6 Key takeaways:

1. The growing sophistication of fraud threats and rising consumer expectations for low-friction user experiences will drive future demand for location intelligence, with limited threats posed by consumer privacy and regulatory concerns.
2. Customer experience is crucial in authentication, with over 60% of consumers abandoning purchases due to forgotten passwords in the last six months. Frictionless, location-based authentication enables platforms to let low-risk users skip password steps entirely.
3. Businesses seek solutions with low-friction and cross-lifecycle capabilities, growing geofencing mandates for regulated industries, and continuing mobile transaction growth.
4. Nearly 75% of consumers are willing to share location data to enhance mobile app functionality. While hesitancy to share data may affect the use of location intelligence for authentication, it doesn’t significantly hinder the market’s growth.
5. Regulatory constraints are consent-based, with stricter privacy laws like GDPR and CCPA posing limited threats to location intelligence platforms, since these regulations emphasize consumer consent and don’t prohibit geolocation data collection.
6. The global legalization of regulated online gaming is expanding, with over 32 countries and 34 US states plus the District of Columbia now allowing online betting. Geofencing is crucial to adhering to local regulations, fueling worldwide demand for location intelligence.

How It Works: Location Intelligence for User Authentication

For authentication use cases, geolocation providers combine multiple location intelligence signals with device-level data to risk-score a login attempt, then either allow friction-free access or escalate and challenge the user with alternate authentication methods. For a detailed workflow, log into Link and download the full Outside-In Report, Location Intelligence: Applications and Insights in an Evolving Market.
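
As a simplified illustration of that decision (the report’s actual workflow is more detailed), a provider might combine signals into a score and compare it against a threshold; every signal name, weight, and threshold below is invented for the example:

```python
def login_risk_score(signals: dict) -> float:
    """Combine location and device signals into a 0..1 risk score."""
    score = 0.0
    if signals.get("impossible_travel"):   # e.g. two logins 5,000 km apart in an hour
        score += 0.5
    if signals.get("new_device"):
        score += 0.2
    if signals.get("ip_country") != signals.get("home_country"):
        score += 0.2
    if signals.get("vpn_or_proxy"):
        score += 0.1
    return min(score, 1.0)

signals = {"impossible_travel": False, "new_device": True,
           "ip_country": "GB", "home_country": "GB", "vpn_or_proxy": False}

if login_risk_score(signals) < 0.3:
    print("Low risk: allow friction-free access")
else:
    print("High risk: escalate to step-up authentication")
```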

Use Cases

Geolocation has many use cases, including account opening and fraud prevention, account takeover protection, passwordless authentication, and geofencing.

Account Opening and Fraud Prevention: A common use case aims to prevent account creation by fraudsters using false identities, driven by the post-COVID-19 increase in online account creation across various sectors.

Account Takeover Protection: This targets the need for protection against cybercrime, specifically unauthorized access to online accounts or systems, driven by the rise in sophisticated fraud techniques and the demand for advanced security technologies.

Passwordless Authentication: This method enables password-free account or system access, reducing user friction. Location intelligence facilitates user authentication with minimal added friction, acting as a primary or secondary authentication factor.

Geofencing: This method uses geolocation data to control access to online platforms based on geographic constraints. It is crucial for regulated online gambling to meet regulatory location verification requirements in the growing number of countries and US states legalizing it.
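
As a toy picture of a geofence check, the sketch below tests whether a user’s reported coordinates fall within a fixed radius of an allowed region’s center; production systems use licensed-territory polygons, and all coordinates here are made up:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical: a betting platform licensed only within 50 km of a city center.
ALLOWED_CENTER = (39.7392, -104.9903)   # Denver
user_location = (39.7294, -104.8319)    # Aurora, CO

inside = haversine_km(*ALLOWED_CENTER, *user_location) <= 50
print("Bet allowed" if inside else "Blocked: outside licensed region")
```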

Why Now? 

Demand for location intelligence data and analytics is rising due to enterprises seeking low-friction, cross-lifecycle solutions, increasing geofencing regulations in regulated industries, and the growth of mobile transactions. Enterprises aim for a high-quality user experience across the identity lifecycle; notably, 42% of consumers abandon transactions when they encounter high friction, an outcome location intelligence can help reduce. These solutions, especially combined with other authentication methods, position location intelligence vendors as crucial players in Integrated Identity Platforms (IIPs).

Regulatory concerns are minimal, with privacy regulations like GDPR focusing on consent-based data collection and specific laws like the Massachusetts Location Shield Act allowing for authentication and fraud detection uses. Geofencing requirements in states with legal online sports betting create additional opportunities for these vendors.

Additionally, the rise in smartphone transactions, now expected to constitute over 40% of online transactions, and the increasing use of online-to-offline platforms like Uber, DoorDash, and Instacart highlight new use cases for location intelligence. This trend expands the technology’s application and supports the market’s growth, providing a positive outlook for location intelligence solutions in enhancing authentication strategies and meeting regulatory compliance while opening new market opportunities.

Log in to Link, or sign up for free, to download the detailed analysis of location intelligence market trends and strategic insights.

Related content: Market and Buyer’s Guide to Customer Authentication · Facial Biometrics: Trends and Outlook · Link Index Report for Account Opening in Financial Services

The post Location Intelligence and Its Impact on Authentication and Security Practices appeared first on Liminal.co.


Tokeny Solutions

Tokeny’s Talent | Fabio

Fabio Espinosa is a Back-End Software Engineer at Tokeny.

Tell us about yourself!

Hello! I’m Fabio, originally from Colombia and currently living in Madrid. I work as a back-end software engineer at Tokeny. I’ve had a lifelong fascination with technology and how it has the potential to change people’s lives. Now, I think it is really cool to be able to work on technology myself.

What were you doing before Tokeny and what inspired you to join the team?

Before joining Tokeny, I worked as a tech lead at another startup and occasionally freelanced through Toptal.

I recognized a point in my career where my knowledge growth in programming had stagnated. Despite my experience in back-end, front-end, and dev ops, the work became repetitive, offering limited space for new things, and consequently, it was not so interesting anymore. Moreover, I was managing people, which was not giving me much joy. 

It was this realization that made me curious to explore something new, particularly related to blockchain. Tokeny presented itself as an exciting opportunity to grow.

Also, it’s widely recognized that tokenization is positioned for substantial growth—something that will take off. Such advancements hold significant potential, particularly for early entrants who typically experience disproportionate gains. This trend appears to be unfolding, as we are beginning to see growing interest from new investors wanting to join the market.

As Elon Musk says, “I could either watch it happen, or be part of it”. I chose the latter.

How would you describe working at Tokeny?

I’d describe it as responsible freedom. The flexible work-from-home policy allows me to work from many locations, provided that expectations are met. This flexibility offers a great work-life balance.

Also, there’s a shared sense among the team that the industry is on the verge of taking off. While the exact timing remains uncertain, we are confident that it will inevitably happen.

Regarding Tokeny, a company is always about its people. In that regard, everyone at Tokeny is super awesome. We know how to have fun, connect, enjoy, share, learn, be there for one another, and share a good meme on the #random channel.

What are you most passionate about in life?

It’s hard to pick the thing I’m most passionate about. I can think of traveling, exploring cultures, writing, skiing, hiking, horse riding, creating things through code, enjoying food and music, spending time with family, being there for my sisters, and trips with friends.

But if I were to choose one, I’d say I’m most passionate about connecting with people. I live for interesting, deep conversations. When someone shares with me a bit of themselves they’d usually keep hidden or a strange, unique way to think about something, I honor it and feel connected.

What is your ultimate dream?

My ultimate dream has fluctuated over time. At some point, it was about money: I wanted to be rich enough to not have to work anymore and do whatever I wanted.

Nowadays, I see it differently. I mean, money will always be important, but my ultimate dream is to be constantly at peace with myself and aware enough to stay aligned with what is true to me, as much as I can.

On my deathbed, I want to look back and know that it wasn’t the external things that moved me, but the internal things, the really important ones.

What advice would you give to future Tokeny employees?

I’d say working from home is always a challenge. I’d advise creating an environment where you can both focus and disconnect to take breaks. It’s funny how sometimes the most productive moments come after a long, nice break, when you can look at the code with different eyes.

Let yourself be taken along for the ride and try to develop consistency in your pace of work. Everyone is there to help you, so count on anyone; they all want to see you succeed.

What gets you excited about Tokeny’s future?

It excites me to work on something different from the usual corporate environment, within a small team, with significant upside potential, and contributing to beneficial technological progress for people.

I’m also super excited about the next team building event; they somehow keep getting better.

He prefers:

(Preference checklist from the original post; the checkbox selections did not survive this text version.) The pairs were: Coffee or Tea (or None), Movie or Book, Work from the office or Work from home, Dogs or Cats, Call or Text, Burger or Salad, Mountains or Ocean, Wine or Beer, Countryside or City, Slack or Emails, Casual or Formal, Crypto or Fiat, Night or Morning.


The post Tokeny’s Talent | Fabio appeared first on Tokeny.


Elliptic

Regulatory Outlook 2024: Taking a look at the US regulatory landscape

Over the past few years, the United States has presented a challenging environment for crypto firms and financial institutions seeking certainty about the direction of regulation and policymaking. 



KuppingerCole

Policy Based Access Management


by Graham Williamson

Efficient, effective management of access controls from infrastructure to applications remains an aspiration for enterprises. The main drivers of this goal include the need for strengthening the cybersecurity posture, efficiency gains in managing access controls, the need for consistency in access controls across multiple solutions and layers, and regulatory compliance. Most organizations today struggle with a mixture of point solutions for managing access controls, many of these relying on static entitlements causing massive work and tending to become inaccurate. A consistent, policy-based solution for managing access controls ensures that the right people have the right access, at the right time, from the right place. This Leadership Compass features vendors offering policy-based access control solutions and provides guidance on aligning a vendor’s solution to common corporate access control requirements.
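
To make the policy-based model concrete, here is a minimal sketch in Python of an attribute-driven access decision. Everything in it (the Policy class, the is_allowed function, the example attributes) is a hypothetical illustration, not any vendor's API; real PBAC products typically express such rules in dedicated policy languages such as Rego or XACML.

```python
# Minimal, illustrative sketch of a policy-based access decision.
# All names and attributes are hypothetical, not taken from any vendor.
from dataclasses import dataclass, field

@dataclass
class Policy:
    """Grant access when every condition on the request context holds."""
    action: str
    resource_type: str
    conditions: list = field(default_factory=list)  # each: request -> bool

def is_allowed(policies, request):
    """Compute the decision at request time from current attributes,
    instead of looking up a static entitlement."""
    for p in policies:
        if (p.action == request["action"]
                and p.resource_type == request["resource_type"]
                and all(cond(request) for cond in p.conditions)):
            return True
    return False  # deny by default

policies = [
    Policy(
        action="read",
        resource_type="payroll-record",
        conditions=[
            lambda r: "hr" in r["subject"]["roles"],           # right people
            lambda r: 8 <= r["context"]["hour"] <= 18,         # right time
            lambda r: r["context"]["network"] == "corporate",  # right place
        ],
    )
]

request = {
    "action": "read",
    "resource_type": "payroll-record",
    "subject": {"id": "alice", "roles": ["hr"]},
    "context": {"hour": 10, "network": "corporate"},
}
print(is_allowed(policies, request))  # True
```

The point of the pattern is that access is computed per request from live attributes, so there is no store of static entitlements to drift out of date.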

Mar 20, 2024: Road to EIC: eIDAS 2.0 – The Way to "Trusted, Voluntary, and User-Controlled" Digital Identity

The regulation on electronic identification and trust services (eIDAS) is designed to increase trust and security in electronic transactions within the EU's internal market. One of eIDAS’ aims is to make it possible for individuals and businesses to use their own national e-identification systems to access public services in other EU countries. eIDAS 2.0 focuses on expanding the availability and usage of digital wallets, whereby member states provide citizens with secure and privacy-preserving digital wallets to manage digital credentials, aiming for 80% EU resident adoption by 2030.

IBM Blockchain

6 ways to elevate the Salesforce experience for your users

In this article, explore six ways you can elevate your Salesforce experience for customers, partners and employees. The post 6 ways to elevate the Salesforce experience for your users appeared first on IBM Blog.

Customers and partners that interact with your business, as well as the employees who engage them, all expect a modern, digital experience. According to the Salesforce Report, nearly 90% of buyers say the experience a company provides matters as much as products or services. Whether using Experience Cloud, Sales Cloud, or Service Cloud, your Salesforce user experience should be seamless, personalized and hyper-relevant, reflecting all the right context behind every interaction.

At the same time, Salesforce is a big investment, and you need to show return on that investment as quickly as possible. Ensuring maximum user adoption and proficiency is key. The more useful and relevant the experience is, the more effective users will be on the platform—and the more frequently they will return to it.

Here are six ways you can elevate your Salesforce experience for customers, partners and employees.

1. Continuously inform and engage your users.

Keep users abreast of everything they need to know about your business, and share valuable, engaging content related to their needs and interests. Deliver timely information and critical alerts through tailored announcements. Keep your audience informed and engaged with virtual and in-person events and targeted news, blogs or other articles. Manage and surface all of this within Salesforce to minimize context switching and to keep users coming back to the platform.

2. Personalize the user experience for hyper-relevance.

Infuse context and personalized content to enrich the entire experience and make it more relevant to individual customers. Don’t make employees struggle with out-of-the-box search and list views; dynamically present what they need in the flow of work, so they don’t have to leave the current task to find it. Whether it is location mapping, embedded video, targeted news and events, assigned learning, or recommended products and knowledge articles, strive to give users the information they need when they need it.

3. Escape the confines of the typical Salesforce look and feel.

Break away from limiting, out-of-the-box layouts, views, and UI components to give users the beautiful, modern experience they expect. Follow current UX design principles and ensure that every touchpoint represents your unique branding look and feel, rather than just looking like any other Salesforce implementation.

4. Accelerate platform adoption and mastery.

Develop a plan to thoroughly onboard users and get them proficient with the platform as quickly as possible to start realizing value. Streamline and automate the onboarding process. Gather data to drive users to the site or platform, personalize the experience, and equip them with the knowledge and resources they need for success. Then, go deeper and give your employees, partners and customers an immersive digital learning experience tailored to their specific needs. A highly skilled ecosystem is a loyal and effective one, and educated customers are advocates for the brand.

5. Enable users to serve themselves and each other.

Give your customers, partners and employees the ability to serve themselves 24/7, whether researching products, making purchases, managing accounts or troubleshooting and solving issues. This means making your product information, knowledge articles and other content easily accessible, searchable and filterable. Deflect cases by giving customers access to the same content your service employees use via the knowledge base or a chatbot.

6. Empower your users to be your advocates.

An effective way to get your brand and messaging in front of as many potential customers as possible is to give your users ways to advocate for you. Organically expand the reach and influence of your brand by enabling users to share, contribute to and interact with your content. Enable partners and employees to contribute blogs and articles, empower customers to share your content in their social networks, and enable users to rate and review products, services and other records. Use this active user base to crowdsource the best ideas for improving your business and your Salesforce implementation.

Achieve an elevated experience with IBM® Accelerators for Salesforce

You can achieve this elevated experience with IBM® Accelerators for Salesforce. Its library of pre-built components can be used to quickly implement dozens of common use cases in Salesforce with clicks, not code. You can drag, drop, configure and customize components to create engaging, hyper-relevant experiences for your employees, partners and customers on Sales Cloud, Service Cloud, and Experience Cloud. Accelerators like Announcements, Experience Components, News, Ideas, Learning Adventure, Onboarding, and many more enable you to create a highly relevant and personalized experience.

IBM developed these accelerators with the expertise we gained through thousands of successful IBM Salesforce Services engagements. Now, these same products are available to purchase and use in your projects! Unleash the power of our pre-built components to reduce customization efforts, empower administrators and speed the ROI of your Salesforce implementation.

Innovate and accelerate your Salesforce journey with IBM Accelerators Transform your business with IBM and Salesforce

The post 6 ways to elevate the Salesforce experience for your users appeared first on IBM Blog.

Thursday, 15. February 2024

FindBiometrics

AI Update: Lights, Camera… Type!

Welcome to the newest edition of FindBiometrics’ AI update. Here’s the latest big news on the shifting landscape of AI and identity technology: OpenAI has revealed Sora, its first text-to-video […]

Anonym

No More Overdraft Fees: How to Make Up Revenue  

The post No More Overdraft Fees: How to Make Up Revenue   appeared first on Anonyome Labs.

The Biden administration and the Consumer Financial Protection Bureau announced a proposed change to drop overdraft fees to as low as $3. The average overdraft fee today is over $26.

Why this matters: Financial institutions’ bottom lines will be greatly affected. We are talking about the nation’s largest banks losing roughly $8 billion.

Who does this affect: Only “very large” banks, savings associations, and credit unions with assets of $10 billion or more will be affected.

The Consumer Financial Protection Bureau (CFPB) found that applying these rules to FIs with assets of $10 billion or more covers:

- 80% of consumer deposits
- 68% of overdraft charges

While smaller FIs won’t be directly affected by this rule, it is crucial to stay vigilant to changes in the market. 

Reactions:

The banking industry is gearing up to fight back with a multimillion-dollar marketing and lobbying campaign. Banks will need to start implementing new products or campaigns to make up for the revenue loss, such as cyber safety, secure ID verification, safe browsing, and other privacy tools. (We’ll cover more on that below 👀)

Dive Deeper

The financial landscape is on the brink of change. The new rules will reshape revenue streams for banks and credit unions.

Overdraft lending: “Very Large Financial Institutions Proposed Rule”

The “Very Large Financial Institutions Proposed Rule” is part of an executive order that “established a whole-of-government effort to promote competition in the American economy.” This order’s initiatives create change across multiple industries, including health care, finance, technology, and more.

The Very Large Financial Institutions Proposed Rule:

- Was presented by the Consumer Financial Protection Bureau (CFPB).
- Wants FIs to price overdrafts to break even, either by calculating their own costs and losses using standards outlined in the proposal, or by using the benchmark overdraft fees of $3, $6, $7, or $14.
- Only applies to FIs with assets of $10 billion or more.
- Changes Regulation Z to expand the open-end credit definitions of “plan” and “finance charge.”
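
For a rough sense of scale, here is a back-of-the-envelope sketch in Python. The $26 average fee and the four benchmark options come from the figures above; the annual overdraft volume is a made-up placeholder, so the absolute dollar amounts are purely illustrative.

```python
# Back-of-the-envelope revenue impact of the proposed benchmark fees.
# The $26 average fee and benchmark options come from the article above;
# the yearly overdraft count is a hypothetical placeholder.
current_avg_fee = 26.00
benchmark_fees = [3.00, 6.00, 7.00, 14.00]
overdrafts_per_year = 10_000_000  # made-up volume for one large FI

current_revenue = current_avg_fee * overdrafts_per_year
for fee in benchmark_fees:
    lost = (current_avg_fee - fee) * overdrafts_per_year
    print(f"${fee:>5.2f} benchmark: revenue drops by ${lost:,.0f} "
          f"({lost / current_revenue:.0%} of today's overdraft revenue)")
```

Even at the highest benchmark option ($14), under these assumptions an FI would give up nearly half of its current overdraft revenue, which is why the article focuses on replacement revenue streams.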

👀 You can read the full document from the CFPB here.

Other changes for the financial space from the executive order have included:

- The U.S. Department of the Treasury submitted a report to the White House Council to show that fintech needs more regulations.
- The CFPB proposed a new rule to lower credit card late fees.
- The CFPB is making credit card comparison more accessible.

For all banks and credit unions, these new rules can have a significant impact. The CFPB is setting new standards in the FI industry, which consumers will come to expect from all FIs, no matter the size.

The Overdraft Fee Changes: Making New Revenue Streams

The proposed regulations can significantly reduce overdraft fee revenue, prompting banks and credit unions to seek alternative revenue sources. While this presents a challenge, it also creates opportunities for proactive innovation and the maintenance of financial well-being.

Privacy Tools as Revenue Generators: A Win-Win Strategy

Privacy tools are a great way to start a new revenue stream. By offering them for a small monthly subscription fee, you can keep your customers safe from fraud and foster customer loyalty.

Here is the suite of privacy tools you could offer your customers:

Cyber Safety Tools:
- Password Manager: Offer customers a secure way to manage and store passwords, ensuring enhanced cybersecurity.
- Virtual Private Network (VPN): Provide a VPN service for safe and private online activities.
- Virtual Cards: Introduce virtual card solutions for secure online transactions, reducing the risk of fraud.

Secure ID Verification:
- Secure Communications: Enable encrypted communication channels, increasing customer trust in sensitive interactions.
- Encrypted Voice, Video, Email, and Messaging: Ensure end-to-end encryption for various communication channels, safeguarding customer privacy.

Safe Browsing:
- Private Browser: Introduce a private browsing experience to protect users from tracking and potential cyber threats.
- Ad & Tracker Blocker and Site Reputation Service: Enhance online safety with tools to block ads and trackers and to assess website reputations.

👷 No need to build it yourself. You can offer all of these products to your customers quickly by partnering with Anonyome Labs. Learn more here.

The Path Forward: Embracing Innovation

As FIs face a shifting financial landscape, embracing innovation is necessary for success. Integrating privacy products not only addresses revenue concerns but positions banks and credit unions as committed to customer security and satisfaction.

💡 To learn more about adding privacy tools to your offerings, click here.

👋 Or, to see exactly what privacy tools can do for your FI, request a demo from Anonyome today!

The post No More Overdraft Fees: How to Make Up Revenue   appeared first on Anonyome Labs.


FindBiometrics

Wendy’s Settles Its BIPA Beef – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Mobile Malware Collects Face Biometrics A new […]

Shyft Network

A Guide to Crypto Travel Rule Compliance in Japan

The Crypto Travel Rule threshold in Japan is US$3000. It has been in effect in the country since June 2023. During its evaluation, the FATF praised Japan’s anti-money laundering efforts and called them effective while also noting areas for potential improvement.

Japan was among the first countries to recognize the potential of the decentralized crypto ecosystem and took significant measures to position itself as the world’s crypto hub. One of the measures Japan took included the introduction of various crypto-related regulations, such as the FATF Travel Rule, aimed at preventing malicious entities from exploiting digital assets for illicit activities, including terrorist financing and money laundering.

Background of the Crypto Travel Rule in Japan

Japan legally recognized cryptocurrencies in 2016 and further solidified their status by amending the Payment Services Act in 2019 to treat crypto assets as legal property. It led to the establishment of clear guidelines for crypto exchanges, including the need for strict anti-money laundering measures.

The country later introduced the Crypto Travel Rule in 2022, requiring exchanges to share transaction data to combat money laundering. The FATF Travel Rule, however, did not become active in 2022; it came into force in June 2023.

The current threshold at which the Crypto Travel Rule applies to a crypto transaction in Japan is US$3,000. This means VASPs must transmit the required information only if the amount exceeds this limit.

Crypto Travel Rule Japan — Compliance Requirements

In line with FATF’s Recommendations, beneficiary and originator VASPs in Japan must obtain, hold, and make available on request to appropriate authorities the following information:

- Name and account number of the originator

- Place and date of birth of originator as well as address or national identity number, customer identification number

- Name and account number of the beneficiary

The information makes it possible for authorities to track the transaction routes of crypto assets and stablecoins.
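
As a concrete illustration of the threshold logic, here is a hedged Python sketch that checks whether a transfer crosses the US$3,000 line and which of the fields listed above are missing. The field names and functions are hypothetical; a real VASP would follow the message schema of its Travel Rule messaging provider.

```python
# Illustrative check of Japan's US$3,000 Travel Rule threshold.
# Field names and functions are hypothetical, for illustration only.
THRESHOLD_USD = 3000

REQUIRED_FIELDS = [
    "originator_name", "originator_account",
    "originator_address_or_id",   # address, national ID, or customer ID
    "beneficiary_name", "beneficiary_account",
]

def travel_rule_payload_needed(amount_usd: float) -> bool:
    """Required information must be transmitted only above the threshold."""
    return amount_usd > THRESHOLD_USD

def validate_payload(tx: dict) -> list:
    """Return the required fields that are missing from a transfer."""
    if not travel_rule_payload_needed(tx["amount_usd"]):
        return []
    return [f for f in REQUIRED_FIELDS if not tx.get(f)]

tx = {"amount_usd": 4500, "originator_name": "Taro Yamada",
      "originator_account": "acct-123"}
print(validate_payload(tx))
# ['originator_address_or_id', 'beneficiary_name', 'beneficiary_account']
```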

Virtual Asset Service Providers, according to Japan’s Payment Services Act (PSA), are those that are involved in:

- Purchase, sale, and exchange of crypto assets

- Intermediating, brokering, or acting as an agent for the above-mentioned activities

- Managing customers’ money in connection with all the above-mentioned activities

- Managing crypto assets for the benefit of another person

Now, a recent development seeks to refine this regulatory framework further, with the FSA proposing to simplify the Crypto Travel Rule on January 26th this year. The new draft requires VASPs to report restricted crypto-assets to a certified payment service association and publicly disclose this information. The agency is currently receiving public comments on the proposal until February 26th, 2024.

Japan’s self-regulated crypto organization, JVCEA, also noted in its “2024 New Year’s Thoughts” notice that by revising the Act on Prevention of Transfer of Criminal Proceeds (APTCP) to implement the FATF Travel Rule, it has become the “first major developed country to do so.”

Impact on Cryptocurrency Exchanges and Wallets

In Japan, centralized cryptocurrency exchanges are required to register with the FSA, which is responsible for overseeing banking, securities, and exchange sectors. Meanwhile, the Payment Services Act has been applied to all crypto exchanges operating in Japan since April 2017.

Moreover, through amendments to its Payment Services Act, Japan already requires exchanges to implement Know Your Customer (KYC) and Anti-Money Laundering (AML) measures. Under this, crypto exchanges need to provide customer due diligence procedures, maintain records, ensure customer assets are secure, and improve their security.

Now, with the Crypto Travel Rule in place, these regulations are tightened further for crypto exchanges in Japan, as they now have to assess the money laundering and terrorist financing risks of self-hosted wallets as well.

The APTCP also plans to update its rules, requiring VASPs to collect sender and receiver information for transactions involving self-hosted wallets or unregistered VASPs, including when these transactions occur in jurisdictions where the Travel Rule is not yet enforced.

Although Japan does not mandate covering transactions with counterparties in locations that haven’t yet implemented the Travel Rule regulations, VASPs must still evaluate the risks and gather necessary information. To aid this process, the FSA regularly updates its list of countries that meet Japan’s Travel Rule standards.

Crypto Travel Rule Japan — Global Context and Comparisons

Japan’s Travel Rule sets a US$3,000 threshold for cryptocurrency transactions, higher than Singapore’s SGD 1,500 and Canada’s CAD 1,000, and unlike many countries that have no threshold. This stringent threshold is part of Japan’s broader anti-money laundering and counter-terrorist financing strategy, which the FATF has deemed effective while also highlighting areas that could benefit from further improvements.

Moreover, according to PwC’s report, Japan is among only 23 countries that were advancing crypto-focused regulations and legislation in 2023 across all four key focus areas: Travel Rule compliance, stablecoin regulation, licensing and listing guidance, and crypto framework development.

Concluding Thoughts

In conclusion, Japan’s adherence to clear guidelines for crypto companies and global anti-money laundering standards, exemplified by initiatives like the Crypto Travel Rule, reinforces its standing in the crypto industry amid heightened global regulatory scrutiny.

Relevant Reads

A Guide to FATF Travel Rule Compliance in the United States

A Guide to FATF Travel Rule Compliance in Hong Kong

Comprehensive Guide to Crypto Regulations and FATF Travel Rule in Singapore

A Guide to FATF Travel Rule Compliance in the UAE

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

A Guide to Crypto Travel Rule Compliance in Japan was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


SC Media - Identity and Access

Hackers ‘steal your face’ to create deepfakes that rob bank accounts

As hacking techniques evolve to capture video recognition data, security pros say the industry will need more AI-based tools that focus on zero-trust, threat detection, and MFA.



Quiet privacy policy modifications allowing data mining for AI to face FTC charges

Artificial intelligence companies and other organizations that do not properly inform consumers regarding privacy policy changes that would permit the utilization of their data for strengthening AI tools are poised to face charges from the Federal Trade Commission, according to The Record, a news site by cybersecurity firm Recorded Future.



Over a decade's worth of US Internet emails leaked

Major Minnesota-based regional internet service provider U.S. Internet had internal emails and emails from thousands of individuals served by its Securence division spanning over a decade exposed due to an unsecured server, according to Krebs on Security.



Nearly 385M records exposed by misconfigured Zenlayer cloud database

Global on-demand edge cloud services provider Zenlayer had almost 385 million records, or 57.46 GB of data, exposed as a result of a misconfigured cloud database that did not have any password protection, Hackread reports.


Entrust

Seller Impersonation Fraud: A Crisis of Identity in High-Value Real Estate Transactions

Back in March of 1950, the FBI added Willie “The Actor” Sutton to their list... The post Seller Impersonation Fraud: A Crisis of Identity in High-Value Real Estate Transactions appeared first on Entrust Blog.

Back in March of 1950, the FBI added Willie “The Actor” Sutton to their list of Ten Most Wanted Fugitives. During his tenure as an American bank robber, Sutton stole an estimated $2 million ($25 million in today’s dollars) and was known for using disguises. Police officer, maintenance worker, postal carrier, and telegraph messenger were a few of the roles the fraudster assumed to execute his heists.

When asked why he robbed banks, he replied, “because that’s where the money is.”

For a modern Willie Sutton, real estate seller impersonation might be considered the latest fraud method of choice. Fraudsters are focusing on compromising the identity of property owners to sell real estate they do not own. Their goal is to collect the funds a buyer intends to use for purchasing the property before other transaction participants realize the fraudster is not the legal owner.

Imagine owning a piece of land where you haven’t built anything yet, and one day as you drive past it you discover that there is a construction site on it. As you finally get to speak with the person in charge of the construction, you hear them claiming they rightfully own the land, with a perfectly valid deed of sale as proof.

How is this even possible in 2024, you may wonder? It’s simple: With the explosion of online business transactions over the last five years, technology has made it easy to buy and sell remotely – but not necessarily easy to do it securely.

It doesn’t take long for a bad actor to find leaked data online, and to spot weak identity verification methods in a remote real estate sales process. Personally identifiable information (PII) belonging to real estate owners, loan applicants, sellers, and real estate agents is valuable to bad actors looking to commit seller impersonation fraud.

Seller impersonation fraud follows a specific methodology, and understanding the nuances is critical in safeguarding your organization. Our latest white paper provides more details about how it happens, the challenges it creates, and best practices you can put in place to prevent these risks without negatively impacting your sales processes.

Whether you’re seeking detailed insights or tailored solutions, our team is ready to assist. Download the white paper or contact us to discuss your specific needs.

The post Seller Impersonation Fraud: A Crisis of Identity in High-Value Real Estate Transactions appeared first on Entrust Blog.


IBM Blockchain

Climate change examples

A wide range of organizations from the public and private sector are working on climate actions to address the causes of climate change. The post Climate change examples appeared first on IBM Blog.

What do global climate change and global warming look like? Surface temperature statistics paint a compelling picture of the changing climate: 2023, according to the European Union climate monitor Copernicus, was the warmest year on record—nearly 1.5 degrees Celsius warmer than pre-industrial levels.

To gain a holistic understanding of the current climate crisis and future climate implications, however, it’s important to look beyond global average temperature records. The impacts of climate change may be organized into three categories:

- Intensifying extreme weather events
- Changes to natural ecosystems
- Harm to human health and well-being

Extreme weather events

While climate change is defined as a shift in long-term weather patterns, its impacts include an increase in the severity of short-term weather events.

- Heat waves: Dangerous heat waves are becoming more common and are one of the most obvious effects of climate change as the Earth’s temperature continues to rise.
- Droughts: Higher temperatures can cause faster water evaporation, making arid regions even drier. Climate change-linked shifts in atmospheric circulation can further exacerbate drought conditions as rain bypasses dry regions.
- Wildfires: Droughts and faster water evaporation can lead to drier vegetation, fueling larger and more frequent wildfires. According to NASA, even typically rainy regions will be more vulnerable to wildfires, and wildfire seasons are extending around the globe.
- Heavy rain and tropical storms: Climate change alters precipitation patterns, with NASA reporting more frequent periods of excess precipitation. Scientists project further increases in tropical cyclone rainfall in particular, due to greater atmospheric moisture content.
- Increased coastal flooding: Sea level rises associated with global warming are leaving low-lying coastal areas vulnerable to greater flooding, according to the Intergovernmental Panel on Climate Change (IPCC).

Changes to natural ecosystems

Due to climate change, natural ecosystems are undergoing long-term changes and declines in biodiversity. Here are a few examples:

- Sea ice loss and melting ice sheets: Declining levels of Arctic sea ice threaten the habitats of species such as polar bears and walruses. Polar bears hunt seals in the Arctic sea ice habitat while walruses rely on the ice as a place to rest when they’re not diving for food. In Greenland and Antarctica, melting ice sheets are contributing to rising sea levels, endangering coastal ecosystems around the world.
- Damage to coral reefs: Ocean temperature increases in warmer climates from Australia to Florida are causing coral reefs to lose colorful algae, leading to what’s known as “coral bleaching.”
- Ocean acidification: Marine life is also at risk from ocean acidification, stemming from greenhouse gas emissions and the greater concentration of carbon dioxide in the atmosphere. That carbon dioxide is absorbed by seawater, leading to chemical reactions that make oceans more acidic. Shellfish are especially vulnerable to ocean acidification, which NOAA describes as having “osteoporosis-like effects” on oysters and clams.
- Invasive species proliferation: Warmer temperatures allow invasive species to move to new areas, often to the detriment of native wildlife. The spread of the purple loosestrife plant in North America, for instance, has reduced nesting sites and resulted in the decline of some bird populations.
- Harm to estuarine ecosystems: Droughts reduce freshwater flows and increase salinity in estuaries, while greater precipitation increases stormwater runoff, introducing more sediment and pollution. These changes threaten the wildlife that rely on specific estuarine conditions to thrive.

Harm to human health and well-being

Climate change is increasingly impacting the quality of life on Earth, affecting people’s health and economic well-being.

- Illnesses and fatalities: Rising global temperatures foster conditions for infectious diseases to spread, and extreme weather events cause tragic loss of life as well as illnesses. Poor air quality from wildfire smoke can exacerbate asthma and heart disease, for example, while heat waves can cause heat exhaustion. More than 60,000 people died in European heat waves in 2022.
- Food insecurity: Droughts and scarcity of water supplies, severe storms, extreme heat and invasive species can cause crop failures and food insecurity. Most of those at risk of climate change-linked hunger are in Sub-Saharan Africa, South Asia and Southeast Asia, according to the World Bank.
- Financial consequences: Climate change can hurt businesses and individuals’ financial well-being. For example, changing weather patterns have imperiled wine production in California, while rising sea levels threaten the future of Caribbean coastal resorts. Meanwhile, insurance companies are increasingly declining to provide property insurance in areas vulnerable to extreme weather, leaving homeowners there at greater financial risk.
- Damage to infrastructure: Wildfires, powerful storms and flooding can damage energy grids, leading to power outages, as well as transportation networks, hindering people’s ability to access services and goods to meet their daily needs. Damage to one type of infrastructure can lead to consequences for another: As noted by the U.S. government’s National Climate Assessment, “failure of the electrical grid can affect everything from water treatment to public health.”

Hope for the future

Though some of the impacts on Earth’s climate are irreversible, a wide range of organizations from the public and private sector are working on climate actions that address the causes of climate change. These include ongoing mitigation strategies and targets for the reduction of greenhouse gas emissions, such as emissions of carbon dioxide and methane.

Meeting these targets relies in part on the growth of clean, renewable energy production that reduces the world’s reliance on energy derived from the burning of fossil fuels. Other climate science innovation could also contribute to climate change mitigation measures, ranging from carbon capture technology to methods of neutralizing ocean acidity.

Existing sustainable technologies can also help companies lower their carbon footprint. Artificial intelligence-powered analysis, for example, can help companies identify what parts of their operations produce the most greenhouse gas emissions; carbon accounting can inform their strategies on reducing those emissions.

Of utmost importance, scientists say, is acting quickly.

“If we act now,” IPCC Chair Hoesung Lee said in a 2023 statement, “we can still secure a livable sustainable future for all.”

Put your sustainability initiatives into action by managing the economic impact of severe weather and climate change on your business practices through the IBM Environmental Intelligence Suite.

Explore sustainability strategy Learn about climate and weather risk management

The post Climate change examples appeared first on IBM Blog.


KuppingerCole

Mar 07, 2024: Proactive Cyber Defense with Intelligent SIEM Platforms

Security information and event management (SIEM) solutions have dominated the enterprise security market for nearly two decades, but due to high operating costs, a shortage of skilled security experts, and the rapid pace of change in the business IT and cyber threat environments, traditional SIEMs are no longer effective. A new generation of SIEMs has emerged.

IBM Blockchain

How IBM is using Real User Monitoring and DNS to deliver premium Global Server Load Balancing for business-critical applications 

For larger enterprises or businesses with a worldwide user footprint, GSLB is an essential service to keep traffic distributed efficiently. The post How IBM is using Real User Monitoring and DNS to deliver premium Global Server Load Balancing for business-critical applications  appeared first on IBM Blog.

Global Server Load Balancing (GSLB) isn’t for everyone. If your business operates at a local or regional level, regular load balancing will probably meet your needs.   

Yet, for some larger enterprises or businesses with a worldwide user footprint, GSLB is an essential service. Having a “load balancer for load balancers” keeps your traffic distributed in an efficient way and ensures the performance that your customers expect from an internet-enabled application.  

The promise and pitfalls of in-line load balancers 

Just about every GSLB solution on the market today is an in-line solution. Whether it’s a traditional on-prem box, a SaaS solution, or something else, the architecture is basically the same.  

Routing all your traffic through a load balancer may be convenient architecturally, but it does introduce some significant downsides. First, there’s the problem of choke points. If your load balancer goes down, so does your application. You can add more resilience by tacking on more load balancers, but that’s merely diffusing the problem instead of solving it. 

In-line load balancers also lack visibility into the sources of inbound traffic. Since they only control pathways from the endpoint to back-end resources, in-line load balancers can’t see the impact of “last mile” traffic on latency and application performance. That connection is often the most important piece of the puzzle when you’re trying to deliver a consistent application experience across different geographies, device types and local network conditions. 

The role of DNS and RUM data in GSLB 

Even though the Domain Name System (DNS) is the underlying technology used by most in-line load balancers, there’s often a disconnect between DNS and GSLB when it comes to how most network teams structure their operations. Authoritative DNS and load balancing are often handled by separate teams with very little overlap in functional responsibility. 

IBM® concluded that this separation between DNS and GSLB is counterproductive. DNS holds the keys to a more effective, more resilient, and even less expensive GSLB. 

Authoritative DNS has always been able to control traffic across the entire connection pathway by removing points of failure introduced by the architecture of in-line GSLB solutions. Its out-of-band nature makes DNS an ideal solution to the challenge of resilience, all without the need to purchase and deploy more appliances or endpoints. 

“Last mile” latency is the second piece of the puzzle and it requires more than just the ability to steer traffic. It needs data to inform applications about the best way to connect to back-end workloads at any given moment.   

Real User Monitoring (RUM) data provides instant information about user experiences directly from devices and can form the basis of steering decisions that route traffic around deprecated resources or congested connections. The result: faster connections, better user experiences, and more resilience. 
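
As a toy illustration of the idea, the Python sketch below answers a steering decision by comparing recent median RUM latencies per client region and routing around unhealthy endpoints. The endpoint names, data, and functions are invented for illustration and are not the NS1 Connect API.

```python
# Toy sketch of RUM-informed steering: pick the endpoint with the lowest
# recent median latency for the client's region. Hypothetical names/data.
from statistics import median

# Latency samples (ms) reported by real user devices, keyed by
# (client_region, endpoint).
rum_samples = {
    ("eu-west", "pop-frankfurt"): [22, 25, 21, 30],
    ("eu-west", "pop-virginia"):  [95, 102, 99],
    ("us-east", "pop-frankfurt"): [98, 110, 104],
    ("us-east", "pop-virginia"):  [18, 20, 17, 19],
}

def steer(client_region: str, healthy_endpoints: set) -> str:
    """Choose the healthy endpoint with the lowest median RUM latency."""
    candidates = {
        ep: median(samples)
        for (region, ep), samples in rum_samples.items()
        if region == client_region and ep in healthy_endpoints
    }
    return min(candidates, key=candidates.get)

print(steer("eu-west", {"pop-frankfurt", "pop-virginia"}))  # pop-frankfurt
print(steer("eu-west", {"pop-virginia"}))  # routes around an outage
```

Because the decision is driven by measurements from real devices rather than probes from the load balancer itself, it captures exactly the “last mile” conditions the article describes.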

Disrupting load balancing with IBM NS1 Connect GSLB 

It’s almost strange that authoritative DNS and RUM data aren’t the solution of choice for GSLB.  IBM NS1 Connect® GSLB is going to significantly disrupt the load balancing market and change how in-line load balancers are viewed forever. Even better, DNS-based GSLB is far less expensive than in-line offerings currently on the market. 

SaaS authoritative DNS solutions like IBM NS1 Connect don’t involve the expense (and operational headache) of deploying boxes or buying more endpoint licenses. The additional cost of running authoritative DNS for load balancing is a tiny fraction of what most vendors charge for in-line solutions. 
 
The applications we use for work and play every day deserve better connectivity. We as consumers demand it, and organizations need the best connectivity during their shining moments. With greater resilience, faster connection performance, and a dramatically lower cost, it’s a win for both consumers and network teams. 

Join our webinar and download our eBook to see how DNS and RUM are a perfect combination for strategic traffic steering and application connectivity.  

Explore IBM’s GSLB

The post How IBM is using Real User Monitoring and DNS to deliver premium Global Server Load Balancing for business-critical applications  appeared first on IBM Blog.


1Kosmos BlockID

Combatting Biometric Spoofing


Biometric spoofing, also known as biometric spoof attacks or biometric presentation attacks, refers to the manipulation or falsification of biometric data to deceive a biometric authentication system.

What Is Biometric Spoofing?

Biometric spoofing refers to deceptive biometric authentication attempts by presenting fake samples related to fingerprints, facial scans, or iris scans–a “presentation attack.” While biometric identification methods are significantly stronger than password-based systems, this does not make them invulnerable. Modern threats are evolving to bypass such protections. As such, many strong authentication requirements in compliance and regulatory standards require some form of liveness detection that can determine if the credentials presented are real.

In general, liveness detection methods can be classified as active or passive:

Active methods require user interaction, such as performing a specific action. For example, an active detection method might require the user to undertake some action, like smiling or speaking. Passive methods work in the background without the user’s awareness. This method requires no direct interaction with the user.

Broadly speaking, active methods are typically harder to spoof but require more engagement from the user, which can impact usability. Passive methods are smoother and more seamless but present more opportunities for fraud.
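
As a rough illustration of the active approach, the Python sketch below issues a randomized challenge and verifies the response. The frame-analysis step is a stub standing in for a real computer-vision model, and all names are hypothetical rather than any product’s API.

```python
# Minimal sketch of an *active* liveness check: prompt a randomized
# challenge and verify the response. The analysis step is a stub; a real
# system would run a face/gesture model over the captured frames.
import random

CHALLENGES = ["blink twice", "smile", "turn head left", "say '7-4-2'"]

def detected_action(frames, challenge: str) -> bool:
    """Stub: stands in for a model that checks frames for the action."""
    return frames.get("action") == challenge

def active_liveness_check(capture_frames) -> bool:
    # Randomizing the challenge defeats simple replay attacks: a
    # pre-recorded video cannot anticipate which action will be asked.
    challenge = random.choice(CHALLENGES)
    frames = capture_frames(challenge)
    return detected_action(frames, challenge)

# Usage with a fake capture function for demonstration.
print(active_liveness_check(lambda c: {"action": c}))  # True
```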

What Are Some Types of Biometric Spoofing?

Because different forms of biometric spoofing align with various forms of authentication, each attempts to address the specific weaknesses and opportunities of those methods.
As such, presentation attacks can target various biometric modalities, including:

Facial Recognition Spoofing Attacks

Attackers may use different techniques to deceive facial recognition systems. Some of the common methods include:

- Print Attack: The attack uses a printed photograph of the target person’s face to trick the facial recognition system. This is one of the simplest methods and can be effective against less sophisticated systems (most of which need to be deployed in a context where they would protect important information).
- Replay Attack: Hackers record a video of the target person’s face and play it back in front of the camera. This approach is often more successful than a print attack since it incorporates motion, which some facial recognition systems may require.
- 3D Mask Attack: The attacker creates a realistic 3D mask of the target person’s face and wears it during authentication. This method can be more challenging to detect, but it’s equally challenging to do effectively without specific skills and equipment.
- Deep Fake Attack: An attack uses a machine learning/AI program to create a video of the target’s face. Deepfake technology can create convincing facial movements and expressions, making it difficult for some facial recognition systems to differentiate between real and fake.

Facial recognition liveness detection techniques can include analyzing facial movements like blinking or verifying 3D depth information. Additionally, while there isn’t a consensus on how convincing deepfakes are, new technology from Intel can look for artifacts that signal that a video is artificial, keeping deepfake attacks relatively niche.

Fingerprint Spoofing Recognition Attacks

Fingerprint verification systems, while generally secure, can still be vulnerable to spoofing if appropriate countermeasures are not in place. Some of the common fingerprint spoofing methods include:

- Fake Fingerprints: Hackers create artificial fingerprints using materials like gelatin that replicate the target user’s fingerprint pattern, often taken directly from a fingerprint. The fake fingerprint can then be placed over the attacker’s finger or a dummy finger to deceive the fingerprint scanner.
- Latent Fingerprints: An attacker lifts a target user’s latent fingerprint from a surface using adhesive tape or other methods and then transfers it onto a material that can deceive the fingerprint scanner.
- 3D-Printed Fingerprints: A sophisticated attack that involves someone creating a 3D model of the target user’s fingerprint using digital techniques and then 3D printing it with materials that mimic human skin properties. This method can create realistic replicas that can deceive some fingerprint scanners.

Countermeasures against these attacks include measuring finger skin temperature, moisture, or electrical properties to ensure the presented fingerprint comes from a live person.

Iris Recognition Spoofing Attacks

While iris recognition is generally considered to be a highly secure biometric modality, it can still be vulnerable to spoofing attacks if appropriate countermeasures are not in place.
Some of the common iris presentation attacks include:

- Digital Iris Images: Displaying a digital image or video of the target user’s iris on a device screen, such as a smartphone or tablet, and presenting it to the iris scanner. This method can use different lighting and sharpness settings on devices to fool some biometric scanners.
- Artificial Eyes or Contact Lenses: Creating an artificial eye or a custom contact lens with the target user’s iris pattern imprinted. These can be harder to detect if the contacts are well made.
- Physical Eyes: Although a rare and extreme method, using a preserved cadaver eye with the target user’s iris pattern can also deceive the iris recognition system. It would require someone to steal the eye of a deceased subject and use it relatively quickly to be effective, which may be its own form of deterrence.

To defend against iris presentation attacks, defenders may use techniques like examining the natural movement and contraction of the iris, verifying light reflection patterns, or analyzing the unique texture of the iris surface.

How Is Liveness Detection Used in Identity Assurance Level (IAL) Verification?

Identity Assurance Level (IAL) is a classification system used by the National Institute of Standards and Technology (NIST) Special Publication 800-63-3, “Digital Identity Guidelines,” to categorize the level of confidence in an individual’s asserted identity. These standards are often used to add layers of identity verification to processes involving sensitive government systems or classified data.

IAL2 is an intermediate level of assurance, the second of three levels. At IAL2, an individual’s identity must be verified through remote or in-person proofing processes, which involve validating and verifying identity information against trusted records and sources. Liveness detection plays a role in IAL2 by ensuring the integrity and authenticity of biometric data collected during the identity-proofing process.

Combating Biometric Spoofing with 1Kosmos

Biometrics can improve security by replacing passwords, but they can be subject to theft, spoofing, and decision bias. As discussed, bad actors are implementing approaches to bypass biometric assurances with increasing success.

The 1Kosmos platform performs a series of checks to prevent biometric-based attacks. For instance, 1Kosmos LiveID can perform both “active” liveness (requiring the user to perform randomized expressions) and “passive” liveness, one without the user’s involvement. Additionally, 1Kosmos utilizes true-depth camera functionality to prevent presentation attacks and offers an SDK to protect against camera manipulation to prevent an injection attack. Alongside these advances, 1Kosmos BlockID also offers the following features:

- Anti-Spoofing Algorithms: 1Kosmos anti-spoofing algorithms detect and differentiate between genuine biometric data and spoofed data. Our algorithms analyze factors like texture, temperature, color, and movement to determine the authenticity of the biometric sample, catching virtual/hardware camera and JavaScript injections and ensuring the validity of the transmitted identity.
- Data Encryption: 1Kosmos ensures that biometric data is encrypted both during transmission and storage to prevent unauthorized access. Implementing strict access controls and encryption protocols prevents man-in-the-middle and protocol injections, ensuring the validity of the transmitted identity.
- Regular Audits and Penetration Testing: 1Kosmos conducts regular audits and penetration testing to identify and address vulnerabilities, including access to a user’s biometric data. This helps ensure that security measures are effective and up to date.
- Regulatory Compliance: 1Kosmos complies with regulations and standards related to biometric data protection and security, such as the National Institute of Standards and Technology (NIST 800-63-3), iBeta DEA EPCS, UK DIATF, General Data Protection Regulation (GDPR) and Know Your Customer/Employee (KYC/KYE). For a list of certifications, click here.
- Human “Failover”: 1Kosmos offers 24×7 staffed call centers to assist when an attack is detected or if a user has trouble completing a verification process.

Learn how 1Kosmos can help your organization modernize Identity and Access Management and prevent biometric-based attacks—visit our Architectural Advantage page and schedule a demo today.

The post Combatting Biometric Spoofing appeared first on 1Kosmos.


Ocean Protocol

Winners of the Ocean Protocol Holiday Build-A-Thon

This hackathon-like competition started before the December holidays. It ran through mid-January, providing participants a one-month window to submit proposals for leveraging Ocean Protocol technology in business applications. Here are the top 3:

A Brief Synopsis

Using Ocean Protocol Technology, the Holiday Build-A-Thon offered a unique chance for data enthusiasts, developers, and imaginative scientists to collaborate and pioneer data economy-inspired products and services.

Results

In December, thirty-two teams registered, but only three emerged victorious. Each of these three teams demonstrated exemplary execution, as evaluated by our scoring committee. The primary goal was to explore the participants’ ability to think innovatively and apply creativity in devising fresh and engaging ideas for new projects using Ocean technology.

Top 3

1st Place — Marco Rodrigues

Marco impressed the review committee with his innovative web application, Post3 Engine. Post3 is designed to extract and analyze data from Web3 media platforms. The Post3 platform addresses a recurring demand for searchability and data analysis in Web3 news, alerts, and digital media: content that is easy to digest and understand, and that offers insights into trends and business intelligence. This proposition underscores the potential of leveraging Web3 technologies to increase digital content ownership and permanency. Congratulations to Marco on his award-winning proposal!

Check out Marco’s Proof of Concept Website:

Post3 Links

The repository related to the website platform above is below:

GitHub - macrodrigues/post3-engine: A platform to upload Ocean datasets and call models to obtain different insights

Marco's project summary and building experience, in his own words:

How I Built a Web Application to Draw Insights From Articles Acr...

2nd Place — Stigma

This proposal aims to improve how delivery and ride-sharing services plan their routes, making food delivery and ride-sharing more efficient through better route planning. Congratulations to this fantastic idea for winning second place! To explain the goal, here’s a video summary:

Stigma proposed an idea to shift the focus away from big companies like Uber and DoorDash and give more control to businesses and consumers themselves. The project introduces a Route Optimization app that cuts out the middleman in delivery services. Using Streamlit, Stigma developed a user-friendly web tool that improves how delivery routes are planned and managed. It is built with a focus on ease of use.

This application comes packaged with various functionalities, like distance matrix calculations, data encryption, route plotting, and data transfer using Ocean Protocol technology.

GitHub - Opt-Coders/OCEAN-route-optimization

A Route Optimization demonstration was built in a simple Streamlit app, and includes easy-to-follow steps:

Route Optimization DEMO

3rd Place — Mohammad Jamali

In the video below, Mohammad Jamali personally describes how he created an app that blends an NFT marketplace, a wallpaper manager, and a virtual 3D gallery to make the user experience more engaging and give artists a new way to display and monetize their art. He shares the motivation behind the app, the challenges faced, including technical hurdles with blockchain technology, and the unique features that distinguish it, such as real-time wallpaper updates from favorite artists.

Objectives & Outcomes:

Before: The task was for individuals or teams to create their own business proposal with real-life applicability. Areas like automation, data processing, and predictive analytics were potential fields to explore for solutions. Teams submitted their proposals when they signed up for the Build-A-Thon, making sure to address the key items in the evaluation criteria:

- Innovation
- Technology Implementation
- Social & Business Impact

After: The 3 best submissions included a data-driven statistics newsletter, advanced privacy and customization data tooling, and technology disruption in the delivery services industry. Each podium finisher uses Ocean technology to improve an existing workflow or demonstrates a use case in practical, real-world examples.

Congratulations to all 3 award-winning teams mentioned above. The teams followed an agenda built around these key dates:

Timeline:

Launch Date: Monday, December 7, 2023

Registration Deadline: Thursday, December 14, 2023, 11:59 PM

Final Entry Deadline: Friday, January 12, 2024 (last day for registered teams to submit their entry)

Conclusion: Monday, February 12, 2024

Next Steps

The 3 podium finishers were invited to explore further collaboration with Ocean through the Shipyard grants program, and the 2024 data challenge championship.

Benefits of participating in data challenges and hackathon-style events include opportunities in grant and fundraising rounds, expanded networking, sponsored research, and co-marketing/communication initiatives.

Opportunities to Partake in the Ocean Ecosystem

Future experimental business applications, scientific research, bi-weekly data science intensive competitions, and hypothesis testing can be found through Ocean Protocol Data Challenges.

Development & User Guidelines on Ocean Technology can be found in our docs: https://docs.oceanprotocol.com/

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress now on GitHub.

Winners of the Ocean Protocol Holiday Build-A-Thon was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


DF76 Completes and DF77 Launches

Stakers can claim DF76 rewards. DF77 runs Feb 15 – Feb 22, 2024. This round also includes a bug fix in reward calculation.

1. Overview

Ocean Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by locking OCEAN, curating data, and making predictions (in Predictoor). Here are DF docs.

Data Farming Round 76 (DF76) has completed. 150K OCEAN + 20K ROSE was budgeted for rewards. Rewards counting started 12:01 am Feb 8, 2024 and ended 12:01 am Feb 15. You can claim rewards at the DF dapp Claim Portal.

DF77 is live today, Feb 15. It concludes on Feb 22. 150K OCEAN and 20K ROSE are budgeted in total for rewards.

This post is organized as follows:

Section 2: DF structure
Section 3: How to earn rewards, and claim them
Section 4: Specific parameters for DF77

And for this post, we have a special Section 5 about a bug fix in DF reward calculation. The summary: some Data Farmers were underpaid in the last few rounds. We have fixed the issue, and affected Data Farmers can claim their reimbursement in the usual way, via the DF dapp.

2. DF structure

Passive DF. As a veOCEAN holder, you get passive rewards by default.

Active DF has two substreams:
– Volume DF. Actively curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), to earn more.
– Predictoor DF. Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

There are three ways to earn and claim rewards: passive DF (like before), Active DF: Volume DF (like before), and Predictoor DF (new).

Passive DF. To earn: lock OCEAN for veOCEAN, via the DF webapp’s veOCEAN page. To claim: go to the DF webapp’s Rewards page; within the “Passive Rewards” panel, click the “claim” button. The Ocean docs have more details.

Active DF
– Volume DF substream. To earn: allocate veOCEAN towards data assets, via the DF webapp’s Volume DF page. To claim: go to the DF Webapp’s Rewards page; within the “Active Rewards” panel, click the “claim” button (it claims across all Active DF substreams at once). The Ocean docs have more details.
– Predictoor DF substream. To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors. To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs. To claim ROSE rewards: see instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF77

This round is part of DF Main, phase 1.

Budget. This round has 150,000 OCEAN + 20,000 ROSE rewards in total, allocated as follows:

Passive DF: 50% of rewards = 75,000 OCEAN
Active DF: 50% of rewards
– Predictoor DF. 50% = 37,500 OCEAN + 20k ROSE
– Volume DF. 50% = 37,500 OCEAN

Networks. Passive DF applies to OCEAN locked on Ethereum mainnet. Predictoor DF applies to activity on Oasis Sapphire. Volume DF applies to data assets published on Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. Here is more information about Ocean deployments to networks.

Volume DF rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: the highest-DCV asset gets the most OCEAN, and so on. Then, for each asset and each veOCEAN holder (a code sketch follows this list):
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.
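To make the bounding arithmetic concrete, here is a minimal Python sketch of the steps above. The names are hypothetical, and the publisher boost is applied directly to the holder's stake share without re-normalizing across holders; Ocean's actual reward scripts differ in detail.

```python
# Hedged sketch of the Volume DF bounds described above; not Ocean's
# actual implementation.

WEEKS_PER_YEAR = 52

def weekly_volume_df_reward(stake_share, ocean_for_asset, is_publisher,
                            stake_ocean, dcv_ocean):
    """One holder's weekly reward for one asset, in OCEAN."""
    if is_publisher:
        stake_share *= 2                           # publisher boost: 2x effective stake
    reward = stake_share * ocean_for_asset         # baseline rewards
    apy_cap = stake_ocean * 1.25 / WEEKS_PER_YEAR  # 125% APY bound on the stake
    dcv_cap = dcv_ocean * 0.001                    # DCV * 0.1% bound (prevents wash consume)
    return min(reward, apy_cap, dcv_cap)

# Example: a 10% stake in an asset allotted 1,000 OCEAN this week
print(weekly_volume_df_reward(0.10, 1_000, False,
                              stake_ocean=5_000, dcv_ocean=200_000))  # -> 100.0
```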

Predictoor DF rewards are calculated as follows:

First, the DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in Active DF: tuning substreams and adjusting budgets among them. What remains constant is passive DF, and the total OCEAN rewards emission schedule.

Updates are always announced at the beginning of a round, if not sooner.

5. Bug fix in reward calculation

5.1 Overview

Some Data Farmers were underpaid in the last few rounds. They’ve now been compensated the difference. The rest of this section elaborates.

5.2 The Issue

In early February 2024, we identified an issue affecting DF reward calculation. The part of the code that limits rewards based on DCV (data consume volume) was supposed to apply the limit in units of OCEAN; however, it applied it in units of USD. This led to Volume DF rewards being lower than intended, starting in DF63 (the start of Predictoor DF). In total, 69 wallets were affected, receiving 64,501 OCEAN less than they should have.

5.3 Solution in Code

As soon as we discovered the issue, we fixed it by applying the limit in the correct units (OCEAN, not USD); a sketch of the fix follows below. Then we took steps to reimburse; see the next section.
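Here is a hedged, simplified illustration of the units mix-up (hypothetical names, not the actual DF code). Because the cap limits rewards that are paid in OCEAN, computing it on USD-denominated volume made it too tight whenever OCEAN traded below $1:

```python
# Hypothetical sketch: the DCV cap must be denominated in OCEAN, since
# it limits rewards that are paid in OCEAN.

def dcv_cap_buggy(dcv_usd):
    return dcv_usd * 0.001                 # bug: USD units, used as if OCEAN

def dcv_cap_fixed(dcv_usd, ocean_usd_price):
    dcv_ocean = dcv_usd / ocean_usd_price  # convert consume volume to OCEAN first
    return dcv_ocean * 0.001               # fix: cap in OCEAN units

# With OCEAN below $1, the buggy cap is too tight, so rewards were underpaid:
print(dcv_cap_buggy(10_000))               # 10.0, wrongly treated as OCEAN
print(dcv_cap_fixed(10_000, 0.50))         # 20.0 OCEAN, the intended cap
```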

5.4 Reimbursement

We always strive to do the right thing, and trust is of utmost importance to us. In this case, that means ensuring that every DF participant receives the DF rewards they earned.

We carefully calculated the missed amounts for each affected wallet. Section 5.5 lists each affected wallet, and the amount of OCEAN that was missing.

We are pleased to announce that these reimbursements have been processed and are now available for claiming. Participants affected by this issue can claim their reimbursement in the same manner as they would typically claim their DF rewards via https://df.oceandao.org/rewards.

5.5 OCEAN Reimbursed, per Wallet

+--------------------------------------------+-----------------+
| Address | OCEAN amount |
+--------------------------------------------+-----------------+
| 0x8475b523b5fa2db7b77eb5f14edabdefc2102698 | 9764.654452 |
| 0xa7d40704a7cf779c8f94b1f4b8a01e919ece9da3 | 8716.383238 |
| 0xc1b8665bae4389d95d558ff3a0e58c2a24625f63 | 6537.057703 |
| 0xac517ed8283d629dd01fac97ece9f91b218203f9 | 6535.678203 |
| 0x2e434c18ae93ee2da937222ea5444692ed265ac0 | 4354.251220 |
| 0xf2f98a98b87beeea252a924eac36b23422efb5fb | 3509.238175 |
| 0x8978be1b2082d10ea95533d2897ddab53afb97e9 | 3260.005369 |
| 0x663052ad99b85a8c35040c4fd1cc87620f4b61f1 | 2665.678372 |
| 0xf264cd686a5cda5a9dca0063331ba3ac2441f1fa | 2202.044770 |
| 0xf92d2ff667f905ec34396af4769ff6d04e7ec84e | 2181.689739 |
| 0x3e0ac30da7f58a3c7e015bfd4595b7fc21ce08e0 | 2176.536838 |
| 0xfd7b89861524b7f02fce8f9a88b73c078bb60061 | 2175.767963 |
| 0x26e4674c09cbbf0b367aae65d8d08b112e307a53 | 2175.576704 |
| 0x15558eb2aeb93ed561515a47441bf49250933ba9 | 2175.152904 |
| 0xb1e24789311f14b6270b2b4ed11a768bf9b547ff | 2084.413905 |
| 0x5f148fa6a7fe1f282341c8fc54b781e42b09a518 | 1539.577851 |
| 0xf0a8802509421df907188434d4fc230cf9271672 | 744.8884978 |
| 0xcfd1d657f820404fedb594e5b0981a1b03dd6bef | 515.8695648 |
| 0xddf33dfac858b7b588836a88a162e9b626a60431 | 302.4806709 |
| 0xcd2e9293bf8887bfb61c30740b728f3ad734c3f5 | 195.4049799 |
| 0x175437b00da09f18d89571b95a41a15aa8415eba | 77.65258919 |
| 0x916166aa5e24015e354c21a4ffd18f1ccc563fe8 | 52.08977826 |
| 0xc07c6c07b44e5a35105e63294ab799d8baaae502 | 44.16986781 |
| 0xcf8a4b99640defaf99acae9d770dec9dff37927d | 43.4747607 |
| 0x506d48b2e55073b412f56378b58cb4025411a597 | 40.98570415 |
| 0x7c28328f90a74fb83d463d359a2aec0d6a3512ee | 37.85599714 |
| 0xe0dc24a3d7478eb840dc63baa20fcb06cdb123be | 35.68623347 |
| 0x655efe6eb2021b8cefe22794d90293aec37bb325 | 30.29108552 |
| 0x7996756d47cca280637279346854aef4405cb882 | 29.85739368 |
| 0x813888135cfdc67bd68b79aea1d27863dd2b990a | 29.21173116 |
| 0xd13294fa603d9243fb9aeff338a5639ee1bd6599 | 27.62911322 |
| 0x4f20e69e7ba5ab2fb2ae25a1d17c93fe5307faa9 | 26.48376284 |
| 0xd8a7b5f1bf69b66a1ba464af5c6adba17c120eec | 25.32397297 |
| 0x006d0f31a00e1f9c017ab039e9d0ba699433a28c | 24.74086964 |
| 0x744f397d0d1bad2468f53cf33d6806f49f4a3fc1 | 23.27638782 |
| 0xaa26335c0d78087524a15d492ab7fc810ab47b66 | 16.67443834 |
| 0x4602f9ed469c6bbdabd7c3708ea01f9d1739941d | 14.23924217 |
| 0x2196b9fe1117df9324108d82dffc849492c23c14 | 11.77966672 |
| 0x5cdc664bcabe8c55699ce7e0e5b5b52dbff3fc44 | 10.73340056 |
| 0x4ac8e3fd41c06499fc63427ec3ae9d86d46d9d71 | 9.395761090 |
| 0xc5bf01b2b657823075204379743a7dff6e914da1 | 9.301161370 |
| 0x28f39db34bd36ba5dc9009015f67a57f90748422 | 7.809973608 |
| 0x454fc291c102c970ba266b20fb5b0f8c03af792a | 7.663219428 |
| 0x009ec7d76febecabd5c73cb13f6d0fb83e45d450 | 7.585875124 |
| 0x312107b47d020468fa97a28b1d537202e4a83fdf | 6.831125637 |
| 0xc84f2a72b6dbce73ef285d80b60f1b285e88196f | 6.814097828 |
| 0x64b0012681110fc8aa09f9e0eb29f64ec3010594 | 6.478189163 |
| 0xc76357b8a519e617e3fd76494c9e8538930d6078 | 5.994009374 |
| 0x64e98f813a2184c5b41e6c91aa1917d84fd09897 | 3.914065609 |
| 0xeb18bad7365a40e36a41fb8734eb0b855d13b74f | 3.106771961 |
| 0x11300251b903ba70f51262f3e49aa7c22f81e1b2 | 2.925225177 |
| 0xd95929b3d82f7f76afaf23ae716c691e489a43b1 | 2.456127636 |
| 0x341a85f21a1e2f60e70e1b5b326e1fc70af8da4b | 2.066781088 |
| 0x3984e965054644f4ebb53e56651d88e9698b644a | 1.249984361 |
| 0x06a2006ca85813e652506b865e590f44eae3928a | 0.904003486 |
| 0x4e779926cf83b9d2d6f206b1a77fe52b3a391b45 | 0.708954632 |
| 0x27f476724960e15aa405ce5a0d43c272a1faea0e | 0.385602980 |
| 0x97915e9449dac3085227f62e368af6158f277777 | 0.257356131 |
| 0x375bb5678079e8cd1ac192fbf375633913a136c7 | 0.242664758 |
| 0xedebbaba0fe81c1ea0c7069f2cf08036a2ce57ce | 0.179643342 |
| 0x518c46332f905fe6ce58e828519cd8a0215b0db2 | 0.170393229 |
| 0x035a209abf018e4f94173fdeabe5abe69f1efbed | 0.122208613 |
| 0x1bd00642155e8350f7d15728c1df8673697064c2 | 0.120553964 |
| 0xfd50d4d48896e076a02bd5f7534df1244948eaae | 0.045053166 |
| 0x519a621dbf0bef97ba0475eff78cb1f3d819b691 | 0.004050152357 |
| 0xdc17e4309f20924c98de7be633272562abaf27fb | 0.004050152357 |
| 0x6c5a1c8fae7a46c3cc143d5a67919b776d390f09 | 0.004050152357 |
| 0xb54531f038d85be6a8ad09e391f95c6431a50212 | 0.004050152357 |
| 0x921d58c0cc78aaf04652ff28e68f66d15ac7b73e | 0.004050152357 |
+--------------------------------------------+-----------------+
| Total OCEAN | 64501.25617 |
+--------------------------------------------+-----------------+

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF76 Completes and DF77 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


DF75 Completes and DF76 Launches

Stakers can claim DF75 rewards. DF76 runs Feb 8 – Feb 15, 2024.

[Note: this article was meant to be published on Feb 8, 2024].

1. Overview

Ocean Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by locking OCEAN, curating data, and making predictions (in Predictoor). Here are DF docs.

Data Farming Round 75 (DF75) has completed. 150K OCEAN + 20K ROSE was budgeted for rewards. Rewards counting started 12:01 am Feb 1, 2024 and ended 12:01 am Feb 8. You can claim rewards at the DF dapp Claim Portal.

DF76 is live as of Feb 8, 2024. It concludes on Feb 15. 150K OCEAN and 20K ROSE are budgeted in total for rewards.

This post is organized as follows:

Section 2: DF structure
Section 3: How to earn rewards, and claim them
Section 4: Specific parameters for DF76

2. DF structure

Passive DF. As a veOCEAN holder, you get passive rewards by default.

Active DF has two substreams:
– Volume DF. Actively curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), to earn more.
– Predictoor DF. Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

There are three ways to earn and claim rewards: passive DF (like before), Active DF: Volume DF (like before), and Predictoor DF (new).

Passive DF. To earn: lock OCEAN for veOCEAN, via the DF webapp’s veOCEAN page. To claim: go to the DF webapp’s Rewards page; within the “Passive Rewards” panel, click the “claim” button. The Ocean docs have more details.

Active DF
– Volume DF substream. To earn: allocate veOCEAN towards data assets, via the DF webapp’s Volume DF page. To claim: go to the DF Webapp’s Rewards page; within the “Active Rewards” panel, click the “claim” button (it claims across all Active DF substreams at once). The Ocean docs have more details.
– Predictoor DF substream. To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors. To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs. To claim ROSE rewards: see instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF76

This round is part of DF Main, phase 1.

Budget. This round has 150,000 OCEAN + 20,000 ROSE rewards in total, allocated as follows:

Passive DF: 50% of rewards = 75,000 OCEAN
Active DF: 50% of rewards
– Predictoor DF. 50% = 37,500 OCEAN + 20,000 ROSE
– Volume DF. 50% = 37,500 OCEAN

Networks. Passive DF applies to OCEAN locked on Ethereum mainnet. Predictoor DF applies to activity on Oasis Sapphire. Volume DF applies to data assets published on Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. Here is more information about Ocean deployments to networks.

Volume DF rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: highest-DCV asset gets most OCEAN, etc. Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.

Predictoor DF rewards are calculated as follows:

First, the DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in Active DF: tuning substreams and adjusting budgets among them. What remains constant is passive DF, and the total OCEAN rewards emission schedule.

Updates are always announced at the beginning of a round, if not sooner.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF75 Completes and DF76 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 14. February 2024

FindBiometrics

Yoti Provides Age Assurance Tech for Kid-friendly Social Platform

Wizz, the operator of an eponymous social media app for children, has now rolled out a biometric age check system, and made it mandatory for all users. The technology has been […]

SC Media - Identity and Access

Glow fertility tracker forum vulnerability leaked entire user base's data

All users of the Glow fertility tracking app, totaling nearly 25 million individuals, had their personal data exposed as a result of a vulnerability in the app's online forum, according to TechCrunch.



Year-old Pentagon breach only disclosed now

The U.S. Department of Defense has begun informing current and former employees, partners, and job applicants regarding the potential exposure of their personally identifiable information stemming from a service provider's inadvertent leak of several emails between Feb. 3 and Feb. 20, 2023, reports DefenseScoop.



FindBiometrics

AI Risks, Biometric ID, and a New Concept Car – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Expert Committee to Weigh AI Risks in […]

auth0

A Passwordless Future: Passkeys for Developers

Passkeys and WebAuthn for developers. Learn how passkeys work and the benefits they provide.

IBM Blockchain

How to achieve Kubernetes observability: Principles and best practices

Learn how Kubernetes observability works, and how organizations can use it to optimize cloud-native IT architectures. The post How to achieve Kubernetes observability: Principles and best practices appeared first on IBM Blog.

Kubernetes (K8s) is the leading approach to packaging, deploying and managing containerized applications at scale. The dynamic, open-source, microservices-based configuration of Kubernetes can be a great fit for businesses that are looking to maximize infrastructure agility. However, the distributed flexibility that makes Kubernetes appealing can also make implementing Kubernetes monitoring and observability practices challenging.

Observability comprises a range of processes and metrics that help teams gain actionable insights into a system’s internal state by examining system outputs. It’s an essential part of maintaining any IT infrastructure. But managing the sheer volume of data, nodes, pods, services and endpoints that comprise Kubernetes environments requires observability practices that are appropriate for the job.

In this blog, we discuss how Kubernetes observability works, and how organizations can use it to optimize cloud-native IT architectures.

How does observability work?

Broadly speaking, observability describes how well internal system states can be inferred from external outputs. It’s the ability to diagnose and understand why a system is behaving in a particular way, which is vital to troubleshooting, deciphering performance issues and improving system design.

In DevOps, the concept of observability has evolved to refer to the end-to-end visibility of a system state as dictated by telemetry data. The primary data classes used—known as the three pillars of observability—are logs, metrics and traces.

Logs

Logs include discrete events recorded every time something occurs in the system, such as status or error messages, or transaction details. Kubernetes logs can be written in both structured and unstructured text.
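For instance, a containerized service might emit structured logs as one JSON object per line so a collector such as Fluentd can parse pod stdout into queryable records. This is a minimal Python sketch; the field names are illustrative:

```python
import json
import logging
import sys

# Log to stdout so the container runtime (and any collector tailing it)
# picks the records up.
logging.basicConfig(stream=sys.stdout, level=logging.INFO, format="%(message)s")

def log_event(event: str, **fields) -> None:
    """Emit one structured log record as a single JSON line."""
    logging.info(json.dumps({"event": event, **fields}))

log_event("payment_processed", order_id="A-123", status="ok", latency_ms=42)
```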

Metrics

Metrics are numerical measurements such as CPU usage, memory consumption, network I/O, request latency or any business-specific indicators. Kubernetes metrics are often aggregated to create time-series observability data that can help teams spot trends and identify patterns.
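As a sketch of the metrics pillar, an application can expose counters and histograms for Prometheus to scrape using the prometheus_client library; the metric names and port below are illustrative:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# App-level metrics: a request counter and a latency histogram.
REQUESTS = Counter("app_requests_total", "Total requests handled", ["path"])
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://<pod>:8000/metrics
    while True:
        with LATENCY.time():              # observe how long the "work" takes
            time.sleep(random.random() / 10)
        REQUESTS.labels(path="/checkout").inc()
```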

Traces

Traces help teams follow a request or transaction through the various services and components of a distributed system. They also help teams visualize the dependencies between different components of an infrastructure so that delays and errors can be located quickly.
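A minimal tracing sketch with the OpenTelemetry Python SDK follows; the console exporter stands in for whatever backend (Jaeger, an OTLP collector) a cluster actually uses:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire up a tracer that prints finished spans to stdout.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

# A parent span for the request, with a nested span for a downstream call.
with tracer.start_as_current_span("checkout") as span:
    span.set_attribute("order.id", "A-123")
    with tracer.start_as_current_span("charge-card"):
        pass  # the dependent service call would happen here
```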

Achieving successful observability requires the deployment of appropriate Kubernetes monitoring tools and the implementation of effective processes for collecting, storing and analyzing the three primary outputs. This might include setting up and maintaining monitoring systems, application log aggregators, application performance management (APM) tools or other observability platforms.

However, Kubernetes environments also necessitate a more thorough examination of standard metrics. Kubernetes systems comprise a vast environment of interconnected containers, microservices and other components, all of which generate large amounts of data. Kubernetes schedules and automates container-related tasks throughout the application lifecycle (a small API sketch follows this list), including:

Deployment

Kubernetes can deploy a specific number of containers to a specific host and keep them running in their desired state.

Rollouts

A rollout is a Kubernetes deployment modification. Kubernetes enables teams to initiate, pause, resume and roll back rollouts.

Service discovery

Kubernetes can automatically expose a container to the internet or other containers using a DNS name or IP address.

Autoscaling

When traffic spikes, Kubernetes can automatically spin up new containers to handle the additional workload.

Storage provisioning

Teams can set up Kubernetes to mount persistent local or cloud storage for containers.

Load balancing

Based on CPU utilization or custom metrics, Kubernetes load balancing features can distribute workloads across the network to maintain performance and stability.

Self-healing for high availability

Kubernetes can automatically debug, restart or replace a failed container to prevent downtime. It can also decommission containers that don’t meet health check requirements.
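As a hedged taste of the machinery behind these tasks, the sketch below uses the official Kubernetes Python client to scale a Deployment, the same kind of change a rollout or autoscaler applies automatically. The Deployment name and namespace are placeholders:

```python
from kubernetes import client, config

# Authenticate from the local kubeconfig; inside a pod you would use
# config.load_incluster_config() instead.
config.load_kube_config()
apps = client.AppsV1Api()

# Scale the (hypothetical) "web" Deployment in "default" to 5 replicas.
apps.patch_namespaced_deployment_scale(
    name="web",
    namespace="default",
    body={"spec": {"replicas": 5}},
)
```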

With so many shifting, interacting and layered components come just as many potential issues and failure points, and therefore many areas where real-time monitoring becomes a necessity. It also means that a conventional approach to monitoring logs, metrics and traces might prove insufficient for observability in a Kubernetes environment.

Kubernetes observability principles

Because every component in a Kubernetes architecture depends on the others, observability requires a more holistic approach.

Kubernetes observability requires organizations to go beyond collecting and analyzing cluster-level data from logs, traces and metrics; connecting data points to better understand relationships and events within Kubernetes clusters is central to the process. This means that organizations must rely on a tailored, cloud-native observability strategy and scrutinize every available data source within the system.

Observability in a K8s environment involves:

1. Moving beyond metrics, logs and apps. Much like virtual machine (VM) monitoring, Kubernetes observability must account for all log data (from containers, master and worker nodes, and the underlying infrastructure) and app-level metrics. However, unlike VMs, Kubernetes orchestrates container interactions that transcend apps and clusters. As such, Kubernetes environments house enormous amounts of valuable data both outside and within network clusters and apps. This includes data in CI/CD pipelines (which feed into K8s clusters) and GitOps workflows (which power K8s clusters).

Kubernetes also doesn’t expose metrics, logs and trace data in the same way traditional apps and VMs do. Kubernetes tends to capture data “snapshots,” or information captured at a specific point in the lifecycle. In a system where each component within every cluster records different types of data in different formats at different speeds, it can be difficult—or impossible—to establish observability by simply analyzing discrete data points.

What’s more, Kubernetes doesn’t create master log files at either the app or cluster level. Every app and cluster records data in its respective environment, so users must aggregate and export data manually to see it all in one place. And since containers can spin up, spin down or altogether disappear within seconds, even manually aggregated data can provide an incomplete picture without proper context.
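To see why manual aggregation is brittle, consider this sketch using the official Kubernetes Python client: it tails a few lines from every pod, but any pod that terminates mid-loop simply vanishes from the picture, illustrating the incompleteness described above.

```python
from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()
v1 = client.CoreV1Api()

# Pull the last few log lines from every pod in the cluster.
for pod in v1.list_pod_for_all_namespaces().items:
    name, ns = pod.metadata.name, pod.metadata.namespace
    try:
        tail = v1.read_namespaced_pod_log(name, ns, tail_lines=5)
        print(f"--- {ns}/{name} ---\n{tail}")
    except ApiException:
        # The pod may have been rescheduled or deleted between the
        # list call and the log read: exactly the gap described above.
        pass
```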

2. Prioritizing context and data correlation. Both monitoring and observability are key parts of maintaining an efficient Kubernetes infrastructure. What differentiates them is a matter of objective. Whereas monitoring helps clarify what’s going on in a system, observability aims to clarify why the system is behaving the way that it is. To that end, effective Kubernetes observability prioritizes connecting the dots between data points to get to the root cause of performance bottlenecks and functionality issues.

To understand Kubernetes cluster behavior, you must understand each individual event in a cluster within the context of all other cluster events, the general behavior of the cluster, and any events that led up to the event in question.

For instance, if a pod starts in one worker node and terminates in another, you need to understand all the events that are happening simultaneously in the other Kubernetes nodes, and all the events that are happening across your other Kubernetes services, API servers and namespaces to get a clear understanding of the change, its root cause, and its potential consequences.
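One starting point for that kind of correlation is the cluster's event stream. This sketch (again with the official Python client) lists recent events across namespaces so that, say, a pod termination can be read alongside everything else happening around it:

```python
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

# Recent cluster events: scheduling decisions, kills, probe failures, etc.
for ev in v1.list_event_for_all_namespaces(limit=50).items:
    print(
        ev.last_timestamp,
        ev.involved_object.kind,
        ev.involved_object.name,
        ev.reason,
        ev.message,
    )
```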

In other words, merely monitoring tasks is often inadequate in a Kubernetes environment. To achieve Kubernetes observability, get relevant system insights or conduct accurate root cause analyses, IT teams must be able to aggregate data from across the network and contextualize it.

3. Using Kubernetes observability tools. Implementing and maintaining Kubernetes observability is a large, complex undertaking. However, using the right frameworks and tools can simplify the process and improve overall data visualization and transparency.

Businesses can choose from a range of observability solutions, including programs that automate metrics aggregation and analysis (like Prometheus and Grafana), programs that automate logging (like ELK, Fluentd and Elasticsearch) and programs that facilitate tracing visibility (like Jaeger). Integrated solutions, like OpenTelemetry, can manage all three major observability practices. And customized, cloud-native solutions, like Google Cloud Operations, AWS X-Ray, Azure Monitor and IBM Instana Observability, offer observability tools and Kubernetes dashboards optimized for clusters that are running on their infrastructure.

Best practices for optimizing Kubernetes observability

• Define your KPIs. Figure out which key performance indicators, like app performance, system health and resource usage, give you the most useful insights into your infrastructure’s behavior. Revise them as needed.
• Centralize logging. K8s environments generate massive amounts of data. Aggregating and storing it using a centralized logging solution is integral to data management.
• Monitor resource usage. Collect real-time data on memory, CPU and network usage so you can proactively scale resources when necessary.
• Set up alerts and alarms. Use established KPI thresholds to configure alerts and alarms. This practice allows teams to receive timely notifications when issues arise.

Establish Kubernetes observability with IBM® Instana® Observability

Kubernetes is the industry-standard container orchestration platform, managing containerized workloads with remarkable efficiency. However, the distributed, multi-layered microservices architecture of Kubernetes demands robust observability mechanisms and advanced solutions, like IBM Instana Observability.

Instana Observability provides automated Kubernetes observability and APM capabilities that are designed to monitor your entire Kubernetes application stack—from nodes and pods to containers and applications—for all Kubernetes distributions.

Observability in Kubernetes is not just a technical implementation; it’s a strategic approach that requires attentive planning and an organizational culture that values data transparency.

Instana Observability helps teams gain a comprehensive understanding of their Kubernetes environments and deliver robust, high-performing applications in an increasingly cloud-based world.

Explore Instana Observability

The post How to achieve Kubernetes observability: Principles and best practices appeared first on IBM Blog.


auth0

Reasons to Fall in Love with Passkeys

This Valentine’s Day ditch passwords and fall in love with passkeys, the future of authentication!

Dark Matter Labs

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve…

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve currency

In Blogs 1–3 of this series we have been exploring the conceptual aspects of an alternative monetary system. In this final blog, we would like to shift gear and channel our thinking into a series of practical next steps.

The blog series is centred on the following enquiries and this reflection aims to address question four:

What are the issues that make money (and our dominant monetary systems) so problematic? (Blog 1)
Can we use design principles to help us imagine desirable future scenarios? (Blog 2)
What could a desirable future scenario actually look like? (Blog 3)
What can we start building and testing now to begin scaffolding a parallel system?

Whilst writing this post, we began to feel a niggling sense of doubt. Something about it felt quite anticlimactic. In this initiative we started with a bold vision underpinned by some detailed research of the problem space. Now, as is so often the case, we are grappling with the details of how to make things happen on the ground. In the messy reality of managing relationships, roles, time constraints and funding pressures, our suggestions for a way forward seem insubstantial. Rather than trying to pad them out or use technical language to shield the lack of clarity, we have decided to go ahead and share where we actually are, both in our thinking and in our aspiration for this work.

Questions we need to begin working on

We are publishing this Blog at an early stage of our thinking. At this point our ambitions far outweigh our intellectual ability and team capacity. This is overwhelming, exciting and humbling! To break through this paralysis we are sharing some open questions based on our initial internal brainstorming sessions and responses to the previous blogs (credited below). If you have reflections or questions of your own then we would really welcome comments:

– What would a low tech, minimum viable proof of possibility look like?
– How can we use existing technology meaningfully in different contexts (for example a rural area in the Global South)?¹
– How can we account for different closed loop systems in their own rights without going down the data rabbit hole?
– Can we build on the energy and intelligence of the Geospatial Community to leverage data in a richly contextual yet feasible format?²
– How can we ensure that the idea of a Bioregion doesn’t perpetuate a fractured worldview?
– How do we decide which cycles or measurements are the most important for a region? What are we pegging a system’s health against? How can we estimate what the optimal regenerative potential might be?
– What can people do with the individual tokens that they receive? Why would anyone want to take part in a project?
– How will the main RegenCoins be exchanged both within and outside of the region?
– What can we learn from previous attempts to implement local currencies such as those pioneered by the Transition Town Network?³
– How can we make sure that the ability for different people to earn tokens is fair? For example, if somebody lives close to a river it might be easier to demonstrate acts of care than for somebody who lives further away.

Existing technology, tools and research that we can build on

We are both sure (and hugely relieved) that we are far from the first people to explore this subject. Many of the ideas outlined in the previous blogs have been inspired by the work of others, and we will continue to learn and incorporate as many diverse perspectives as possible. As we move towards our first prototype we have identified a number of initiatives that we think will provide invaluable insights:

– Using market based credit platforms as a basis for quantifying living systems: the level of data being collected is already significant. For example the Seed Biocomplexity Index has been developed to link geospatial satellite data with scientific literature. Another example is the Regen Network which describes itself as ‘a platform to originate and invest in high-integrity carbon and biodiversity credits from ecological regeneration projects’. Although we are categorically not looking to price elements of the bioregion, the research underpinning some of the Network’s projects (e.g. pollinator health) could be a powerful learning framework. We are also interested to learn from less established initiatives such as land trust tokens being linked to river basin resilience programmes in South Africa¹.
– Taxes pegged to positive activities: we hope to learn from proposals for tax incentive schemes that are specifically designed to encourage stewardship activities. An example is the Malta Atlantica Restoration Currency that is being proposed in Brazil.
– Universal basic income / services pilots: there are a number of live and closed UBI/S pilots that we are intrigued to learn from. Of particular interest is to try to understand how people’s behaviour and perception of value is affected, when the link between monetary transactions and daily activities is less linear.³
– Statistical forecasting models used in decision making: there is an exciting body of research emerging in the context of applying Bayesian forecasting techniques to environmental modelling. We are imagining that this might link closely to crypto economic research currently being conducted at hubs such as WU Vienna and ETH Zurich⁴. Partnering with an academic centre to bring their research into our prototype will be a priority.
– Relational, flow based technology: the Holochain Community has developed a number of different peer-to-peer tools and resources that have strong value alignment with this proposal. In particular, the hREA value flows specification could provide a scaffold to connect the different activities that we would like to explore.
– Currency and token design: community experts such as Arthur Brock and Grace Rachmany have deep knowledge and experience with alternative currency design. We would aim to consult and build on this expertise rather than trying to reinvent the wheel.
– Thrutopia storytelling: the novelist and podcast host Manda Scott describes Thrutopias as ‘clear, engaging routes through to a world we’d all be proud to bequeath to future generations’. As we strive to test pathways in the present, we do not want to lose touch with the imaginative potential that we started to engage with in Blog 3. Prominent authors such as Kim Stanley Robinson⁵ and Yanis Varoufakis⁶ have explored ideas such as a carbon coin and free PerCap accounts within sci-fi narratives, and we will seek to engage with emerging writing in this field.⁷

A prototype proposition

As an initial call to action, we would like to partner with a community (or a number of communities) to build the foundations for the world’s first decentralised, bioregional bank. To make the project realistic we imagine that the areas involved will need to be small enough to make the data capture manageable. However they will also need to represent a number of different biophysical and social cycles to ensure a meaningful entangled value interaction is captured. It will also be important to engage with a supportive governance and legal environment for some stages of the work, to ground the prototype in the reality of the existing system.

We would invite interested communities and collaborators to self-select as high levels of trust and value alignment will be fundamental pillars of success. We will therefore be circulating these blogs and having conversations with a number of potential demonstrator locations. It feels important to clearly state that we do not have the financial resources at this point to pay people for their help. This is difficult and at times feels uncomfortable; the fear is that we may unwittingly perpetuate extractive behaviour in the pursuit of a regenerative goal. We are trying to balance our commitment to push this thinking forward, with a deep sense of respect for our ecosystem and will continually review the relationship between the two for signs of stress.

Credit: Eunji Kang, Dark Matter Labs

The prototype will be designed as seven interlinked stages:

1. Value mapping: here we will select a number of elements to visualise. The mix will depend on the location but could include a combination of air and water quality, soil health, biodiversity resilience, equity, care and trust. The mapping is likely to combine geospatial data with interviews, surveys and participatory workshops.
2. Real time sensory data gathering: based on the value mapping we will investigate the best ways of capturing the relational flows that define our selected elements. For some flows, sensors will already be in place whilst for others the process will be participatory.
3. Multivalent currency design and use: once we have an understanding of how value is being both created and eroded in our ecosystem, we will attempt to design self-regulating currencies to represent these multivalent flows.
4. Building a probabilistic model based on data inputs: this stage will combine the data and knowledge gained to simulate how the different elements are impacting on each other. The model will be interactive and infer each forecast from an updated understanding of the underlying systems (a toy example follows this list).
5. Linking the model to governance decisions: this phase will focus on the interaction between the living agents and the value simulation. It will involve using the model to process multiple data inputs to inform a number of decisions. The outputs will be tested with the community to assess how accurately their views and values are being represented.
6. Creating a feedback loop between decisions and currency levels: as the system becomes more refined, the collective decisions can be used to adjust the currency levels of the base elements.
7. Grounding the results in regulatory or policy instruments: we will strive to link the multi-sensory decision making capabilities that have been co-developed with the community, to real world impacts. We hope that this might be achieved by testing a cross boundary tax payable in one (or a blend) of the new currencies. Another idea is to link the currencies to specific policy issues to elevate the agency and collective voice of the community. This second point has a strong link to the work that we have been prototyping in Sweden, Scotland and Canada to create multi-sensory, living indicators.
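As a toy illustration of stage 4 (all numbers invented): a Bayesian update revises the belief about, say, a river's health each time a sensor reading arrives.

```python
# Toy Bayesian update for a two-state "river health" variable.
prior = {"healthy": 0.5, "stressed": 0.5}

# Invented likelihoods: P(low-oxygen reading | state)
likelihood = {"healthy": 0.1, "stressed": 0.7}

def update(prior, likelihood):
    """Posterior over states after observing one reading."""
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

posterior = update(prior, likelihood)
print(posterior)  # belief shifts toward "stressed": {'healthy': 0.125, 'stressed': 0.875}
```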

We don’t know the exact details of how the above will be achieved but we are confident that the collective knowledge and tools are already available. The steps need not be taken consecutively and we imagine that different elements can be worked on across multiple contexts and time frames. A big part of this journey will be connecting people, ideas and technologies to work towards this common goal. If you would like to contribute or be kept up to date on progress then please do get in touch.

Thank you to everyone who has engaged with the project so far — we are honoured to be in community with you.

This blog was written by Emily Harris (emily@darkmatterlabs.org) under Dark Matter’s Next Economics LAB.

Shifting from the We of Dark Matter to a more personal angle, I (Emily) would also like to acknowledge that I am probably a long way out of my depth in trying to seed this idea. I am a chartered accountant by background and spent the early part of my career working for Deloitte in corporate finance in the City of London. I am supported by amazing colleagues with diverse skills and backgrounds, but my own understanding of the technological (and other specialist fields) pointed to above is limited. My hope is that I can play a small role in bridging between these different worlds. I’m willing to try because it feels deeply important.

References and credits:

1. Credit: question from Michael Haupt via Medium comments.
2. Credit: suggestion from Linda Stevens via Medium comments.
3. Credit: provocation from Jake Hoban via Medium comments.
4. Credit: thank you Mark Ballandies for the research paper on this topic sent by email.
5. Kim Stanley Robinson, The Ministry For The Future (Chapters 45, 50).
6. Yanis Varoufakis, Another Now: Dispatches from an Alternative Present (Chapter 6).
7. For example, Paddy Le Flufy, who is currently working on several connected ideas including a One Planet Living Token. Paddy Le Flufy is the author of Building Tomorrow: Averting Environmental Crisis With a New Economic System.

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve… was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

How to Detect & Prevent Bots on Your Website & Apps | Ping Identity


Bots are automated software applications that help humans to automate tasks that are often repetitive and time-consuming. Though not all bots are bad, and some are even helpful – think chatbots, search engine bots, web scraping bots, and so on – they can also be used for harm, and wreak havoc on a company’s analytics and security. Companies must therefore be vigilant and ready to mitigate any risks that come with bots. Here’s how.

Tuesday, 13. February 2024

SC Media - Identity and Access

Deepfake-proofing the president: What is cryptographic verification?

The White House’s AI advisor says work is underway to distinguish official statements from AI fakes.



FindBiometrics

Scottish Biometrics Commissioner Confirms Police Compliance with Code of Practice

The Scottish Biometrics Commissioner has published his office’s first annual assessment of law enforcement’s compliance with the Scottish Code of Practice, determining that Police Scotland has checked all the boxes. The Code […]

Checking Rules and Regulations in Ireland, Scotland, and the US – Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Irish Parliamentary Committee Hears Testimony on Biometric […]

SC Media - Identity and Access

IntelBroker claims partial Facebook Marketplace database exposure

IntelBroker claims to have exposed a portion of Facebook Marketplace's database, Hackread reports.



Spruce Systems

Developer Update #40

In case you missed it, check out our previous update here: SpruceID Developer Update #39

At SpruceID, we’re letting users control their identity and data across the web. Here’s the latest from our development efforts:

Product Updates

Last month, we continued building out credential issuance workflows for customer engagements. Our development efforts have been primarily focused on building new features for Credible, our credential issuance and lifecycle management platform. We have exciting developments underway for Credible and the associated Reference Wallet, which we will share more on as they get closer to public launch.

Open-Source Library Updates (SpruceKit)

SpruceKit is a collection of libraries that let your application accept digital credentials from users on their terms, originate trusted information for users, and interact with user data vaults. SpruceKit consists of the following open-source libraries:

TreeLDR

– Added a mechanism to add custom properties to layouts, for building extensions.
– Added support for CBOR (#111).

SpruceID Joins the Rust Foundation

We’re pleased to announce that SpruceID has joined the Rust Foundation, an independent non-profit organization dedicated to stewarding the Rust programming language, nurturing the Rust ecosystem, and supporting the set of maintainers governing and developing the project. Read the announcement.

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions.


Entrust

Reshaping Trust and Security for the Realities of 2024


The new year is always a time for reflection, and after a year shaped by AI, biometrics, and nation-state attacks there’s a lot to think about. So where are we headed from here? A utopian episode of Star Trek or a more dystopian Terminator 10 movie? The future as they say is up to us and that means reshaping trust and security for the realities of our brave new world.

The intensifying geopolitical landscape characterized by nation-sponsored attacks has made cyber the new battleground. Employing Zero Trust principles is a best practice, but not a cybersecurity guarantee in an era of AI-enhanced phishing, zero-day brokers, and malware-as-a-service. Adopting a term from U.S. Cold War simulations, red teaming is back in vogue. The red team is a pretend enemy that attempts to mount a cyberattack acting under the direction of the target organization. One such example comes from the European Central Bank, which has started to conduct vulnerability assessments and incident response tests on banks to assess their cyber resilience. Whether simulated or real, all organizations should seek to turn cyber breaches into a blueprint for future security with actionable strategies to strengthen their cyber posture.

One particular area of cyber exposure is critical infrastructure, from the power grid to water treatment plants to public service providers and beyond. We live in an “everything is connected to everything” world, making critical infrastructure a vulnerable and attractive target to bad actors. In 2024, it’s long past time to get serious about IoT and Industrial IoT (IIoT) security. As well, AI-enabled biometric identity verification of employees, partners, and customers is increasingly essential to flushing out deepfakes and keeping organizations and people safe.

There are also some encouraging signs that regulators around the globe are stepping up to the challenges ahead. Similar to data privacy, consumer protection, and digital identity legislation, Europe is on the vanguard once again with the EU AI Act. The White House has also issued the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence Executive Order and the bipartisan Schatz-Kennedy AI Labeling Act has been tabled. However, given that it’s an election year in the U.S., it’s likely that big tech will be left largely to self-regulate over the next 12 months. That said, the tech sector is stepping up, with seven technology behemoths including Amazon, Google, and Microsoft agreeing to adopt AI safeguards with the Biden administration. As well, ChatGPT maker and close Microsoft collaborator OpenAI has released its framework to mitigate catastrophic risks like using generative AI to build biological weapons, spread malware, or carry out social engineering attacks. Meanwhile, the National Institute of Standards and Technology (NIST) has been busy building out the Artificial Intelligence Risk Management Framework (AI RMF) in close collaboration with the private and public sector to continue to leverage the power of AI while mitigating the risk.

So, at the outset of 2024, I remain cautiously optimistic that the forces of good will prevail and it will be a utopian technology-fueled future.

The post Reshaping Trust and Security for the Realities of 2024 appeared first on Entrust Blog.


auth0

A Farewell Letter to Passwords

My farewell letter to passwords and why our relationship is not working anymore.

Indicio

Why use the Proven Mobile SDK?

Save your organization valuable development time while building digital wallet functionality right into your existing applications.

By Tim Spring

An organization’s digital app is often one of the most effective ways it engages customers. For example, the American Bankers Association found that 48% of bank customers prefer mobile apps as their method of choice for managing their finances.

Given this, you want to make sure of two things: one, that your app offers easy-to-use features; and two, that your customers stay on it, spending their time and money with you rather than going elsewhere.

Indicio designed the Proven Mobile SDK specifically to help you do both of these things.

Faster App Feature Deployment

App design and construction is not a simple process: for applications with multiple features, most sources say the average development time is nine months or longer. With this much time and money at stake, adding new features that require complex code changes is daunting, and it can be tempting to simply add on a third party application to achieve the required functionality.

But relying on a third party to handle features means sending your customers to that third party, reducing their time spent on your platform, and — crucially — losing control of their customer experience. For example, if the only way to purchase something on your website is a third-party payment platform that is constantly buggy, that frustrating experience will become tied to your website and risks losing you customers. People do not reward complexity and hassle online.

This is where the Proven Mobile SDK comes in. It helps you build digital wallet functionality directly into your existing application without your team having to reinvent the wheel. Built with packages for Swift, Kotlin, and React Native, your developers can add decentralized identity to your applications without the time commitment of learning new programming languages or practices.

But what functionality and features does the SDK add to your application that will enhance the customer experience?

Faster interactions through pre-verified data

Full digital wallet functionality allows you to hold and share verifiable data and identity information inside a digital credential — a “verifiable” credential. With verifiable credentials you can be cryptographically confident in who you’re interacting with online. Unlike passwords, verifiable credentials automatically share who issued them and who is presenting them in a provable way, as well as whether any information has been altered since being created, making them resistant to forgery and tampering.
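As a toy illustration of that cryptographic confidence (using the Python cryptography library; the claims and DID are invented, and this shows the underlying principle rather than the W3C Verifiable Credentials format or any particular SDK's API):

```python
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer signs the claims with its private key.
issuer_key = Ed25519PrivateKey.generate()
claims = json.dumps({"issuer": "did:example:bank", "kyc_passed": True}).encode()
signature = issuer_key.sign(claims)

# A verifier checks the claims against the issuer's public key; any
# alteration of the claims makes verification fail.
public_key = issuer_key.public_key()
try:
    public_key.verify(signature, claims)
    print("credential intact")
except InvalidSignature:
    print("credential was tampered with")
```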

Because these credentials can prove who you are on their own, they remove the need for the customer to jump through hoops when accessing their account, such as passwords, MFA, security keys, and more, letting customers move right on to what they came for.

Indicio Proven Mobile SDK in action:

One powerful use case for verifiable credentials is meeting Know Your Customer (KYC) regulations. A financial institution can do its full customer vetting for account creation, a business loan, or a mortgage, and issue the customer a credential containing the relevant information. The customer can then share that information in a provable way, removing the need to repeat the expensive KYC process.

Schools can issue credential-based diplomas that allow students to share them without the need to reach back out to the school for verification; businesses can use them for both employee access management and customer loyalty programs; countries can allow their citizens to share their resident status for all kinds of things from travel to paying bills to proving residency.

In short, once you’ve met your initial identity assurance requirements, verifiable credentials can eliminate redundant paperwork and data input, and provide a simpler, streamlined, better customer experience.

And the Proven Mobile SDK provides all the pieces you need. In a fraction of the time. On any application. In whatever language your team is familiar with.

If you would like to learn how to get started you can read more about the Proven Mobile SDK, or reach out to our team.

The post Why use the Proven Mobile SDK? appeared first on Indicio.


liminal (was OWI)

From OTPs to Passkeys: Navigating the Customer Authentication Landscape

In today’s digital age, navigating the customer authentication landscape is paramount for ensuring data security. With the increasing need to protect sensitive data against cyber threats, organizations across various sectors are seeking more robust, user-friendly authentication solutions. The market for customer authentication has been evolving rapidly, with technological innovations reshaping traditional security measures.

Despite the availability of advanced authentication technologies, such as FIDO2 passkeys and biometric systems, many organizations are hesitant to transition from traditional multi-factor authentication (MFA) methods. Concerns about the opportunity cost of change, the complexity of new technologies, and educational barriers for stakeholders are some of the key challenges faced. Additionally, integrating authorization features into authentication solutions has become a critical requirement, yet many are unsure how to implement these advancements effectively.

In light of these complexities and challenges, the critical questions for business leaders in the market for authentication solutions include: How can my company effectively navigate these changes in customer authentication methods? What strategies can be adopted to balance the need for robust security and a seamless user experience? How can my company reconcile the gap between consumer readiness and implementing newer authentication technologies?

Watch the video for highlights.

Key market insights:

48% of practitioners planning to adopt passwordless solutions in the next two years prefer biometric authentication.
Only 5% of surveyed practitioners prioritize FIDO2 passkeys as their chosen passwordless authentication method due to limited consumer education.
83% of businesses express concern about the security of OTPs, but 74% still plan to continue their use over the next two years.
81% of solution seekers use identity platforms for customer authentication, with CIAM being the most popular, indicated by 47% of buyers.
AI-enabled adaptive authentication is seen by 64% of buyers as the most effective for balancing security and user experience.
69% of buyers demand continuous authentication beyond login, valuing AI-enabled anomaly detection.
In the purchasing process for customer authentication, only 36% of respondents prioritize compliance, and just 9% consider it a top criterion, with a greater emphasis on scalability, solution accuracy, and authenticator variety.

Download the report for more insights and a buyer’s guide

Related content:
Facial Biometrics: Trends and Outlook
The Rise of Integrated Identity Platforms
Creating a Unified Cross-Enterprise Authentication Layer
Outside-in Report: Biometrics Primer (Members Access Only)

What is Customer Authentication?

Customer authentication is a critical security process that identifies and verifies users before allowing them access to digital accounts. It’s built on login credentials established during user onboarding, utilizing a blend of knowledge-based (passwords), possession-based (security tokens), and inherence-based (biometrics) authenticators. The adoption of multi-factor authentication (MFA) has become standard practice, requiring users to provide multiple identity proofs to enhance security and prevent unauthorized access.

This security measure is essential for protecting against common threats like account takeover attacks by ensuring that only authorized users can access sensitive information. Despite the prevalence of password-based systems, their effectiveness is compromised by common security pitfalls, such as the reuse of passwords. Therefore, more advanced authentication methods, including MFA and adaptive authentication, are increasingly important. These methods assess risk levels and adjust security measures dynamically, improving the security posture without compromising user experience. New technologies like FIDO2 passkeys and biometric authentication pave the way toward more secure, passwordless authentication methods, emphasizing ease of use and improved security.
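
For a sense of what the passwordless shift looks like in practice, here is a minimal browser-side sketch of FIDO2 passkey registration using the standard WebAuthn API. In a real deployment the challenge and user handle come from your server; the values below are illustrative placeholders.

```typescript
// Minimal FIDO2 passkey registration via the standard WebAuthn browser API.
async function registerPasskey(): Promise<Credential | null> {
  return navigator.credentials.create({
    publicKey: {
      // In production the challenge is generated and verified server-side.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rp: { name: "Example Corp" }, // the relying party (your service)
      user: {
        // In practice, a stable, opaque server-side user handle.
        id: crypto.getRandomValues(new Uint8Array(16)),
        name: "user@example.com",
        displayName: "Example User",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: {
        residentKey: "required",       // discoverable credential, i.e. a passkey
        userVerification: "preferred", // biometric or PIN on the authenticator
      },
    },
  });
}
```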

Customer authentication is a crucial gatekeeper for user data and access in digital security. As the digital landscape continues to evolve, customer authentication methods have also undergone significant advancements, aiming to strike a balance between robust security and user-friendliness. This balance helps to ensure that authorized users can easily access the digital world while maintaining the highest level of security measures to protect sensitive information.

The post From OTPs to Passkeys: Navigating the Customer Authentication Landscape appeared first on Liminal.co.


SC Media - Identity and Access

Azure account takeover campaign targets senior execs

The ongoing campaign incorporates individualized phishing lures and has targeted hundreds of user accounts across dozens of organizations.


This week in identity

E46 - SecureAuth acquire Cloudentity / Entrust to acquire OnFido / Cisco announces Identity Intelligence / Mastercard Emerging Trends

This week Simon and David focus on a new raft of pending acquisitions. They discuss the impact of SecureAuth and Cloudentity joining forces as well as news that Entrust are in talks to buy OnFido. They also cover the announcement that Cisco has launched a new Identity Intelligence offering hot on the back of acquiring ITDR vendor Oort in 2023. They finish up by taking a look at an emerging technology trends report released by Mastercard. Is Data security the next big IAM integration story?


IDnow

IDnow joins consortium aimed at making crypto assets compliant with new EU regulations

System to achieve full compliance of Crypto Asset Service Providers and self-hosted wallets with AML regulations and the Transfer of Funds Regulation  

Munich, February 13, 2024 – IDnow, a leading identity verification platform provider in Europe, joins a consortium of five partners including the IOTA Foundation, walt.id, SPYCE.5, and Bloom Labs with the goal of making Crypto Asset Service Providers (CASPs) and self-hosted wallets compliant with the European Anti-Money-Laundering Regulation and the Transfer of Funds Regulation (TFR). 

The new TFR in the EU mandates that all cryptocurrency transactions carry identifying data about the sender and the receiver. According to the new rule, compliance with the TFR is mandatory for all CASPs. Additionally, the new AML Regulation will require all CASPs to comply with AML rules similar to those applying to other financial institutions. For example, when a user opens an account and registers a wallet with a CASP, an identification process is required to comply with the new AML Regulation and the TFR.

One challenge for CASPs in adhering to the new rules lies in GDPR compliance, as personally identifiable information (PII) should not be stored on blockchains or Distributed Ledger Technologies (DLT). However, to comply with the new regulations, CASPs need to know with whom they are doing business and continuously verify this information.

Raising trust and transparency in crypto asset transactions

To address this challenge, the partners have formed a consortium to propose a system where a trusted party tokenizes an identification process it has witnessed, allowing CASPs to have confidence in this process, without revealing any PII. The resulting soul-bound token (SBT) can be used for blockchain processes, enabling web3 native interactions. Furthermore, the trusted party can reveal the identity information, if requested by an authorized party, such as law enforcement, as well as revoke the SBT, if needed.
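
The consortium has not published implementation details, but the basic pattern of checking a soul-bound token on an EVM-compatible chain is easy to sketch. Below is a minimal TypeScript example using the open-source ethers.js library, assuming the SBT contract exposes an ERC-721-style balanceOf; the RPC endpoint and contract address are hypothetical placeholders.

```typescript
import { ethers } from "ethers";

// Hypothetical endpoint and contract address; the consortium's actual
// IOTA Smart Contract Chain deployment details are not public.
const RPC_URL = "https://evm.example-iota-chain.org";
const SBT_ADDRESS = "0x0000000000000000000000000000000000000000";

// Assume the soul-bound token exposes an ERC-721-style balanceOf.
const SBT_ABI = ["function balanceOf(address owner) view returns (uint256)"];

// A CASP could run a check like this before serving a self-hosted wallet:
// if the wallet holds the identity SBT, a trusted party has witnessed a
// compliant identification process, and no PII ever touches the chain.
async function hasIdentitySbt(walletAddress: string): Promise<boolean> {
  const provider = new ethers.JsonRpcProvider(RPC_URL);
  const sbt = new ethers.Contract(SBT_ADDRESS, SBT_ABI, provider);
  const balance: bigint = await sbt.balanceOf(walletAddress);
  return balance > 0n;
}
```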

Within the consortium, the IOTA Foundation, a non-profit foundation supporting the development of the IOTA protocol, will provide the underlying network as the proposed solution will be implemented on an Ethereum Virtual Machine (EVM)-compatible IOTA Smart Contract Chain. walt.id, a leading open source vendor of decentralized identity and wallet infrastructure, will develop, provide, and maintain the trusted witness service for creating and verifying SBTs like identity proofs, while IDnow will deliver the identity verification solution to onboard users into the wallet solution. Bloom, an all-in-one wallet for the IOTA, Shimmer and EVM ecosystem, will provide the capabilities for users to store, present and prove ownership of the SBT. SPYCE.5, specializing in the integration of hybrid blockchain technologies, will provide the essential infrastructure for seamless interchain communication and transaction validation, ensuring the system’s overall efficiency and regulatory compliance.

Statements

“We are excited to be part of this forward-thinking consortium alongside highly esteemed crypto asset industry players to address a pressing need for crypto wallet solutions that comply with the latest EU regulations. Crypto companies are facing a race against the clock to implement new requirements, which is why we wish to submit this proposed solution to the EU for consideration to address the technological and regulatory challenges around AML, KYC and TFR in crypto,” says Rayissa Armata, Director Global Regulatory and Government Affairs at IDnow.

“We are thrilled to be working with such incredible partners to create a truly seamless user experience and to provide the underlying distributed ledger technology. Identity verification in Web3 environments should be simple and straightforward, without sacrificing privacy or security. As regulatory requirements grow, we need innovative solutions that are easy to use for both businesses and everyday users,” says Dominik Schiener, Chairman of the Board of the IOTA Foundation.

“We have been building decentralized identity and wallet solutions for many clients across various industries, but this project is particularly interesting as it shows new ways of onboarding and verifying users for crypto and DeFi use cases that have traditionally been struggling with regulatory compliance,” says Dominik Beron, Founder of walt.id.

“Bloom is excited to be collaborating with the partners in this consortium. This project will ensure our users and the crypto ecosystem can benefit from a self-hosted wallet that complies with EU regulations,” says Nicole O’Brien, Co-founder of Bloom Labs.

“We’re proud to be part of this groundbreaking consortium, leveraging SPYCE.5’s hybrid chain architecture to navigate the complex landscape of EU regulations. Our innovative approach ensures not only compliance but also maintains the integrity and efficiency vital in the crypto space,” says Holger Köther, Managing Director of SPYCE.5.

Learn more about the project in the video “KYC Done Right with IOTA”. Source: IOTA Foundation

Monday, 12. February 2024

SC Media - Identity and Access

Unsecured database leaks WinStar app customer data

TechCrunch reports that Oklahoma-based casino and hotel resort WinStar had its customers' sensitive data inadvertently leaked by an exposed database owned by Nevada software startup Dexiga, which developed the casino resort giant's My WinStar app.


FTC to bolster action against big tech data violations, says chair

Federal Trade Commission Chair Lina Khan has noted that more aggressive actions will be taken by the agency against data misuse by big tech companies after it was sued by Meta to reverse a rule preventing the monetization of children's information, according to The Record, a news site by cybersecurity firm Recorded Future.


FindBiometrics

InterBio Gets More ID & ABIS Business in Indonesia

TOTM Technologies’ wholly-owned subsidiary InterBio has won additional contracts from the Indonesian government, totalling $7.5 million. The bulk of the award, which was signed by the Ministry of Home Affairs, concerns technical […]

London Mayor Predicts Biometric Border Screening ‘Chaos’ for Eurostar

The Mayor of London, Sadiq Khan, has called on British ministers to support HS1, the operator of the Eurostar rail line linking the United Kingdom and France, to ensure it will […]

Elliptic

Crypto Gaming Platform PlayDapp Suffers $290 Million Breach

Crypto gaming platform PlayDapp has suffered breaches resulting in the loss of PLA tokens worth $290 million - based on their market value at the time of the thefts.


auth0

Auth0 SDK for .NET Desktop and Mobile Applications Supports MAUI

Support for .NET MAUI is now available in the Auth0 SDK for .NET Desktop and Mobile Applications. Let's have a quick overview.

SC Media - Identity and Access

Getting zero-trust initiatives off the ground

A new survey finds that zero-trust efforts frequently go off the rails. We look at what it takes to establish a successful ZT-forward strategy.


KuppingerCole

Mar 13, 2024: Unlocking Zero Trust Network Access

Zero Trust Network Access (ZTNA) is becoming increasingly essential as organizations adapt to remote work, cloud adoption, and the growing sophistication of cyber threats. As a result, ZTNA solutions play a critical role in today's cybersecurity landscape by providing a holistic approach to secure access to business applications and resources regardless of user location. These solutions fundamentally align with the principles of Zero Trust, extending its influence across devices, networks, systems, applications, and data. This webinar provides a comprehensive overview of ZTNA solutions, vendor offerings, and adoption strategies, addressing the growing need for adaptive security measures amidst remote work and cloud adoption.

IDnow

What’s love got to do with it? Exposing the romance scammers, with Becky Holmes.

We sit down with the romance fraud vigilante-turned-author to discuss the responsibility she feels for victims, why social media platforms need to be held more accountable, and why she put a photo of Keanu Reeves on the cover of her book.

Despite having a popular Twitter account (114,000 followers and growing), with the sole purpose of exposing romance scammers, Becky Holmes says she still gets contacted by hopeful, hapless fraudsters. In fact, she was contacted just 15 minutes before our interview. 

“Fortunately, a lot of them are not very observant, so I get to do what I do without too much recourse,” said Becky. 

What Becky does is two-fold: She exposes the often-hilarious exchanges with romance scammers, some of whom claim to be celebrities, and, more recently, provides a forum in which people can share their experiences and learn more about this increasingly common form of fraud.  

The conversation is clearly needed. Romance scams rose by 19% globally in 2023, and considering only 58% of victims ever report the crime to law enforcement, the true figure is likely much higher.

Becky has just released her debut book, ‘Keanu Reeves is Not in Love with You’, featuring first-hand accounts of victims, examples of scripts used by fraudsters, and a look into the psychology of fraud. 

Although the book itself only took nine months to write, the aim of helping people and raising awareness of the dangers of social media fraud began when Becky first joined Twitter (now X) in 2020. 

“Almost immediately I got a flurry of messages from all these fabulously handsome men, and they were all immediately in love with me. They all said the same thing and were all from very similar looking men. Some were in the army, and some were in pilots’ uniforms, and it was very clear that there was a pattern.  

“At first, I was blocking and deleting them and having a bit of fun by seeing how nonsensical I could be before they picked up on what I was saying. I began posting the exchanges on Twitter a few months later, and people found them funny and were laughing and we were all having a good time. Then, in 2021, quite a few victims of romance fraud got in touch with me and said that they’d been a victim and had lost a large amount of money. That’s when it all changed.” 

In this interview, we discuss the different ways in which romance scammers tend to target men and women, the delicate balance in presenting information in a relatable, digestible format that does not undermine the danger and impact, and much more.

What were the main reasons behind becoming a ‘romance scam vigilante’? Did you ever become a victim yourself [i.e. transfer funds, or ‘fall in love’]?

So, I’m fortunate enough to have never been the victim of romance fraud. However, I was in a very coercively controlled relationship at one point, and the parallels between what victims of romance fraud go through and victims of coercive control are very similar. 

I ended up speaking on the phone with many victims and they said they hadn’t told anybody, and they didn’t know what to do. I hated the idea of people feeling lonely because I’ve been in that situation myself. So, I became kind of a sounding board for a few people and felt a responsibility.

Most had reported to Action Fraud but not heard anything back, so were in a very isolated place. I started realizing what a huge subject it was and how underreported it was, and it became like a passion project.

[Read more about the steps that the UK government is taking to address fraud, in our blog ‘UK declares war on fraud, calls on tech giants to join the fight.’]

I see that many of the messages you share on Twitter are actual Twitter exchanges. In your personal experience, is this where romance fraud is most common?

No, I get messages on Instagram, on Twitter, and dating sites are, of course, rife with them. I’ve also heard about people receiving messages on TikTok, and bizarrely, if you play online games, romance fraudsters are lurking there too. Whatever platform it is, the one thing they immediately want to do is to take you to WhatsApp or Google Chat. I have a separate Google Chat account where I have these exchanges.

Romance scams target both men and women. Are there any differences in approaches?

Although I never view stats as wholly reliable, statistically, romance fraud targets men and women equally. I want to stress that not all women are approached with the promise of romance and men with the promise of sex, but more men do tend to be victims of sextortion. Sextortion is when you have exchanged intimate images with somebody, and that person then blackmails you. For example, a married man with kids or a high position at work will send images to someone [who is almost certainly not a Russian model], and the recipient will then reply and say that they’ve hacked into your Facebook and can now see your children’s Facebook accounts, and your colleagues’ accounts, and that’s when the requests for money start.

It’s the old adage, isn’t it, and, of course, there’s a danger of generalizing, but men are more led by what they see, and women by what they hear.

That’s right, although the example that I use in the book is of a lady who believed she was in a relationship with somebody, but very quickly realized it was a scam. When she ended the relationship, he said, “Yes, you’ve caught me. I’m not this person, but I’ve got your images. I know who your children are on Facebook.” He extorted about £14,000 from her. With that case, she never tried to get the money back, or go to the police as she had grown up children and grandchildren, and even though I assured her that the police would keep such investigations confidential, she was terrified her family would find out.

I once interviewed an ex-police officer who believed the danger of romance scams/pig butchering lies in how they often lead to more serious forms of deception, and who spoke about the biggest hurdle being the perception of fraud, with many people not taking it as seriously as, say, burglary. What is your opinion on how fraud is perceived in the UK?

So, as you’ll know, pig butchering scams originate from Southeast Asia and this is an area where human trafficking is rife, because during COVID people with good computer skills were offered jobs and taken to scam compounds. 

So, in some cases, it’s very cruel for both sides – even the name, ‘pig butchering’ and likening it to fattening up a pig for slaughter is just so repulsive.

However, it’s hard for me to think of the majority of romance fraudsters as victims. I’ve spoken to a couple of scammers and most of them are definitely not being trafficked. Sometimes it even becomes a racial issue. One scammer from Ghana told me it’s a way of getting back at the West for everything that’s happened to them over the centuries, which is a ridiculous excuse because he’s not putting the money into rebuilding parts of his home, he’s spending it on trainers and luxury goods.

Are there any anecdotes that have stuck with you and that you would like to share?

I’ve heard of people taking their own life, which you can understand if they’ve lost absolutely everything and are in the depths of depression. Some of the pig butchering scams, which leverage fake investment platforms are technologically very impressive. Regarding stories that have stuck with me, there was a lady who contacted me after seeing my Twitter and we spoke on the phone quickly after. She told me she had stage 4 cancer and had lost a fortune to this one guy. She knew that he was a fraudster; he’d admitted it, but she was still sending him money because she felt this bond with him.

I remember her telling me that the only time her son will know what’s happened is when she dies, and he sees that he’s got no inheritance.

Becky Holmes, author of ‘Keanu Reeves is Not in Love with You’

I interviewed her, and then a few months later, as I was getting closer to submitting the book to the editors, I messaged her and asked if we could meet up.

She replied and told me she was in a hospice, but that she really wanted to stay strong for the book launch, and I told her don’t worry, we can talk about this another time. 

A few days later, I received a message from her Twitter account, and it was from her son saying his mum had passed away, but that he had seen our exchanges, and knew she had lost money. He wanted to know what on Earth had happened. I remember calling him at about 9pm one Friday night and explaining what had happened to his mum and where this money had gone. The saddest thing for me was how lovely her son was. He said how he wasn’t angry with his mum; he was just devastated that she hadn’t confided in him. It’s heartbreaking to think how many people don’t confide in family and friends when they will probably get the same understanding reaction. There is nothing to be ashamed or embarrassed about.

It’s a delicate balance to strike: presenting this information in a relatable, digestible format without undermining the danger and impact of romance fraud. How did you decide the best way to set the tone in raising awareness of the scourge of romance fraud?

So, the feedback to the book has been exactly what I’d hoped it would be. People have said they’ve been able to digest it because it’s readable and funny, but I’ve also shared the victim stories, and explained what happens in some of the countries.

Humor to me is a really good way of bringing to life what can be quite a dry subject. Often when people talk about fraud, everyone in the room switches off, and so I think as long as it’s never seen as humor at the expense of the victims [which I’ve never been accused of], then it’s fine.

Becky Holmes, romance fraud vigilante

One thing that surprised me from researching for this interview was the sheer number of ‘celebrity’ scams. This has never happened to me [yet!]. How common is it?

Well, I never received a message from a celebrity on a dating site, but Twitter and Instagram are crammed full of them. My book is called ‘Keanu Reeves is Not in Love with You’ because Keanu Reeves is not on any form of social media, and yet he is the most impersonated celebrity. Particularly on Twitter. 

Something I always get asked is why anybody would send money to an apparent multi-millionaire. The fraudsters have come up with new ways of getting money from people under the guise of being a celebrity. One of the most common is that they will say they really want to meet the person, but their manager is a bit of a tyrant, and when the celebrity next comes to the UK, the manager wants to set up a series of meet and greets, all, of course, at a cost. Another effective approach is when they say that they are raising money for a charity. Then, even if you just send £10, you’re considered a viable target.

People exposing scams sometimes face repercussions. Do you ever fear for your safety?

I’ve had some death threats. Once, when a scammer asked me for money, I agreed and asked for him to send me his bank account details. Noticing it was registered in the United States, and obviously not his account, I contacted the bank and told them what it was being used for, and it was shut down. Obviously, as only a day had passed between him sharing his details with me and the account being closed, he knew it was my doing. So I received some horrific abuse and threats then.  

However, as I’ve spent so long researching this topic, I’m aware that a lot of these fraudsters are outside of the UK so have never been too concerned. Having said that, a lot of people will have links to the UK, so I never take my safety for granted.

How much do you hold these platforms accountable? Do you pass on the scammers’ details to relevant authorities, and how involved are you in the prosecution etc? Do you think enough is being done to crack down on this type of fraud?

Well, I mean certainly more could be done. It’s difficult, Jody, because a lot of the fraudsters are very technologically advanced and always seem to be one step ahead. I also hold the social media platforms accountable. In Nigeria, romance scammers are called the Yahoo Boys, and they even have their own Facebook group with almost 13,000 active members. Now I’ve reported the page, as have many other people, to Facebook, but it’s never been taken down. When the key word of the group is categorically associated with fraud, how is that allowed to stay up? 

Facebook should 100% be held accountable. I think it’s outrageous. Twitter and Instagram are slightly better because they’re quite good at taking accounts down when they’re reported. However, all you need is for somebody to set up a new email address, and they’re straight back on. 

Elon Musk removing the blue tick from Twitter is a fraudster’s dream. Suddenly you have an influx of people pretending to be celebrities who don’t need a blue tick anymore. I obviously hold the fraudsters most accountable, but I do think the platforms have a duty to be looking at this more carefully.  

It’s pointless me pointing out Twitter profiles of people because they’ve been kicked off the platform before, and they don’t use their real names either. I’ve also been told by UK police that if a WhatsApp number, for example, is out of UK jurisdiction, there is pretty much nothing they can do.  

40% of all crime in this country is fraud, and yet police dedicate just 1% of resources to combating it, so they’re not going to use that 1% trying to find somebody overseas.

Certain platforms, such as LinkedIn are experimenting with an identity verification step, although it is not yet a mandatory requirement. Perhaps because they do not want to alienate their customer base?

Yes, some dating platforms also require an identity verification step, where you have to wave your hand to show you’re a living person; however, what’s to stop a fraudster just doing that? So, more can be done, absolutely. I know the CEO of the Online Dating & Discovery Association, and I do know that they take it very seriously, and steps are being taken. For example, if a fraudster is swiping right on every single woman’s profile in the hope that somebody connects, the ODDA will pick that up.

How much do you keep in contact with victims after they’ve messaged you?

I count three or four of them as friends now. A lot of these women that I speak to are people that I would naturally gravitate to. They are intelligent, interesting women, and people that I would like to be friends with, and they’re kind as well, and we could all do with some kind friends, couldn’t we?

Technology offers great opportunities to flag problematic behavior and identify users, thereby creating safer online environments, but it is also a double-edged sword. With the advent of AI-generated content like deepfakes, how optimistic are you regarding the future of romance fraud?

I fear for the future. A lot of romance fraudsters will pretend to be in the army or on an oil rig or a doctor or a surgeon working abroad, which they’ll use as an excuse for not being able to speak to somebody. This reluctance to video call can be a major red flag. But now, fraudsters will be able to transpose someone’s face over their own face and talk naturally. It’s terrifying.

If you’re interested in more insights from industry insiders and thought leaders from the world of fraud and fraud prevention, check out one of our interviews from our Spotlight Interview series below.

Jinisha Bhatt, financial crime investigator
Paul Stratton, ex-police officer and financial crime trainer
David Birch, global advisor and investor in digital financial services

By

Jody Houton
Senior Content Manager at IDnow
Connect with Jody on LinkedIn




PingTalk

How to Detect & Prevent Adversary-in-the-Middle Attacks | Ping Identity

Multi-factor authentication (MFA) is rightly considered a security upgrade on single-factor approaches like a simple username and password. MFA does a great job of reinforcing traditional login credentials, and in turn, is very effective at stopping fraudsters who take advantage of easy prey – such as the 12% of consumers who use one single password for every account across multiple platforms. In situations like these, where fraudsters rely solely on stolen credentials to perpetrate their crimes, MFA methods like SMS and email OTPs are very effective at preventing account takeover (ATO).

 

But although MFA makes organizations – and their users – feel safe, cybercriminals are constantly evolving their technologies and practices to get around this additional layer of security. Some of these methods rely on driving MFA fatigue, but others are more insidious and seek to bypass the protection offered by MFA altogether. All of this means that, while MFA is a great way to reinforce login credentials like usernames and passwords, it may not be enough to stop the latest types of ATO.

Sunday, 11. February 2024

KuppingerCole

Analyst Chat #201: Intelligent Defense - Is AI inevitable for Cybersecurity?

AI is becoming increasingly common and it appears to be unavoidable. However, is this really the case? Marina and Matthias have a conversation about the convergence of AI, machine learning, and cybersecurity. They highlight the importance of utilizing AI to protect our organizations, as attackers are also employing it.

Marina explains the primary risks associated with AI in cybersecurity, such as automation mistakes, data security and privacy issues, and excessive dependence on AI. They also explore how defenders can safeguard their organizations from AI-generated threats and stress the significance of incorporating AI into a comprehensive cybersecurity strategy.



Saturday, 10. February 2024

Innopay

INNOPAY to take centre stage with keynote at Slovenia’s Bled eConference


INNOPAY's Mariane ter Veen, Director and Lead Data Sharing, will open the renowned international Bled eConference in Slovenia this year with an insightful keynote titled 'Digital sustainability as a driving force for innovation'.

Having established itself as one of the longest-running and most respected international conferences in the digital commerce realm, the Bled eConference consistently draws speakers and delegates from diverse sectors such as business, government, information technology and academia. It serves as a pivotal platform for researchers engaged in all facets of digital transformation.

The 37th Bled eConference is scheduled to take place in the scenic lakeside town of Bled from 8-12 June 2024. The theme of this year’s event will be 'Resilience through Digital Innovation: Enabling the Twin Transition'.

For more information, visit the Bled eConference website.


KuppingerCole

Apr 09, 2024: Navigating Security Silos: Identity as a New Security Perimeter

Companies are grappling with countless challenges in the realm of identity security. These challenges range from dealing with the dynamic nature of identities, the rise of insider threats, the ever-evolving threat landscapes, handling the complexity of identity ecosystems to insufficient visibility into identity posture. This webinar explores the fundamental role of Identity Threat Detection & Response (ITDR) and Identity Security Posture Management in fortifying defenses against these challenges.

Friday, 09. February 2024

Entrust

Why it’s important to secure your Identity Provider (IdP) with high assurance identity

While breaches targeting identity as the initial attack vector are on the rise, with increasing success and significant financial and reputational damage inflicted, IdPs are quickly becoming the attack vector of choice. Attackers maximize the payload by infiltrating the most critical system in your organization designed to secure access to all your company and customer data.

The Threat in Context

Once a user with access to the organization’s IdP is compromised through an account takeover (ATO) attack, detection becomes increasingly difficult. The attacker gains persistence within the network, infiltrating multiple critical systems and applications. This enables them to cause damage over long periods of time.

One such attack recently compromised an IdP vendor through their customer support systems. The attack supposedly took place when a customer support user was logged in to their personal Google account and stored their username and password for their service account within Chrome. The attacker had compromised the employee’s Google account, thus gaining access to the credentials to the customer support system. Once access to the support system was gained, the attacker then used session tokens and cookies stored in the HTTP Archive (HAR) files (which were used to troubleshoot issues with the IdP) to access the customer’s network and deploy ransomware and/or exfiltrate data.

Another attack took place at a large entertainment and hospitality chain when an attacker used social engineering to gain access to information about a high-value target (user with privileged access to critical systems including their IdP). The attacker then contacted the IT support helpdesk requesting a reset of the high-value user’s MFA credentials. They provided all the required verification information obtained through the social engineering attack, enabling them to install their own MFA authenticator. The attacker was able to establish persistence within the organization’s network, gaining access to critical systems.

Securing Access to IdPs and Taking a Layered Approach

These examples showcase the need to secure access to IdPs with high assurance authentication. Privileged and high-value users with access to critical systems must use a high assurance passwordless authentication mechanism such as X.509 certificate-based authentication (CBA) when authenticating themselves. Enforcing CBA for users as well as devices ensures that the user is not logging in to critical systems on an unmanaged device. By enforcing CBA and eliminating passwords, organizations can defend against common identity-based attacks such as phishing and MFA bypass targeted at their privileged users. In addition, taking a layered approach to security is recommended, with risk-based adaptive access that can evaluate contextual information to dynamically grant or block access based on risk levels when authenticating to an application or service.
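
To make the idea of certificate-based authentication concrete, here is a minimal sketch of mutual TLS in Node.js, where the server only accepts connections from clients presenting an X.509 certificate issued by the enterprise CA. The file names are placeholders for material issued by your own PKI, and a production IdP deployment involves considerably more than this.

```typescript
import https from "node:https";
import type { TLSSocket } from "node:tls";
import { readFileSync } from "node:fs";

// Placeholder file names: in practice these come from your enterprise PKI.
const server = https.createServer(
  {
    key: readFileSync("server-key.pem"),
    cert: readFileSync("server-cert.pem"),
    ca: readFileSync("enterprise-ca.pem"), // trust only certs issued by this CA
    requestCert: true,                     // ask the client (user or device) for its X.509 cert
    rejectUnauthorized: true,              // refuse connections without a valid cert
  },
  (req, res) => {
    // By the time this handler runs, the TLS layer has already verified
    // possession of a CA-issued certificate; no password was exchanged.
    const peerCert = (req.socket as TLSSocket).getPeerCertificate();
    res.end(`Authenticated as ${peerCert.subject?.CN ?? "unknown"}\n`);
  }
);

server.listen(8443);
```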

As we saw with the attack on the large entertainment and hospitality chain, protecting against these attacks during authentication is not enough; the processes of user registration or password/MFA reset requests must also be secure. By implementing identity proofing through identity verification of physical credentials such as passports and driver’s licenses, organizations enhance their security protocols. This is particularly valuable in processes such as resetting MFA credentials, registering or onboarding new users, or implementing step-up authentication for out-of-policy users or high-value transactions. By adopting these measures, organizations can further protect their critical systems and their existing IdP.

In addition to the above steps, organizations with on-premises environments can also secure their IdP with hardware security modules (HSMs), which provide strong protection for storing keys, secrets, and tokens.

To learn more about how you can implement CBA or identity verification with Entrust IDaaS as part of your strategy to secure your most privileged and critical users, start a free trial of the platform today.

The post Why it’s important to secure your Identity Provider (IdP) with high assurance identity appeared first on Entrust Blog.


Production Analytics Key Indicators in Modern Card Issuance: Availability, Performance, Quality

Modern card issuance operations have embraced advanced technology, replacing outdated manual methods like clipboard monitoring. Today, operators and managers no longer jot down notes as they walk the floor; instead, they use digital platforms. Alerts, achievements, and operational details are instantly communicated across locations, giving operations executives real-time data to monitor machine performance and critical issues without leaving their offices. The shift to automated data analysis allows operators and technicians to access information from tablets anytime, anywhere, ending the need for manual charting. Additionally, aggregated machine intelligence provides a comprehensive view of plant productivity and cost-effectiveness in real time across multiple sites for an entire corporation.

These advancements are commonplace today, as Industry 4.0 cements itself in manufacturing history. It’s no surprise that digital intelligence brings increased efficiency and ROI opportunities to the forefront of smart card manufacturing. The Entrust Adaptive Issuance™ Production Analytics Solution is a software and technical consulting platform built on delivering this value proposition to card personalization operations through a customizable, real-time, aggregated architecture. It continues to drive next-generation smart manufacturing, bringing positive Overall Equipment Effectiveness (OEE) and enhanced value to operations. OEE is expressed as a percentage and is the product of three key production indicators: availability, performance, and quality. Let us take a closer look at each and how they bring a deeper understanding to issuance environments, focusing on net losses:

Availability

Availability is the percentage of planned production time the operation is available to run. It is the ratio of Run Time to Planned Production Time and considers loss, which includes events that stop planned production. Run Time is defined as the total time when machines are used for production in a running state, excluding idle and pause. Planned Production Time is defined as the total calendar time for the measurement period minus planned downtime, such as maintenance tasks, changeover, holidays, and shift breaks. Availability Loss includes any unplanned stops such as idle time, equipment failure or material shortage, and planned stops as noted above.

Performance

Performance percentage is the actual total production output compared to theoretical perfect production output, which is calculated using the machine’s maximum cycle speed. A perfect performance score would mean machines are always running at maximum speed when active. Total Production is defined as the total number of cards processed, which includes both good and rejected cards. Maximum Machine Throughput is defined as the maximum machine speed in cards per hour, also considering that throughput could be reduced based on the total number of modules used. Performance Loss includes anything that causes the manufacturing process to run at less than the maximum possible speed when it is running. Examples of things that create Performance Loss include machine wear, inferior materials, line misfeeds, and jams.

Quality

Quality is the number of good units produced as a percentage of the total units started. A perfect quality score indicates that all products are good: successfully manufactured the first time, and no rejects were produced. Simply and more realistically, total production equals good production + rejected production. Quality Loss accounts for manufactured parts that do not meet quality standards as defined by the operation. Examples of Quality Loss include scrap and rework.
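
Putting the three indicators together, OEE is simply their product. Here is a small worked sketch of the arithmetic in TypeScript; all shift numbers are illustrative placeholders, not figures from any real operation.

```typescript
// Illustrative shift numbers only (not from a real operation).
const plannedProductionMinutes = 420; // 8-hour shift minus planned stops
const runTimeMinutes = 370;           // time machines actually spent running
const maxCardsPerMinute = 20;         // machine's rated maximum cycle speed
const totalCards = 6660;              // good + rejected cards produced
const goodCards = 6460;               // cards that passed quality checks

const availability = runTimeMinutes / plannedProductionMinutes;        // ~0.881
const performance = totalCards / (runTimeMinutes * maxCardsPerMinute); // 0.900
const quality = goodCards / totalCards;                                // ~0.970

// OEE is the product of the three indicators, expressed as a percentage.
const oee = availability * performance * quality;
console.log(`OEE: ${(oee * 100).toFixed(1)}%`); // ~76.9%
```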

Net Losses

Net losses for all three production indicators can be explored further in Vorne’s OEE resources, as variations in issuance type, facilities, and machines contribute to loss variability. The Six Big Losses are also common in issuance operations and include: unplanned and planned stops, small stops, slow cycles, production rejects, and startup rejects.

To learn more about OEE and the three key production indicators, measurement & analysis, and calculating speed and efficiency, read our white paper, “Understanding Production Efficiency in High-Volume Card Issuance Operations.”

The Future of Digital Intelligence and Production Analytics

Machine learning and artificial intelligence are expected to revolutionize the industry further. These technologies will be integral in production planning, inventory management, resource planning, and cost analysis. Next-generation Industry 4.0 subject matter experts will take the output from these technologies and use it to make improvements across operational architectures, building more efficiency into the process. Digital intelligence can be used to mitigate operational bottlenecks, quality gaps, and overall line inefficiencies by focusing on the three key production indicators.

Learn more about the Entrust Adaptive Issuance™ Production Analytics Solution and how it can aid in operational efficiency, utilizing digital intelligence, data analytics, and advanced technologies essential for smart card manufacturing.

The post Production Analytics Key Indicators in Modern Card Issuance: Availability, Performance, Quality appeared first on Entrust Blog.


Northern Block

Decentralized Identifiers (DIDs): Strengths, Weaknesses, Opportunities and Threats (with Markus Sabadello)

 

🎧   Listen to this Episode On Spotify   🎧
🎧   Listen to this Episode On Apple Podcasts   🎧

About Podcast Episode

Note: This episode isn’t an introduction to DIDs; for those new to the concept, we recommend revisiting Episode 52 of the SSI Orbit Podcast for a foundational understanding.

In this podcast episode, we dive deeper into the world of Decentralized Identifiers (DIDs), moving beyond the basics to explore the critical aspects that will influence their trust and adoption.

Today, joined by Markus, we embark on a detailed discussion framed around a SWOT analysis of DIDs, examining their strengths, weaknesses, opportunities, and threats. Our goal is to foster a comprehensive conversation that not only highlights the significant advantages of DIDs but also addresses the challenges and potential pathways to their widespread acceptance. We delve into the nuances of DIDs, aiming to contribute to the broader goal of promoting their adoption.
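
As a quick refresher before the SWOT discussion: a DID is a URI (for example, did:example:123) that resolves to a DID document listing public keys and services, with no central registry required. The sketch below is a minimal illustrative document loosely following the W3C DID Core data model; all identifiers and key material are placeholders.

```typescript
// Minimal illustrative DID document, loosely following W3C DID Core.
// All identifiers and key material below are placeholders.
const didDocument = {
  "@context": "https://www.w3.org/ns/did/v1",
  id: "did:example:123456789abcdefghi",
  verificationMethod: [
    {
      id: "did:example:123456789abcdefghi#key-1",
      type: "Ed25519VerificationKey2020",
      controller: "did:example:123456789abcdefghi",
      publicKeyMultibase: "z6MkPLACEHOLDER",
    },
  ],
  // Which key may be used to authenticate as the DID subject.
  authentication: ["did:example:123456789abcdefghi#key-1"],
};
```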

We aimed to follow the SWOT model, though it proved challenging at times due to the nuanced nature of many topics. Interestingly, what some may perceive as a strength could also be seen as a weakness, leading to a blend of perspectives. This complexity added depth to our discussion, making it particularly engaging!

The full list of topics discussed between Markus and me in this podcast includes:

[00:47] Comparison of DIDs to Other Verifiable Identifiers: Exploring how DIDs contrast with other types of verifiable identifiers and their benefits over traditional identifiers like email addresses and phone numbers.
[11:50] How to trust a DID?: Identifying the technical and governance aspects necessary for trusting DIDs.
[15:46] Building Trust Around DID Methods: Discussing the challenges and approaches to building trust in various DID methods, considering technical integrity and governance.
[23:34] Multi-Factor Authentication for DIDs: Considering whether incorporating multi-factor authentication concepts into DIDs is a viable framing.
[25:34] Governance and Human Trust Inputs in DID Documents: Debating the inclusion of governance or human trust inputs in DID documents and whether these should remain purely technical.
[31:13] Placing Claims in DID Documents: Discussing the implications of including claims in DID documents, especially concerning privacy, correlation, and data privacy laws.
[39:11] Opportunities for DIDs: Identifying opportunities for DIDs to become easier to use, deploy, and provide value, including DID method discovery.
[44:04] Value of DIDs Across Different Protocols: Questioning whether a particular DID method retains the same value across different credential exchange protocols.
[52:01] Why the Market Should Focus More on Identifiers: Reflecting on the emphasis on verifiable credentials over identifiers and trust models in the context of ongoing digital identity programs.
[58:00] Working Groups and Activities for DIDs: Highlighting current working groups and activities related to DIDs where listeners can contribute and engage.

 

About Guest

Markus Sabadello has been a pioneer and leader in the field of digital identity for many years and has contributed to cutting-edge technologies that have emerged in this space.

He is co-editor of the Decentralized Identifiers specification at W3C and co-chair of the Identifiers and Discovery Working Group at the Decentralized Identity Foundation.

Markus is founder of Danube Tech, a consulting and development company that works on DID-related infrastructure and products, including the Universal Resolver, Universal Registrar, and the Godiddy.com platform.

LinkedIn: https://www.linkedin.com/in/markus-sabadello-353a0821/

X: https://twitter.com/peacekeeper

The post Decentralized Identifiers (DIDs): Strengths, Weaknesses, Opportunities and Threats (with Markus Sabadello) appeared first on Northern Block | Self Sovereign Identity Solution Provider.


PingTalk

New Account Fraud: How it Works, Detection & Prevention | Ping Identity

Thursday, 08. February 2024

Lockstep

Digital ID and “digital identity”

Terminology

Digital Identity: An amorphous concept attempting the impossible job of capturing all that a human is in digital form, leading to many years of disappointing national projects and draft bills. 

Digital ID: A single attribute or fact about an entity, represented digitally, that is relevant in one or more specific contexts.

A big little shift in terminology 

The draft Australian federal Digital ID Bill no longer uses the phrase “digital identity”. The same shift in language has been made by the National Australia Bank (NAB) in its year-long multi-sector design collaboration.

Both the NAB and the Australian government now refer to Digital ID. 

This change is much more than semantics. I hope it reflects a recognition by government that identification is different from identity, and there is no universal way of identifying people. 

NAB has highlighted that businesses and agencies are usually less interested in verifying “identity” than a particular attribute or eligibility. The bank’s roundtable report states “it is very rare that individuals need to prove who they are” and instead suggests a reframing of digital identity around “what do we need to know about an individual for a particular purpose”; that is, the process of identification in all its local variations.  

So what does Digital ID mean? Well, it’s just shorthand for a credential in a certain context. 

I reckon everyone knows what it means to be asked to “show some ID”.  We effortlessly interpret “ID” in context, whether it’s proof of age at a bar, a student card when sitting an exam, or an employee ID when entering the office. 

We’re also familiar with the formal identification process carried out when opening a bank account or starting a new job. In these cases, we are generally asked to provide a number of IDs, from a pretty universal set of options including driver licence, birth certificate, Medicare card, passport and so on. And there are alternative forms of ID for people who need them.  

Fresh thinking; new precision

None of these IDs constitute our identity. And that’s how we like it. 

As a rule, we should ask for and reveal as little personal information as possible, within the requirements of the transaction we’re doing. As the Canadian digital identity leader Tim Bouma once said, “I want my identity to be less than the real me”.

Digital ID is concrete, specific and familiar. But “digital identity” is abstract and open-ended. It means different things to different people. Invariably, digital identity is interpreted as a new universal means of proving who we are. But there is no such thing. 

So instead of imposing new identification standards and novel “digital identity” on Australians, the Digital ID Bill simply creates a governance regime to improve the quality and reliability of existing IDs when converted to digital form. 

The Australian Government Digital ID System (AGDIS) will conserve the meaning of existing IDs and leave businesses and agencies free to conduct identification as they see fit. The Digital ID regulator is being tasked to create rules and technology standards, with a view to fostering a marketplace of vastly improved digitised IDs. 

Best in class

I believe the AGDIS represents world’s best practice in digital identification. 

The AGDIS rules (to be developed after the Bill is enacted) will almost certainly embrace mobile phone digital wallets as carriers of Digital IDs. A large and fast-growing proportion of Australians are now familiar with using their payment cards in mobile wallets to “tap and pay” and “click to pay” (the Reserve Bank reports that 35% of card payments are now made via mobile wallets). 

The equivalent user experience of click-to-present any digital ID in context will be intuitive, widely available and vastly more secure than today’s plaintext handling of personal data. 

We will be able to prove relevant facts about ourselves with the same security, safety and privacy as we prove our credit card details. 

We’ve done this before

Smart payment cards (EMV chip cards) were introduced to replace magnetic stripe cards as they became vulnerable to skimming and counterfeiting. Chip-assisted presentation of cardholder IDs replaced the plaintext of a mag stripe and, as a result, essentially eradicated physical card fraud. 

Based on the speed with which smart credit card IDs cut payment fraud, we can expect mobile technology Digital IDs to dramatically reduce identity crime via stolen personal data and, as a result, neutralise the personal data black market.

The post Digital ID and “digital identity” appeared first on Lockstep.


KuppingerCole

IAM and Cybersecurity Insights and Trends in 2024

Martin Kuppinger was a virtual guest at SailPoint's SAILforward24 and was invited to have a chat with Matthew Mills about the industry and market, trends and his perspectives.

Delve into the future of identity and access management (IAM) as Martin and Matthew discuss the profound impact of AI, decentralized identity, and the regulatory landscape. Gain insights into cybersecurity trends and learn how SailPoint addresses these challenges.




HYPR

Five Tips to Strengthen Know Your Employee (KYE) Processes

The modern work environment creates situations unthinkable a decade ago. Today, many employees will never step foot in an office or have face-to-face interactions with colleagues. While this flexible work model offers advantages for businesses as well as employees, it also presents significant security challenges, especially when it comes to reliable identity verification.

Employee identity risk and fraud have skyrocketed in recent years. The FBI recently warned that fraudulent job applicants are using deepfakes and stolen PII to obtain remote work positions. “Bait-and-switch” schemes are also on the rise, where the person interviewed by a company is not the person who is actually hired. In the best case, companies end up hiring someone completely unsuitable for the position. In the worst case, they onboard and provide access to cybercriminals who can install malware, steal confidential and privileged information, and more.

The risks go far beyond hiring and onboarding. Just this week, a multinational finance firm revealed that an employee was tricked into making a $25 million wire transfer by deepfake video and audio. The thieves used AI-generated likenesses of the company’s CFO and other known colleagues in a live video conference with the finance worker.

Know Your Employee (KYE) ensures that your employees are who they say they are. 

What Is Know Your Employee (KYE)?

Know Your Employee, commonly abbreviated as KYE, is the process of verifying the identities and backgrounds of an organization’s current workforce and prospective new employees. KYE aims to ensure that employees are who they claim to be, and do not have a criminal background that poses an undue risk.

The KYE process continues past initial hiring and onboarding. Effective KYE involves continuous monitoring and regularly updating and reviewing employee information to identify any potential risks that may arise.

KYE vs. KYC

Know Your Customer (KYC) and Know Your Employee (KYE) are two essential processes in the realm of compliance and risk management. KYC verifies customer identities to determine if they are who they claim to be and discover any potential risks, ensuring adherence to anti-money laundering (AML) and other regulations. In contrast, KYE involves verification of employee identities and thorough background checks to reduce the risk of employee fraud and theft, and address internal compliance needs.

Both KYC and KYE share the common goal of mitigating risks, upholding legal standards, and preventing fraud. The main distinction is in their focus: KYC is customer-oriented, while KYE centers on internal workforce dynamics. KYC is a point-in-time process, while KYE should be continuous throughout an employee’s tenure. Integrating both processes is crucial for a comprehensive risk management framework in business.

The Growing Importance of Know Your Employee (KYE)

Identity verification poses an ongoing challenge for organizations. In fact, 40% of IT security experts named it a top identity security challenge in the 2024 Workplace Identity Security Trends report. This is likely to climb as new attack and fraud tactics make accurate identity proofing and verification increasingly difficult. A recent report by Onfido found a 3000% increase in deepfake attempts in 2023, thanks to face-swapping apps, generative AI and other readily available fraud tools.

The risks go far beyond non-compliance. Employee fraud tactics such as interview fraud, bait-and-switch scenarios, and unauthorized outsourcing, mean that the person doing the work may not be the person you interviewed or hired. This creates massive security risks for organizations.

Moreover, identity risks persist throughout the employee lifecycle – not just at hiring and onboarding. The recent breach of MGM Resorts succeeded because attackers were able to impersonate an employee and convince the IT help desk to provide them with credential access to the corporate IAM service. Robust, continuous KYE processes could have prevented the attack. In this evolving landscape, a comprehensive KYE strategy is crucial for organizations to stay ahead of emerging threats, ensuring the integrity of their workforce and safeguarding against potential malicious activities.

Tips for Implementing Know Your Employee (KYE)

While every organization will approach KYE differently depending on their specific needs and structure, there are basic principles that hold across all workplaces.

Employ Multiple KYE Verification Methods

A thorough KYE approach requires layering multiple verification methods — which ones, and when to invoke them, will depend on the risk level of the employee verification scenario as well as organizational and environmental risks at play. Verification methods include document verification, facial biometric verification, live video, chat verification, as well as passive signals such as device/endpoint and location detection.

As we’ve seen, identity fraud techniques evolve rapidly — it’s critical that your KYE methods keep up. A robust KYE strategy should incorporate advanced technologies such as liveness detection to detect deepfakes and other AI-aided fraud.

Make It Easy for Your Admins and Employees

It’s human nature to circumvent processes that add an unreasonable burden. When done well, KYE processes should be fast and easy for employees and administrators alike. This means automating processes as much as possible, requiring the right level of identity verification for the scenario, and using technologies that people are familiar with, like their smartphones. Of course, security cannot be sacrificed for usability.

Continuously Monitor for Risk

Employee identity risks do not end with secure onboarding. Higher risk scenarios arise throughout an employee’s career. Resetting credentials, getting a new phone, logging in from unknown locations, gaining increased access privileges, personal financial problems, active cyberattacks — all can be situations of elevated risk. Your KYE controls should be ready to address these, taking specific measures according to risk level. 
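To make this concrete, below is a minimal TypeScript sketch of what risk-tiered step-up logic could look like. The signal names, weights, and thresholds are hypothetical illustrations, not HYPR's actual scoring model.

```typescript
// Hypothetical sketch of risk-tiered KYE step-up logic. Signal names,
// weights, and thresholds are illustrative, not any vendor's actual API.

type RiskSignal =
  | "credential_reset"
  | "new_device"
  | "unknown_location"
  | "privilege_escalation";

const SIGNAL_WEIGHTS: Record<RiskSignal, number> = {
  credential_reset: 40,
  new_device: 25,
  unknown_location: 20,
  privilege_escalation: 50,
};

type VerificationLevel = "none" | "document_check" | "live_video";

// Map an aggregate risk score to the verification step required
// before the requested action proceeds.
function requiredVerification(signals: RiskSignal[]): VerificationLevel {
  const score = signals.reduce((sum, s) => sum + SIGNAL_WEIGHTS[s], 0);
  if (score >= 60) return "live_video";     // high risk: re-prove identity
  if (score >= 25) return "document_check"; // elevated risk: lighter check
  return "none";                            // routine activity
}

// Example: an employee resets credentials from an unknown location.
console.log(requiredVerification(["credential_reset", "unknown_location"]));
// -> "live_video"
```

The exact weights matter less than the pattern: risk signals accumulate, and crossing a threshold triggers a proportionate re-verification step rather than a blanket challenge on every action.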

Don’t Operate in a Silo

It’s critical that your KYE processes and tools are connected and synchronized to your workforce identity security systems and HR system of record. The users accessing digital services may differ from the identities that IT believes they have. In other words, the digital identity may not match the real world identity. Your employee onboarding and identity verification procedures should be directly tied to your authentication processes, ideally using phishing-resistant, passwordless methods. As mentioned, it’s also important to invoke re-verification at certain times of increased risk. This means connecting your identity risk monitoring to your Know Your Employee (KYE) systems.

Keep Up With Current Regulations and Compliance

In the current regulatory landscape, it's not just ethical but imperative for businesses across various industries to comply with applicable statutes and standards. The incorporation of a robust "Know Your Employee" (KYE) process is a key aspect of this compliance. These checks are vital for verifying credentials, ensuring workforce integrity, and meeting legal obligations.

The consequences of non-compliance are severe, underlining the indispensability of KYE in any organization's overall compliance strategy. For example, a comprehensive KYE process is vital for Anti-Money Laundering (AML) compliance, helping identify potential risks and preventing misuse of the financial system.

Simplify and Strengthen Your KYE Controls With HYPR

HYPR Affirm is a cutting-edge identity verification solution designed specifically for large organizations to ensure trust in the identities of your workforce across the entire employee lifecycle. Utilizing AI-powered chat, video, and face recognition technologies, HYPR Affirm offers a secure and fully passwordless method for confirming employee identities. It provides IAL2 compliance and supports step-up re-proofing based on user behavior or risk.

Learn more about HYPR Affirm in this product brief. To see how HYPR can help your organization, schedule a personalized demo tailored to your identity security interests and needs.

FAQs

Q: What is Know Your Employee (KYE)?

A: Know Your Employee is the process of verifying the identities of an organization’s workforce to make sure that they are who they say they are, and that they do not pose a risk to the company.

Q: Why is KYE important?

A: KYE ensures that you do not hire a fraudulent candidate and that only legitimate employees can access company data and resources.  

Q: What are examples of real situations where it’s important to apply KYE processes?

A: KYE should be applied through the employee lifecycle, beginning with their first interactions with your company. This includes interviewing, at onboarding, at times of high-risk transactions, and other moments that bring an increased risk of fraud or attack.

Q: What is HYPR Affirm, and how does it facilitate know your employee (KYE)?

A: HYPR Affirm is tailored toward workforce identity verification requirements. It utilizes multiple identity proofing and verification methods, such as advanced biometrics and liveness detection, to guarantee that only authorized individuals can access corporate systems and data.


Holochain

Holochain: A New Link in Web3

Binding to EVMs

Holochain provides blockchains with auditable applications that can be tied to on-chain transactions, opening new possibilities for NFTs, DAO communities, and more.

Binding Wallets to Holochain Apps

We’ve developed a proof of concept for binding together Holochain agent keys with public keys on blockchains. Holochain uses a system of source chain validation to ensure that any action taken in an application is auditable, creating a secure environment for most application needs. By binding a Holochain agent with a public key in the Ethereum Virtual Machine (EVM), data created by the Holochain agent can be associated with the EVM key. This setup allows for unforgeable and bidirectional verifiability, facilitating provenance confirmation on both Ethereum and Holochain.
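As an illustration of the pattern (a sketch, not the proof-of-concept code itself), the following TypeScript binds the two keys by having each sign an attestation of the other. It assumes the ethers v6 library for the EVM side and uses a Node.js Ed25519 keypair as a stand-in for the Holochain agent key, since agent keys are Ed25519 keypairs.

```typescript
// Illustrative sketch of binding an EVM key to a Holochain agent key by
// mutual signatures. Assumes ethers v6; node:crypto Ed25519 stands in for
// the agent key (a real app would sign via a zome call instead).
import { Wallet, verifyMessage } from "ethers";
import { generateKeyPairSync, sign } from "node:crypto";

const evmWallet = Wallet.createRandom();
const agent = generateKeyPairSync("ed25519"); // stand-in agent keypair
const agentPubKey = agent.publicKey
  .export({ type: "spki", format: "der" })
  .toString("base64");

// 1. The EVM key attests to the Holochain agent key...
const evmSignature = await evmWallet.signMessage(`binds:${agentPubKey}`);

// 2. ...and the agent key attests to the EVM address.
const agentSignature = sign(
  null,
  Buffer.from(`binds:${evmWallet.address}`),
  agent.privateKey,
).toString("base64");

// Either side can now check the other direction. For the EVM half,
// recovering the signer's address verifies the attestation:
const recovered = verifyMessage(`binds:${agentPubKey}`, evmSignature);
console.log(recovered === evmWallet.address); // true

// Publishing both signatures (one on-chain, one to the Holochain DHT)
// is what makes the binding unforgeable and verifiable from both sides.
```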

Validated Content for NFTs

Rather than storing NFT payloads in unvalidated and/or centralized storage, with Holochain you can create and validate the payload and the reference for an NFT fully within a distributed context. The demo below shows that you can also manage the minting and transfer of the NFT from a Holochain app precisely because you can first bind your wallet to your Holochain agent. By storing payloads for NFTs in Holochain, you can circumvent the storage limits of blockchain, providing a more validated and ultimately more dynamic solution than other current implementations like IPFS.

Notaries and Oracles

Combining patterns from oracles and notaries with agent binding on Holochain opens up new ways to move between peer-to-peer systems via “cross-chain” proofs.

Smart contracts pre-authorize the public keys of agents from whom they will accept proofs. The proofs are signed verifications that reference actions or decisions taken in Holochain, all of which are also transparent and verifiable by any member of the application. This enables scalable but still verifiable sensemaking.

An example of how this could be used is with DAOs. Large-scale voting on-chain is costly, both in gas fees and network traffic. But a voting app on Holochain can provide the trust and transparency of blockchain without the gas fees, and then the fully verifiable decision can be notarized, creating a proof that is used to make a single on-chain transaction. The cost savings to both individual DAO members and the organization as a whole are potentially immense. This also provides greater flexibility, as DAOs can now afford to conduct all of their business in validated environments.
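A minimal sketch of that notary flow, with hypothetical names and a simplified tally shape: the off-chain decision is hashed and signed once by a pre-authorized notary key, so the chain verifies one signature instead of processing every vote.

```typescript
// Hedged sketch of the notary pattern: a vote tally computed off-chain is
// hashed and signed by a pre-authorized notary key, yielding one compact
// proof for a single on-chain call. Names and shapes are illustrative.
import { Wallet, keccak256, toUtf8Bytes, verifyMessage } from "ethers";

interface Tally { proposalId: string; yes: number; no: number; }

const notary = Wallet.createRandom(); // address pre-authorized on-chain

async function notarize(tally: Tally) {
  // Deterministically encode and hash the off-chain decision...
  const digest = keccak256(toUtf8Bytes(JSON.stringify(tally)));
  // ...and sign it once, instead of one transaction per voter.
  const proof = await notary.signMessage(digest);
  return { digest, proof };
}

// What a smart contract (or any verifier) would check before acting:
async function acceptProof(tally: Tally, proof: string, authorized: string) {
  const digest = keccak256(toUtf8Bytes(JSON.stringify(tally)));
  return verifyMessage(digest, proof) === authorized;
}

const tally = { proposalId: "prop-42", yes: 812, no: 97 };
const { proof } = await notarize(tally);
console.log(await acceptProof(tally, proof, notary.address)); // true
```

A production version would need a canonical encoding rather than plain JSON.stringify, but the shape of the saving is visible: the gas cost is constant no matter how many members voted.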

Explore other benefits Holochain brings to DAOs.

Data Storage

Given the two examples above, you can see that what Holochain provides to blockchains is validated data storage. While the examples given talk about two major concerns for current blockchain development, you can store or interact with any data you choose to program into an application. Holochain is an incredibly flexible framework — the validation rules of each application can be customized based on the security and design needs of a particular use case. The combination of energy and cost efficiency, auditability, and flexibility makes Holochain a perfect choice for applications that need to interface with blockchain.

Plus, it’s all coded in Rust — with UIs using your favorite JavaScript framework — making it even easier for Ethereum devs to use and build with.

Learn More and Connect

Learn how Holochain can extend Web3, and get involved by introducing yourself to our team and community on Discord or by joining one of our upcoming events.


A Strange New Contraption

Sum>Parts

New technology is tough to explain sometimes. But Holochain isn’t reinventing the wheel. Rather, we are inventing the bicycle when all you have ever seen or used are wagons.

Holochain is built from familiar technical components and best practices, just as a bicycle is built from wheels, gears, frames, and steering mechanisms which all previously existed as basic forms or principles in other contexts.

Sum>Parts: The key components of Holochain already existed, but we put them on a new frame

Our back wheel is the Distributed Hash Table, similar to the ones used by BitTorrent or IPFS to revolutionize peer-to-peer file sharing. This is the database structure that allows data to be transferred in the context of a distributed app.

Our front wheel is built on a lot of the same principles as Git which pioneered local-first version control as we know it today. While our local-first tools are built out of slightly different materials, the principle remains: Agents can always make local changes which then update the network when connected online. Not dissimilar to how the front wheel determines the direction of the bike.

These wheels are powered by a familiar device, the bike chain. Holochain’s source chain uses an implementation of hashchains, which were popularized by blockchains, to connect one action to the next, every action containing the hash of the last. Each agent has their own bike and so they each update their own source chain, independent of the actions of other cyclists. This provides mobility compared to the wide wagon of a single central chain. The beauty of bikes is their independence, the ability to act without everyone needing to be in the same wagon.

The handlebars that steer this contraption are also familiar. Public/private key cryptography used for cryptographic signatures is a staple of most processes on the web, from HTTPS, to the Signal Protocol, to SSH which is used by devs everywhere to access remote machines. Alongside encryption, these keys act as the identity tools which connect the user, the human riding the bike, to their actions on the digital road. Every action they take is signed, controlled through those handlebars, and only the rider is able to steer.
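Putting the back wheel, the chain, and the handlebars together, here is a toy TypeScript sketch of a signed hash chain in the spirit of a source chain. The types are deliberately simplified; Holochain's real source chain structures are richer than this.

```typescript
// Toy sketch of a signed source chain: each action carries the hash of the
// previous action and the agent's signature over its own content.
import { createHash, generateKeyPairSync, sign, verify } from "node:crypto";

interface Action { payload: string; prevHash: string; signature: string; }

const agent = generateKeyPairSync("ed25519"); // one keypair per agent

const hashOf = (a: Pick<Action, "payload" | "prevHash">) =>
  createHash("sha256").update(a.payload + a.prevHash).digest("hex");

function appendAction(chain: Action[], payload: string): Action[] {
  const prevHash = chain.length ? hashOf(chain[chain.length - 1]) : "genesis";
  // Signing payload + prevHash commits the agent to this exact history.
  const signature = sign(
    null, Buffer.from(payload + prevHash), agent.privateKey,
  ).toString("base64");
  return [...chain, { payload, prevHash, signature }];
}

// Any peer can replay the chain, checking both the links and signatures.
function validateChain(chain: Action[]): boolean {
  return chain.every((a, i) => {
    const expectedPrev = i === 0 ? "genesis" : hashOf(chain[i - 1]);
    const sigOk = verify(
      null, Buffer.from(a.payload + a.prevHash),
      agent.publicKey, Buffer.from(a.signature, "base64"),
    );
    return a.prevHash === expectedPrev && sigOk;
  });
}

let chain: Action[] = [];
chain = appendAction(chain, "create profile");
chain = appendAction(chain, "post message");
console.log(validateChain(chain)); // true
```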

Holochain pulls together all of these time tested tools and reconfigures them to do something none of them can do alone.

We’ve been talking about Holochain from the perspective of the agent — the rider and their bike. But Holochain is an application framework. In this analogy, every application is a unique path that controls where a rider can go, what actions can be taken, and how data is formatted. We talk about this from the perspective of the rider because they get to determine which paths they ride. Riders bridge between paths, connecting them into personalized neighborhoods. Each new app increases the area that the rider can travel and grows an ecosystem of potential routes. Different paths take you to different places, but Holochain lets you move between them seamlessly.

The affordances of each road dictate what can be done on it. Wide roads let many ride together, small curvy paths mean you slow down, etc. In the same way, the application rules are baked in. Changing them would be the same as changing roads.

As a framework for peer-to-peer applications, Holochain is taking on some of the toughest challenges of distributed computing. Ensuring data is validated and secure without central servers, allowing for asynchronous action across a network, and keeping agents honest, are all hard tasks. But each individual piece of this puzzle already exists; we are just recombining them to offer new possibilities.

Holochain is built to allow any group to come together, develop a bike path, and each with their own bike traveling at their own pace, ride the path together. Many different bike paths can overlap and intersect, and riders get to choose their route through them — traveling freely and at their own will.


Elliptic

Regulatory Outlook 2024: How crypto policy and regulation will shape up around the globe

As we’ve noted recently, 2024 is already shaping up to be a year of exciting and intense activity on the crypto regulatory and policy front.


Shyft Network

ESMA Seeks to Curb Non-EU Crypto Firms’ EU Presence

ESMA’s latest guidelines clarify that non-EU crypto firms can only serve EU residents who approach them first. The guidelines prohibit crypto firms from providing any additional services unrelated to the initial request from their EU clients. The draft guidelines seek to thwart any attempts by third-country firms to bypass the rules while also encouraging stakeholders to provide feedback to improve ESMA’s approach.

Recently, the European Securities and Markets Authority published a consultation paper discussing the draft guidelines on reverse solicitation under MiCA. Here, reverse solicitation refers to a situation where a client based in the European Union approaches a third-country firm on the client’s own exclusive initiative, rather than being solicited, directly or indirectly, by the firm.

The ESMA’s main purpose with these guidelines is to specify when third-country firms can offer crypto services to EU clients and to prevent firms from circumventing the reverse solicitation rule.

The agency also aims for these guidelines to be consistent, efficient, and effective when rolled out for implementation. For that, ESMA has urged all stakeholders, including crypto-asset service providers (including from third countries) and financial entities dealing with crypto-assets, to respond to this consultation paper.

Responses should address the questions asked, provide explanations, and suggest alternative options for the ESMA to consider.

When are Non-EU Crypto Firms Allowed to Offer Services to EU Residents?

The latest ESMA guidelines say that non-EU crypto firms can provide their services to EU residents only when the client requests them first. Even after that initial contact, the firm can’t offer additional crypto-assets or services unless they relate to what the client asked for in the first place.

Instructions for Relevant National Regulators

To ensure an interaction is genuinely initiated by the EU-based client, the guideline suggests regulators look at how and when the client approached the firm. Was it truly on their own, without any nudges from ads or emails?

The draft guidelines also provide instructions to national agencies on how to decide if a third-country firm is marketing a new kind of crypto asset or service. They suggest that national authorities look at two things:

The type of crypto asset or service being offered.
The risks involved with the new crypto asset or service.

It also notes that MiCA only mentions three crypto asset types (asset-referenced tokens, electronic money tokens, and tokens that don’t fall into these categories), and such broad categories might enable third-country firms to circumvent the rules. So, ESMA urges relevant national authorities through these guidelines to:

Monitor the marketing efforts that third-country firms deploy in targeting EU-based clients.
Conduct relevant consumer surveys.
Keep tabs on clients’ complaints and whistleblowing reports.
Cooperate with authorities, including law enforcement, as and when required.

Concluding Thoughts

Overall, the draft guidelines offer insights and guidance on when third-country firms can approach clients in the EU and provide strategies for detecting any attempts to bypass these rules. And since it’s a consultation paper, stakeholders are invited to share their feedback on ESMA’s approach, identifying any potential gaps or effectiveness concerns. Responses are accepted until April 29, 2024.

______________________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up to date on all things privacy and compliance.

ESMA Seeks to Curb Non-EU Crypto Firms’ EU Presence was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Aergo

Aergo Mainnet Update : Aergo v2.5.1


All Aergo networks including the mainnet, testnet, and the alphanet have been updated to v2.5.1 today, Feb 8th, 2024. This release contains an encryption algorithm upgrade.

Overview
This release includes bug fixes for a potential hard fork and networking optimization improvements, and starts laying the groundwork for the Web3 environment.

Improvements
- Improved internal status DB storage structure
- Changed the default encryption algorithm
- Added web3 API

More details can be found in the v2.5.1 release note. Details regarding previous releases and their objectives can be found in the earlier release notes, which outline the enhancements and improvements targeted in each version.

Notes
1) To take full advantage of Aergo v2.5.1 and to maintain network stability, upgrading your nodes is essential, and we strongly advise you to do so. The Aergo team provides the latest snapshot DB on Aergo Snapshots to speed up full node synchronization. Please delete the current DB data from your nodes and proceed with the latest snapshot downloaded from Aergo Snapshots for a successful DB migration.

2) The issue report detailing the recent mainnet stall has been posted on Medium, ensuring transparency regarding ongoing developments and resolutions.

We apologize for any inconvenience this may cause and appreciate your understanding.

Aergo Mainnet Update : Aergo v2.5.1 was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.


Post Mortem — Aergo Stall, February 1st 2024


On February 1st at 13:13 KST, Aergo’s mainnet halted new block generations, which resulted in a temporary outage. The core dev team immediately responded to the issue, successfully restoring the network by applying a hotfix to address the root cause.

To be clear, no funds were at risk as a result of the network halt.

Cause

Block generation was halted due to a bug in the Fee Delegation transaction processing within the Smart Contract engine.

Recovery Timeline

February 1st, 2024:

13:13 — Block generation stopped at block number 150677807.
15:30 — Blocked external nodes to enable communication exclusively between the Trusted Nodes (whitelisted based on IPs). The initial block regeneration attempt failed.
17:00 — Commenced testing the hotfix development version and synchronization among the BPs and the Trusted Nodes.

Continued on February 1st:

19:00 — Snapshot data prepared.
21:00 — Completed the hotfix development and the local test was successful.

February 2nd, 2024:

01:30 — Block generation resumed.
08:00 — Ready to snapshot Block Producers (BPs) and initiated synchronization between BPs.
08:50 — Snapshot prepared for the Trusted Nodes, and the hotfix was tested on the Alphanet and the testnet.
10:20 — Initiated synchronization test of the hotfix on the mainnet.
13:10 — Hotfix development version (2.5.1-dev) deployed on the mainnet with BPs and the Trusted Nodes.
14:20 — Initiated synchronization for the mainnet Trusted Node 1 and dApp operation resumed.
16:00 — Completed recovery based on Mainnet BP/Trusted Nodes.

We will continue to strive to keep the Aergo blockchain platform stable. We apologize for any inconvenience this may cause and appreciate your understanding.

Post Mortem — Aergo Stall, February 1st 2024 was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. February 2024

Indicio

How governance unlocks the value in verifiable identity and data

Governance enables ecosystems for sharing data — it’s the secret sauce driving value, and the permutations are limitless. To stay updated on all things governance, sign up for the Indicio Governance Newsletter.

By Trevor Butterworth

One way to understand governance as a component of decentralized identity is to look at identity verification and data sharing as an “ecosystem.”

An ecosystem is any use case where you have:

One or more parties issuing verifiable credentials for digital identity to people, organizations, or devices.
People, organizations, or devices who then hold the credentials to prove their identity and share data within the credential with other parties.
People, organizations, or devices that need to verify the identity of those presenting verifiable credentials, and the data within their credentials, to accomplish some goal.

In an ecosystem, participants are interested in the same kind of data to solve the same kind of problems. This could be who is allowed to access a facility, an operational system, or a software application.

Governance makes this happen.

The governance authority establishes who can issue credentials. If there is only one issuer, say in the case of a company issuing employee credentials, the company is the governance authority.

The authority establishes:

Who can be issued a credential.
Who else can issue credentials.
What information is going to be issued in a credential.
How that information is to be interpreted — if x, then y.
How that information can be shared in ways that comply with data privacy regulations.

While there are other technical aspects to governance (choosing credential formats, DID methods, protocols, cryptographic key management etc), the key governance actions can be reduced to two workflows:

Which credentials can I trust?
What data needs to be presented and authenticated, and how does it need to be presented?

These questions are solved by machine-readable governance. The governance authority publishes a file in a format that can be read by the software required and held by each participant in the ecosystem.
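As a rough illustration, a machine-readable governance file might look like the hypothetical schema below, which answers both workflow questions with simple lookups. The format is invented for this sketch; it is not Indicio's actual governance file format.

```typescript
// Hypothetical machine-readable governance file and the two checks it
// drives. The schema is illustrative only.

interface GovernanceFile {
  trustedIssuers: Record<string, string[]>;    // credential type -> issuer DIDs
  presentationRules: Record<string, string[]>; // use case -> required attributes
}

const governance: GovernanceFile = {
  trustedIssuers: {
    EmployeeCredential: ["did:example:hospital-hr"],
  },
  presentationRules: {
    // To open the pharmacy door, present role and department from a
    // credential issued by a trusted issuer.
    pharmacyAccess: ["role", "department"],
  },
};

// Which credentials can I trust?
function isTrustedIssuer(credType: string, issuerDid: string): boolean {
  return governance.trustedIssuers[credType]?.includes(issuerDid) ?? false;
}

// What data needs to be presented?
function requiredAttributes(useCase: string): string[] {
  return governance.presentationRules[useCase] ?? [];
}

console.log(isTrustedIssuer("EmployeeCredential", "did:example:hospital-hr")); // true
console.log(requiredAttributes("pharmacyAccess")); // ["role", "department"]
```

Because every participant holds the same file, updating the ecosystem's rules becomes a matter of publishing a new version rather than redeploying software.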

Once you have an ecosystem for a specific use case, you can extend the governance to other use cases, depending on how robustly you have authenticated the person, organization or device that you issued the credential to.

Governance would enable a credential issued to an employee for accessing a hospital to also grant access to a specific database or device. Governance could allow this credential to be used as a proof of identity for specific organizations that provide employee benefits. So you’ve gone, just through a simple addition of code, from opening a door to opening an insurance policy.

Two different hospital systems could agree and use machine-readable governance to authorize doctors in one system to work in another.

The fact that these rules can be rendered in a machine-readable format makes implementation and updating fast and simple. You can rapidly reconfigure your ecosystem to manage new data and requirements.

And once you are able to use verifiable identity and data, the permutations are limitless. Governance makes all these permutations work and enables expansion and scale.

It turns trust into an actionable, valuable commodity.

Questions? Contact us to learn more about Indicio Proven®, our complete verifiable credential solution. Or, to stay updated on all things governance, sign up for the Indicio Governance Newsletter.

The post How governance unlocks the value in verifiable identity and data appeared first on Indicio.


Entrust

Creating a Digital Card Portfolio – How Entrust Supports IT Teams With a Seamless Customer Experience


Everywhere you turn, it seems there’s a new process, a new technology, and a new consumer preference that is “going digital.” More and more cardholders around the globe are preferring, and expecting, a digital-first payment experience. Because of the technology behind it all, IT professionals are at the center of this digital transformation. The pandemic accelerated an awareness of and migration to contactless and digital payments, and the payments ecosystem is now one of the fastest movers when it comes to digitization.

51% of U.S. cardholders state that their mobile banking app is their preferred method to engage with their financial institution. This data shows us that customers are looking for seamless, digital-first payment and card management experiences. With that data guiding us, it’s clear an elevated and enriched issuance and payment enablement process is now the gold standard to meet the demand of cardholders.

Bridging Physical and Digital Experiences for Cardholders

That gold standard experience – providing a compelling, seamless digital card and payment experience – can help a bank to achieve top-of-wallet status. We learned in a recent survey among 1,000 U.S. cardholders that digital card controls in the banking app make the cardholder more likely to use the card as a default payment method. This has led to some banks going all-in with fully featured, integrated digital card solutions. Others, especially small to midsize banks, have asked their IT teams to implement some fundamental digital features as building blocks to their digital card platform – a foundation that can be enhanced and expanded by partnering with a digital card solution provider to offer cardholders a truly digital-first payment experience.

But financial institutions have different challenges when it comes to enabling intuitive yet secure digital payment options for their cardholders. The building blocks themselves, like choosing the right digital features, knowing which features are most requested by cardholders, and meeting the demands of different demographic groups, can be challenging to organize and understand. Similarly, it’s important to consider a solution that is scalable and future-proof to ensure the biggest ROI. Most IT teams have heavy resource constraints with a multitude of existing projects. Smaller financial institutions may find it difficult to build out their own digital card solution, while bigger institutions will think closely about the existing infrastructure in place. For example, by leveraging what they’ve already put in place and partnering with Entrust, a financial institution’s IT team can answer the need for an even more digital and seamless customer experience.

Simplifying Digital Card Implementation

The Entrust Digital Card Solution is a single mobile software development kit (SDK) that’s easy to implement and opens a world of possibilities. The entire menu of capabilities becomes immediately available, but with the flexibility and scalability to pick and choose the right configuration for your customers at that point in time, adding more capabilities seamlessly as the organization’s payments enablement strategy evolves.

This innovative strategy shifts an IT team’s highly complex digital-first strategy to a unified and simplified integration for digital payments enablement. Our SDK or web service integration aims to create faster time to value with minimal intervention on behalf of IT. Our implementation helps IT teams overcome common challenges like limited internal resources in small departments, total cost of ownership across multiple features, and time to market.

Choosing a solution developed by experts focused on the digital card makes adding new integrations for the cardholder a much smoother, less time-intensive process. Ultimately, these specific capabilities enable IT teams to focus on mission-critical initiatives while promoting a fully integrated digital card solution.

To learn more about the Entrust Digital Card Solution, download our white paper and register for our webinar hosted by our financial issuance experts.

The post Creating a Digital Card Portfolio – How Entrust Supports IT Teams With a Seamless Customer Experience appeared first on Entrust Blog.


Holochain

Making Sense of Identity in the Digital Age

#HolochainChats with Philip Sheldrake

As digital transformation accelerates across industries, technology thought leader Philip Sheldrake brings an urgent yet nuanced perspective on constructing digital identity systems centered on human needs.

With a perspective spanning engineering, computer science, and involvement in decentralized projects, Sheldrake understands the intricacies of designing digital ecosystems.

However, through his interdisciplinary lens, he sees profound gaps in how identity gets formulated in computational realms versus lived human experience.

In the same way that the internet has brought a series of helpful technological advancements, it’s also brought unintended consequences.

For Sheldrake, identity may be the premier challenge to align digital progress with human flourishing rather than harm.

By drawing out insights from his expertise in bridging social and computer sciences, this article explores the past, present, and most hopeful future pathways for digital identity.

Understanding and Describing Identity

Even defining digital identity proves slippery, as technologist Philip Sheldrake explains:

“It’s so difficult to actually even describe the space we're about to talk about. It's astonishing how few words have real meaning in this space.”

Identity intersects technology, business, government, philosophy, culture, and our personal lives. Thus, no singular perspective captures its complexity.

Sheldrake notes, “There are just a few throwaway phrases that just mislead people because they're so ubiquitous, they hardly mean anything anymore.” 

Concepts like innovation, transformation, and disruption get used to the point of losing meaning. Making progress requires bridging disconnected outlooks. For example, social scientists and computer programmers use distinctive language and assumptions, often talking past each other. Successful digital systems blend these varied perspectives and provide enough interdisciplinary expertise to clarify complexity.

Sheldrake brings decades of experience working across domains, from helping launch open source browser Firefox to involvement with Ethereum and decentralized technology projects. He sees identity as the crux where technology and humanity meet — for better or worse:

“I am obsessed with the potential for digital technologies to lend a hand in human flourishing. Except that, so far, the experiment of the digital has thrown up as many negative externalities… as positive. Little things, you know, like undermining democracy.”

The alluring promise of technology remains unfulfilled until we construct frameworks elevating both computer science innovations and social scientific understandings of identity.

Technology Changes Identity Dynamics Over Time

While databases and algorithms remain constant, the societal outcomes and experiences with digital systems transform remarkably across months and years. Sheldrake explains how innovations go through a “series of phases” regarding consequences as usage patterns and cultural incorporation structurally mature.

In the early days of social media, few guard rails for behavior existed, enabling exploratory identity play. Yet, as digital engagement became mobile and pervasive, external constraints tightened. Structures accreted, along with probabilities of harm.

Sheldrake gives the example of shifting sentiment around decentralized technology projects like blockchain. Initial optimism about democratization gave way to backlash concerning crypto’s environmental impacts. This reveals less about the fundamentals of blockchain itself changing and more about the emergence of unintended externalities at scale.

Contrasting Views on Identity

There are fundamentally different perspectives on identity in computer science and social science. Sheldrake points out, “In computer science...nothing is addressable in internetworking until it has an entity and identifier.” 

Rigid constructs like legal identities and cryptographic claims of personhood dominate programming approaches.

Meanwhile, in sociology and psychology, “identity is a sense-making capacity...a process, it’s a verb.” Human identity lives between people and contexts, ever-negotiated through information exchange. Instead of static entities, constantly co-created meaning flows.

Sheldrake argues the clash stems from differing historical customers driving innovation, noting “the first customers of IT were large corporations and governments...IT served the bureaucracy, it was the definition of bureaucracy.”

Thus, digitization elevated standardized legal identities for taxation and service provision — not the nuanced, fluid identities enlivening human relationships outside institutional walls. Much gets lost in translation.

The Path Forward Requires New Perspectives

Progress on digital identity relies on transcending entrenched disciplinary silos to enable genuine dialogue and collective innovation. While compromising on decentralized principles proves difficult, rigidly decentralized systems similarly risk harm without weighing human needs.

Sheldrake argues identity resides in “getting social science and computer science and the humanities, kind of riffing off each other's best strengths, rather than allowing computer science to riff off its worst weaknesses.” No single expertise holds the answers alone.

He continues: “One of my mentors mentioned to me, the beginning of the 90s, the 20th century was all about disciplinary excellence, the 21st century will all be about the interdisciplinary or even the transdisciplinary, trying to fill in the gaps between the disciplinary knowledge.”

To sufficiently fill those gaps, technologies like Holochain aim to bring individual sovereignty and collective capacity. While there isn’t a perfect formula for identity, the next wave of innovators can learn from how it’s translated digitally. 


Trinsic Podcast: Future of ID

Nick Thomas: Finicity’s Journey from Personal Finance App to $985M Acquisition for their Open Banking Platform


Today we talked with Nick Thomas, cofounder of Finicity, which was acquired by Mastercard, where Nick went on to be the EVP of Global Open Finance Innovation. Nick has a fascinating career that has paralleled digital identity for a long time, as a cofounder of Bluetooth and FDX, the major open banking standards body in the US. These are both organizations that brought an industry together around common standards to grow the market far bigger for everyone involved—and we talk lessons learned and how this applies to identity.

Nick shares a deep look into Finicity’s story starting as a consumer application and eventually becoming a major data aggregator. Then we dive into Finicity’s journey disrupting themselves by pushing into open banking. We explore the work Finicity did with verifiable credentials as an issuer, and the big challenge that prevented them from rolling it out. You’ll appreciate all of his takes on the parallels between fintech and IDtech.

To learn more about Nick, you can find him on LinkedIn.

Subscribe to our weekly newsletter for more announcements related to the future of identity at trinsic.id/podcast

Reach out to Riley (@rileyphughes) and Trinsic (@trinsic_id) on Twitter. We’d love to hear from you.


KuppingerCole

Intelligent SIEM Platforms

by Warwick Ashford

This KuppingerCole Leadership Compass provides an overview of the market for Intelligent SIEM (I-SIEM) Platforms that go beyond traditional Security Information and Event Management (SIEM) capabilities to proactively identify threats and automatically suggest mitigation measures to meet the requirements of modern IT environments that are typically on premises as well as being mobile and distributed across multiple cloud environments.

Mar 26, 2024: Navigating Identity Security: Integrating SAP Into an Identity Fabric

SAP customers are shifting away from SAP Identity Management (IDM), which will no longer be supported after 2027, as they are adopting newer SAP solutions. They need to find effective ways of dealing with the technical challenges, while at the same time ensuring secure access control and management of digital identities. Join identity and security experts from KuppingerCole Analysts, One Identity, and SAP as they discuss the significance of and key considerations for integrating and complementing the SAP identity security model across various SAP solutions, including SAP Identity Governance and Administration (IGA), IDM, Business Technology Platform (BTP) Cloud Identity, and S/4 Hana.

Mar 12, 2024: Mastering CIAM: Advanced Techniques for Designing a Resilient CIAM Program

In today's digital landscape, mastering CIAM is crucial for IT professionals. Challenges like imperfect client touchpoints, high abandonment rates, and limited system insights hinder best practices. Join us to explore and overcome these hurdles, enhancing user experiences through strategic CIAM implementation. Explore advanced CIAM techniques in our webinar. From implementing best practices to deploying APIs for user insights, discover strategies to reduce abandonment rates. Unlock techniques for enhancing end-user engagement and revenue, supported by real-world testimonials. Learn to design a resilient CIAM program that meets current challenges and anticipates tomorrow's demands.

PingTalk

Prevent MFA Bombing & Fatigue: A Guide for Businesses | Ping Identity

Zero Trust security models call for the use of multi-factor authentication (MFA) to ensure that only authorized users may access protected IT resources. Many organizations are adopting MFA to add a layer of security for remote workers. Customer-facing organizations are also implementing MFA to mitigate identity-based attacks, such as phishing, and to help quash the rise in account takeover fraud.


For five consecutive years, the leading cause of breaches has been compromised credentials — in other words, the use of stolen passwords. MFA renders the use of stolen credentials futile, as attackers are highly unlikely to have access to a user's other authentication factors, such as a mobile phone.


Microsoft has stated that using MFA may stop 99% of password-related attacks within an enterprise, so an increasing number of enterprises have begun to require MFA before granting account access. That's the good news. 


The bad news is that attackers are now evolving their tactics for bypassing MFA - and are finding some success. 


Many organizations encourage using MFA for threat protection, but relying on users to approve authentication requests manually is now riskier than ever due to tactics such as MFA Bombing.

Tuesday, 06. February 2024

Anonym

Why More Companies are Turning to SaaS for Decentralized Identity Solutions

Considering a decentralized identity (DI) solution? Is this your wish list:

✔️ You want to deploy DI capabilities in the simplest way possible.

✔️ You want to focus on your core business while leaving others to do the detailed technical work of deploying your DI solution.

✔️ You want to deploy your DI solution faster and at a much lower total cost of ownership (TCO) than other deployment models offer.

To have those wishes granted, you need a cloud-based, SaaS deployment model—a complete and managed solution that includes the deployment, management, maintenance, and support of DI services. In short, you need the Anonyome Platform.

Read our whitepaper or continue here for the highlights.

More companies are turning to a SaaS solution for DI  

For software providers and integrators, adding a DI infrastructure to their product lines presents many of the same choices they faced when integrating other system elements: for example, will we host our web servers in our own local data center using our own hardware? Or should we opt for the simplicity of leasing server space from a cloud hosting provider?

Both methods have benefits. Some companies enjoy the peace of mind that comes with controlling their own hardware and software stacks. Others—an increasing majority—find peace of mind comes from knowing that the hardware and software stacks are delivered and maintained through a contract model.

The contract model is often referred to as cloud hosting or the more formal Software as a Service (SaaS) model because the vendor provider deploys, manages, maintains, and provides support for the software.

Calculating the total cost of ownership (TCO) usually determines whether to use a SaaS or on-site deployment model. The key cost drivers for any software implementation are the costs of:

software application
hardware/hosting
staff required to deploy, manage, maintain, and support the application.


Any of those cost drivers that a company can successfully outsource become line items in a budget without the operational complexities imposed by on-site deployments.
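A worked example makes the comparison concrete. The figures below are purely hypothetical and are not Anonyome pricing; the point is that a SaaS subscription replaces upfront capital and most staffing costs with a single predictable line item.

```typescript
// Hypothetical three-year TCO comparison: on-site DI deployment vs. SaaS.
// All figures are illustrative only, not actual pricing.

interface CostModel {
  upfront: number;            // hardware, licences, initial integration
  annualStaff: number;        // deploy/manage/maintain/support headcount
  annualHosting: number;      // data center or cloud spend
  annualSubscription: number; // SaaS fee (zero for on-site)
}

const onSite: CostModel = {
  upfront: 250_000, annualStaff: 300_000,
  annualHosting: 60_000, annualSubscription: 0,
};
const saas: CostModel = {
  upfront: 0, annualStaff: 50_000, // light administration only
  annualHosting: 0, annualSubscription: 120_000,
};

const tco = (m: CostModel, years: number) =>
  m.upfront + years * (m.annualStaff + m.annualHosting + m.annualSubscription);

console.log(tco(onSite, 3)); // 1330000
console.log(tco(saas, 3));   // 510000
```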

The Anonyome Platform embraces the SaaS model

Many companies have calculated the inefficiency, complexity, and additional expenses of the traditional on-site software model and have opted for the simplicity of the SaaS model. In fact, the SaaS model has become the de facto standard for delivering software solutions and the Anonyome Platform was built to embrace this model.


Here are 7 reasons why:

1. No upfront capital costs

With the SaaS model, the customer doesn’t have to bear the upfront capital costs of deploying a software solution or even contracting their own cloud provider. Instead, the SaaS vendor incurs these costs (e.g., hardware/hosting, software application, and IT staff hiring and training). This simplifies startup and operating costs: the customer pays only a monthly or annual subscription, and deployment is immediate.

2. Fast deployment

SaaS delivers an almost instant solution to the customer, which is far more attractive than waiting the many months typically required for deploying and testing in-house solutions. Anonyome provides a fully automated approach to deploying new environments for customers.

3. Vendor-provided maintenance and upgrades

In the on-site solution, a customer is on the hook for updating and applying security patches to the software and operating systems, and for performing compatibility tests with every change. On the other hand, in the SaaS model, the software vendor maintains and upgrades the software, which includes delivering new versions and security patching the hosted system to protect against security vulnerabilities.

4. No staffing requirements

Paying staff to research, design, integrate, test, finetune, and launch any system is a significant cost with in-house solutions—up to 75% of the overall costs, in fact. The SaaS model eliminates the staffing concerns for contracted services.

In new emerging areas such as DI, it may be extremely difficult to find new hires with the right skills. The Anonyome Labs team, on the other hand, is highly experienced and highly trained in DI technologies. Beyond the Anonyome Platform, we support the ongoing development of DI by:

helping to run three DI-focused blockchains
being an active contributor to the DI standards organizations
contributing source code back to the DI open source community.

5. No-fuss application scalability

With in-house solutions, the customer must think about infrastructure issues such as network bandwidth, database and server sizing before deciding whether to ramp up their solution. But customers using the SaaS model don’t have to think about the technical how of scalability, because it’s already done for them. This leaves them free to focus on their business strategy and when to scale for maximum revenue.

6. 24-hour software support

The service level agreement (SLA) is key to a SaaS solution because it outlines the reliable actions and response times a customer can expect should an issue arise. With Anonyome’s global workforce across multiple time zones, staff are always on hand to respond to priority issues, even after hours at the customer’s site.

7. High level security and privacy

Under the SaaS model, the software vendor hires and trains people to maintain top security and privacy capabilities on the customer’s behalf, which is a significant resource saving.

DI is exploding as the new identity management solution

As you’d know, DI is the technology and standards that allow a new form of digital identity that works everywhere, is more trustworthy, and respects privacy. DI puts the user in sole control of the personal information that forms their identity. It’s decentralized because there’s no central authority or single issuing agent.

According to June 2023 figures, more than 62 per cent of US companies plan to incorporate a DI solution into their operations, with 74 per cent likely to do so within a year.


Read: Simple Definitions for Complex Terms in DI and 17 Industries with Viable Use Cases for DI

Centralized identity models are no longer sufficient to safeguard against exploitations from either rogue internal operators or external bad actors. DI has stepped into the breach, because:

DI is safer: Without central storage of credentials, DI presents much less risk of credential theft or other operational issues for the customer as custodian of sensitive data.
DI streamlines processes: Organizations no longer have to repeatedly collect and distribute data, and sign-up and sign-in processes are faster because there’s no need to duplicate efforts and approvals.
DI gives individuals control over data: The customer controls their own data, which reduces risk, unifies user experience, and boosts their satisfaction.
DI reduces the compliance burden: The system eliminates manual error and inconsistent and redundant data distribution and validation.

The Anonyome Platform specializes in DI (and much more)

The Anonyome Platform does three things we know enterprise customers want:

Very quickly deploys DI capabilities in their application environment.
Handles the deployment, management, maintenance, and support of the DI solution.
Allows the customer to focus on high-level strategy instead of the technical intricacies.

Within the DI offering, the Anonyome Platform has five licensable products:

White label mobile DI wallet: A standalone mobile application for iOS and Android, the mobile DI wallet supports DI interactions and storage: creating and storing cryptographic keys, and creating connections for receiving, holding and presenting verifiable credentials. It interacts with verifiable credential issuers and verifiers, and supports both AnonCreds and W3C credential formats.
Mobile native DI wallet SDK: For customers who want to add DI wallet functionality to their own mobile application, this native mobile SDK provides the capability for storage; creating cryptographic keys; establishing connections; and receiving, holding and presenting verifiable credentials.
Verifiable credentials service/SDK/sample apps: This service establishes connections with DI wallets, issues verifiable credentials, and requests and verifies presentation proofs from DI wallets. It supports both AnonCreds and W3C credential formats (a generic sketch of this issue-and-verify flow appears below).
Relay service/SDK: This service introduces an always-on capability for a mobile DI wallet. Implemented as a cloud service, the relay service provides mediator capability for incoming and outgoing verifiable credential and other messages, so that sending and receiving is not limited to when the client implementations of the mobile DI wallet and SDK are online.
Governance framework: This framework empowers DI network participants to enforce rules while staying true to the tenets of DI. Organizations can define and enforce governance rules associated with their ecosystem, including which wallets, issuers, and verifiers are trusted and which credentials can be used and requested.
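
To give a concrete flavor of the issue-and-verify flow the verifiable credentials service handles, here is a minimal, hypothetical sketch in Python. It is not the Anonyome Platform API: the DIDs and claims are illustrative, it uses the open source `cryptography` package, and it skips the canonicalization and proof suites (such as Ed25519Signature2020) a real implementation would use.

```python
# Generic sketch of issuing and verifying a W3C-style verifiable credential.
# NOT the Anonyome Platform API; illustrative names and payloads only.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()   # issuer's signing key
issuer_pub = issuer_key.public_key()        # published for verifiers

credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer123",      # hypothetical DID
    "credentialSubject": {"id": "did:example:holder456", "degree": "BSc"},
}

# Issue: sign a canonical serialization of the credential.
payload = json.dumps(credential, sort_keys=True).encode()
proof = issuer_key.sign(payload)

# Verify: a verifier checks the proof against the issuer's public key.
try:
    issuer_pub.verify(proof, payload)
    print("credential proof is valid")
except InvalidSignature:
    print("credential proof is invalid")
```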

In addition to the current Anonyome Platform DI products, we are also developing:

Enterprise DI wallet: A combination mobile wallet application and cloud wallet, the enterprise DI wallet manages credentials associated with the enterprise. Differing from a personal DI wallet, the enterprise DI wallet is intended for one or more users to use simultaneously and to be inheritable by a future occupant of a particular job or position within an enterprise.

The Anonyome Platform is a SaaS-based, API-first developer platform of individualized services

Also known as the Sudo Platform, the Anonyome Platform offers a set of DI, privacy, and cybersecurity services and SDKs that developers can add as capabilities to their existing or new applications.

When a customer has licensed one or more of the Sudo Platform services, we create a new Sudo Platform deployment instance for that customer, which contains their specific licensed services. One of Sudo Platform’s unique features is that it doesn’t use a shared deployment model (common for SaaS environments). Rather, Sudo Platform deploys environments individually for each customer. This means the customer can customize the environment for their needs, and that customer’s data and processing are completely separate from those of other customers.

Once we have deployed and configured an environment for a customer, we maintain and support it, which includes updating and patching it, and monitoring it 24/7.

Now, head to our whitepaper for specifics on how the Anonyome Platform operates

Ready to start building?

Remember, we built Sudo Platform so you don’t have to, and we make it easy for you to make it your own. It’s a modular, turnkey collection of services including:

DI
Safe and private browsing
Password management
Virtual private network
Compartmentalization
Virtual cards
Open and encrypted telephony and email communications.


You can use one, some, or the entire stack to augment your established offerings with adjacent capabilities that dramatically increase value for your customers.

Sudo Platform includes everything you need to quickly bring new products to market or augment existing products with value-add privacy and identity protection capabilities. It includes:

Developer-focused documentation
APIs
SDK source code via GitHub
Sample applications for test-to-deploy of various capabilities
Brandable white-label apps for quick go-to-market deployments.


If you’d like to leverage the SaaS model and the power of Anonyome Platform, reach out today. We’re excited to see what we could build together.

Contact us today

The post Why More Companies are Turning to SaaS for Decentralized Identity Solutions appeared first on Anonyome Labs.


1Kosmos BlockID

Strengthening Cybersecurity in the Face of Rising Threats


Recent reports from the FBI have shed light on the escalating cyber espionage activities orchestrated by state-sponsored actors, particularly those emanating from China. Case in point: the emergence of the Volt Typhoon botnet, as highlighted in The Guardian and AP News, underscores the critical need to safeguard sensitive information and ensure the integrity of digital identities, now more than ever.

Understanding the Threat Landscape

The FBI’s warnings regarding the Volt Typhoon botnet and China’s espionage activities serve as a stark reminder of the sophisticated tactics employed by cyber adversaries. Coincidentally, my last blog covered Midnight Blizzard, the Russian nation-state attack on Microsoft. These threats pose significant challenges to national security and underscore the vulnerabilities inherent in identity verification and data protection. This also means the threat landscape is expanding.

Luckily, the Threat was Prevented

In this instance, the threat was thwarted before an incident occurred. In the case of the Midnight Blizzard attack on Microsoft, we were not so lucky. The warnings come as a shot across the collective bow for all organizations and serve as a reminder that this will not be the last time a foreign entity or any other hacker will target the vulnerable.

So, what are organizations to do to keep themselves out of the headlines? Not a huge surprise, but first we must all keep up with applying security patches and up-to-date versions of the OS and application layers. To restate the obvious, good system management hygiene is a must.

But the uncomfortable truth is that bad actors log in with stolen credentials as often as, or more often than, they break in with a sophisticated hack. Many organizations are minimizing their dependence on passwords, but they are finding that the big challenge is addressing the many authentication use cases.

Windows Hello for Business conveniently supports passwordless access within the Microsoft platform, but try getting this to work with macOS, Linux or even your VPN. Domain controllers and virtual machines, for example, also continue to depend on passwords. To the delight of hackers, the resulting “passwordless strategy” resembles Swiss cheese more than it does a Swiss Army knife supporting the broad range of needs enterprise-wide.

So, on the path to eliminating passwords, it’s the diversity of information technology that needs to be managed, and for good reason: most enterprise IT environments, like security standards, evolved over decades. There should be no expectation that every way of authenticating into this morass can be fixed with some hand waving, let alone by a black box that effortlessly solves all unanticipated authentication use cases.

Identity, it turns out, isn’t sufficiently managed with a password, an SMS code or the knowledge of a mother’s maiden name. This is not new … a long list of three-letter acronyms including IGA, SSO, PAM, and IAM all recognize identity as a corporate asset that needs to be managed and governed. None, however, seems able to keep up with the unrelenting attacks using social engineering and pirated account credentials. You just need to read the headlines to know this.

Closing the Open Door

At 1Kosmos, we’ve always approached passwordless MFA as a feature, but we’ve viewed the root cause authentication issue as a business challenge revolving around identity. We solved that by performing identity verification and then generating as an artifact a non-phishable passwordless MFA credential with liveness detection.

But as our passwordless journey continued, something interesting happened. We found that placing identity outside of the application platforms and providing for various levels of identity assurance tuned to the risk of the digital interaction helped us rapidly evolve our identity and authentication platform to address the constant stream of use cases that surfaced in just about every customer deployment.

It turns out that not everybody wants an app, not everybody owns a mobile device, and some work environments outright prohibit the use of mobile handsets. By offering identity verification and authentication in a single privacy-by-design platform, we’ve provided ourselves and our customers an elegant way to systematically accommodate the unexpected and, in a sense, hardest-to-solve authentication use cases.

This approach to identity modernization quickly augments core identity and access management to mitigate risk, reduce technical debt, and enhance access controls, effectively closing the open door that many hackers walk through unchallenged. By way of example, it’s why we’ve been able to rapidly release app-less authentication, browser-based identity verification journeys and most recently BlockID 1Key, a biometric security key.

At 1Kosmos, we believe that by integrating identity proofing, credential verification, and strong authentication, we equip organizations with the tools and insights needed to combat identity-based attacks effectively – and in ways not possible before.

Through a collaborative and identity-centric approach to security, we help organizations bolster their resilience and navigate through this digital storm unleashed by sophisticated attackers like those behind Volt Typhoon.

Prepare Now for What May Come

Given the advance warnings from the FBI regarding the Volt Typhoon botnet and China’s espionage, we were lucky this time. But this sequence of events telegraphs the dangers that live among us and should serve as a battle cry for heightened security measures, starting with identity verification at the first and every login … for customers, workers, and citizens.

Embracing innovative technologies that in turn enable rapid business innovation … it’s the path forward to reduce risk and deliver order-of-magnitude business improvement. It’s the logical path forward for organizations that thrive on the speed of innovation and want to de-risk their business plan by modernizing and simplifying identity and access management. It’s this bright future we at 1Kosmos envision for all organizations navigating digital transformation and the delivery of digital services.

The post Strengthening Cybersecurity in the Face of Rising Threats appeared first on 1Kosmos.


Microsoft Entra (Azure AD) Blog

Auto Rollout of Conditional Access Policies in Microsoft Entra ID


In November 2023 at Microsoft Ignite, we announced Microsoft-managed policies and the auto-rollout of multifactor authentication (MFA)-related Conditional Access policies in customer tenants. Since then, we’ve rolled out report-only policies for over 500,000 tenants. These policies are part of our Secure Future Initiative, which includes key engineering advances to improve security for customers against cyberthreats that we anticipate will increase over time. 


This follow-up blog will dive deeper into these policies to provide you with a comprehensive understanding of what they entail and how they function.


Multifactor authentication for admins accessing Microsoft admin portals


Admin accounts with elevated privileges are more likely to be attacked, so enforcing MFA for these roles protects these privileged administrative functions. This policy covers 14 admin roles that we consider to be highly privileged, requiring administrators to perform multifactor authentication when signing into Microsoft admin portals. This policy targets Microsoft Entra ID P1 and P2 tenants, where security defaults aren't enabled.


Multifactor authentication for per-user multifactor authentication users


Per-user MFA is when users are enabled individually and are required to perform multifactor authentication each time they sign in (with some exceptions, such as when they sign in from trusted IP addresses or when the remember MFA on trusted devices feature is turned on). For customers who are licensed for Entra ID P1, Conditional Access offers a better admin experience with many additional features, including user group and application targeting, more conditions such as risk- and device-based, integration with authentication strengths, session controls and report-only mode. This can help you be more targeted in requiring MFA, lowering end user friction while maintaining security posture.


This policy covers users with per-user MFA. These users are targeted by Conditional Access and are now required to perform multifactor authentication for all cloud apps. It helps organizations transition to Conditional Access seamlessly, ensuring no disruption to end-user experiences while maintaining a high level of security.


This policy targets licensed users with Entra ID P1 and P2, where the security defaults policy isn't enabled and there are fewer than 500 per-user MFA enabled/enforced users. There will be no change to the end-user experience due to this policy.


Multifactor authentication and reauthentication for risky sign-ins


This policy will help your organization achieve the Optimal level for Risk Assessments in the NIST Zero Trust Maturity Model because it provides a key layer of added security assurance that triggers only when we detect high-risk sign-ins. “High-risk sign-in” means there is a very high probability that a given authentication request isn't coming from the authorized identity owner and could indicate a brute force, password spray, or token replay attack. By dynamically responding to sign-in risk, this policy disrupts active attacks in real time while remaining invisible to most users, particularly those who don’t have high sign-in risk. When Identity Protection detects an attack, your users will be prompted to self-remediate with MFA and reauthenticate to Entra ID, which will reset the compromised session.


Learn more about sign-in risk


This policy covers all users in Entra ID P2 tenants, where security defaults aren't enabled, all active users are already registered for MFA, and there are enough licenses for each user. As with all policies, ensure you exclude any break-glass or service accounts to avoid locking yourself out.


Microsoft-managed Conditional Access policies have been created in all eligible tenants in Report-only mode. These policies are suggestions from Microsoft that organizations can adapt and use for their own environment. Administrators can view and review these policies in the Conditional Access policies blade. To enhance the policies, administrators are encouraged to add customizations such as excluding emergency accounts and service accounts. Once ready, the policies can be moved to the ON state. For additional customization needs, administrators have the flexibility to clone the policies and make further adjustments. 
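
For administrators who prefer to script this review, the sketch below shows the general Microsoft Graph pattern for listing Conditional Access policies in report-only mode and switching one on. It is a sketch, not official guidance: it assumes an access token with Policy.ReadWrite.ConditionalAccess consent and the Python `requests` package, and you should confirm against the Graph documentation which fields of a Microsoft-managed policy are editable in your tenant.

```python
# Sketch: inspect report-only Conditional Access policies via Microsoft Graph
# and move one to the ON state. Assumes a token with
# Policy.ReadWrite.ConditionalAccess consent; pip install requests.
import requests

TOKEN = "<access-token>"  # obtain via your usual auth flow (e.g. MSAL)
BASE = "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Report-only policies carry the state 'enabledForReportingButNotEnforced'.
policies = requests.get(BASE, headers=headers).json().get("value", [])
report_only = [p for p in policies
               if p["state"] == "enabledForReportingButNotEnforced"]
for p in report_only:
    print(p["id"], p["displayName"])

# After reviewing and excluding break-glass/service accounts, enable one:
if report_only:
    policy_id = report_only[0]["id"]
    resp = requests.patch(f"{BASE}/{policy_id}", headers=headers,
                          json={"state": "enabled"})
    resp.raise_for_status()
```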


Call to Action


Don't wait – take action now. Enable the Microsoft-managed Conditional Access policies, or customize them to your organizational needs. Your proactive approach to implementing multifactor authentication policies is crucial in fortifying your organization against evolving security threats. To learn more about how to secure your resources, visit our Microsoft-managed policies documentation.


Nitika Gupta  

Principal Group Product Manager, Microsoft 

LinkedIn



Learn more about Microsoft Entra: ​ 

See recent Microsoft Entra blogs
Dive into Microsoft Entra technical documentation
Learn more at Azure Active Directory (Azure AD) rename to Microsoft Entra ID
Join the conversation on the Microsoft Entra discussion space and Twitter
Learn more about Microsoft Security

Entrust

The Indispensable Role of Trusted Platform Modules in Distributed ID and Payment Card Printers


In an era where data security and privacy are paramount, the use of Trusted Platform Modules (TPMs) in distributed ID and payment card printers is not just a recommendation but a necessity. These small hardware-based security modules serve as the sentinels guarding sensitive information in a digital age.

Security Significance of Trusted Platform Modules

At its core, a Trusted Platform Module is a dedicated microcontroller chip integrated into a device’s motherboard or system-on-a-chip. Its primary function is to secure and safeguard critical data, cryptographic keys, and system integrity. TPMs adhere to established industry standards and offer a range of cryptographic operations. They generate, store, and protect cryptographic keys and execute operations like digital signatures and card issuance. One of the most critical aspects of a TPM is its ability to provide Secure Boot processes, ensuring that the device’s firmware and operating system remain unaltered and authentic. This fundamental feature prevents attackers from compromising the system at its root, thereby establishing trust from the ground up.
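
The Secure Boot role rests on a simple mechanism: the TPM “extends” a Platform Configuration Register (PCR) with a hash of each boot component, so the final register value depends on the entire chain and cannot be rewound. A toy model of the register arithmetic, in Python (a real TPM performs this in hardware and can sign the result for attestation):

```python
# Toy model of a TPM PCR "extend": each boot component is measured into a
# register whose final value attests to the whole boot chain.
# Illustration only; real TPMs do this in silicon.
import hashlib

def pcr_extend(pcr: bytes, component: bytes) -> bytes:
    # PCR_new = SHA-256(PCR_old || SHA-256(component))
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

pcr = bytes(32)  # PCRs start zeroed at power-on
for component in [b"firmware v1.2", b"bootloader", b"os kernel"]:
    pcr = pcr_extend(pcr, component)

print(pcr.hex())  # tampering with any component changes the final value
```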

Government institutions worldwide recognize the necessity of Trusted Platform Modules in safeguarding sensitive data and critical infrastructure. In the U.S., all devices connecting to government networks are mandated to incorporate TPMs as part of their security standards. These modules play a crucial role in protecting classified information, securing communications, and mitigating potential breaches. Government systems equipped with TPMs are not only more secure but also enable remote attestation, meaning that their status can be verified and authenticated by trusted entities. This is essential for ensuring that the device’s software and firmware have not been tampered with, guaranteeing the utmost security.

Imperative for ID and Payment Card Printers

Without a TPM, devices such as ID and payment card printers become susceptible to rising security threats. Hackers, malicious software, and unauthorized users can exploit vulnerabilities in these systems, potentially leading to devastating consequences:

Data Breaches: Sensitive personal and financial information, including ID and payment card data, can be compromised, resulting in identity theft and financial fraud.
Counterfeit Cards: Fraudsters may take advantage of unsecured printers to produce counterfeit identification and payment cards, posing a severe risk to both individuals and institutions.
System Tampering: The absence of Secure Boot processes leaves systems open to firmware attacks, making them more susceptible to malware and unauthorized access.

The inclusion of Trusted Platform Modules in distributed ID and payment card printers is a necessity. Government institutions understand the imperative need for TPMs in safeguarding sensitive information, and the same principle applies to banks, universities, and enterprises. The absence of TPMs leaves these entities exposed to security vulnerabilities that could have severe consequences.

By integrating TPMs into their systems, organizations can fortify their defenses against data breaches, counterfeiting, and system tampering. These modules, with their technical prowess and cryptographic capabilities, offer a solid foundation for robust security in the digital age, ensuring the trustworthiness of the devices that shape our lives. Entrust is the only provider globally using Trusted Platform Modules across our latest line of issuance systems. Learn how you can lay the foundation for seamless, secure payment card and ID card experiences for your end-users and staff.

The post The Indispensable Role of Trusted Platform Modules in Distributed ID and Payment Card Printers appeared first on Entrust Blog.


Ransomware and Real Estate: An Eternal Spring of Personally Identifiable Information


In recent months, ransomware attacks have gained attention and become a top concern across multiple industries. The threat has affected many well-known brands, ranging from cable providers and aircraft manufacturers to mortgage servicers and title insurance companies. Ransomware is a type of malware used to infect computers and encrypt data. Once infected, the ransomware attempts to spread to connected systems. This can include computers accessible on the network, shared drives, and backups. The goal of the attack is to render data and applications unusable for the victim until a ransom is paid.

According to Corvus Insurance, ransomware leak site victims reached a record high in November: a 39% increase from the prior month and a 110% increase year-over-year. The report suggests the uptick was largely due to LockBit and the CVE-2023-4966 Citrix Bleed vulnerability. The exploit allows threat actors to circumvent password requirements and multi-factor authentication (MFA) to hijack legitimate user sessions for harvesting credentials and accessing data. On November 21, 2023, the Cybersecurity & Infrastructure Security Agency (CISA) also issued this advisory.

For financial services organizations, ransomware attacks can be particularly damaging. In August 2023, real estate brokers and agents in the U.S. suffered from an attack that took 23 multiple listing service (MLS) systems offline. MLS systems are private databases created, maintained, and funded by real estate professionals to help clients buy and sell properties. In November, one of the largest mortgage servicers in the U.S. experienced an outage impacting millions, and homeowners were unable to submit mortgage payments. Weeks later, two large title insurance providers in the U.S. and UK were attacked, and closing transactions were delayed. In early January, one of the largest non-bank lenders was also attacked.

What are the consequences of a ransomware attack? Depending on the industry, the answer will vary. To understand the scope of impact, consider two sample customer transactions: an iced latte versus a residential piece of real estate. While the first is likely the smallest purchase a customer may make in a single day, the other may be the largest single purchase they make in their lifetime. If the ice cubes used for making the latte represent data, a real estate financing transaction is an iceberg. The volume of personally identifiable information (PII) involved with buying and selling a home and making monthly loan payments makes real estate, mortgage, and title firms highly sought-after targets for would-be attackers.

The Ice Cube

When a national coffee chain is subjected to a ransomware attack, customers are likely to face temporary inconveniences. Systems may go offline. Point-of-sale systems for credit cards may be unavailable until the recovery process is complete. Perhaps loyalty rewards information or some confidential data is exfiltrated. If it is determined that a material breach occurred, the compliance team will act in accordance with state and federal breach notification laws. The impacted organization will notify, within a required timeframe upon discovery of the incident, the attorney general for each state in which affected customers reside. Impacted customers will subsequently receive a notification. If PII is stolen or lost for certain customers, those individuals may be offered free credit monitoring for a designated period.

The Iceberg

For a mortgage lender or title agency experiencing a ransomware attack, the incident response process is similar. However, the downstream impact can have significantly greater consequences. For those in the process of buying or selling a home, closings may be delayed. Outages may prevent appraisers from uploading reports. Title companies may be unable to disburse funds from escrow. Seller proceeds, mortgage payoff amounts, real estate agent commissions, and other payments will be delayed until systems are fully restored. The homeowner may be unable to make their monthly loan payment online. A mortgage servicer may be unable to receive borrower payments.

For most borrowers in the U.S., the mortgage servicer also facilitates the property tax and insurance payments on behalf of the homeowner. For these escrowed loans, the impact of a ransomware incident could disrupt real estate tax payments to counties and payments toward homeowner insurance policy premiums. The extent to which stolen data is used for other purposes beyond extortion, such as identity theft or credit card fraud, may also remain largely undetermined for an extended period.

The real estate mortgage and financing ecosystem depends on moving large sums of money in a timely manner between multiple transaction participants. Whether delayed mortgage payments or delayed real estate agent commissions, the immediate impact of a ransomware incident on a financial services organization can be substantial. A sustained outage could affect loan amortization schedules, interest calculations, and principal balances. To facilitate funding new mortgages, lenders regularly sell loans to government-sponsored enterprises (GSEs) such as Fannie Mae and Freddie Mac. In turn, these GSEs offer mortgage-backed securities (MBS), which consist of a group of mortgages organized to pay interest like a mortgage bond. The mission for these entities is to help provide liquidity and stability to the U.S. housing market. Listed as one of 16 critical infrastructure sectors by CISA, financial services firms play an integral role in ensuring economic stability.

From making their smallest purchase in a single day to the largest single purchase of their lifetime, a customer’s expectations regarding data protection will vary. Leaking a customer’s latte purchase history is not a desired outcome; the theft of data containing a borrower’s credit history, income, bank account number, and transcripts of past tax returns is exponentially worse. A substandard approach to securing transactions for this segment of the industry can create dangerously high levels of risk. The real estate financing industry plays a vital role in helping individuals achieve home ownership, and customer data protection efforts should be considered accordingly.

Best Practices for Preparation, Prevention, and Mitigation

CISA’s #StopRansomware Guide provides ransomware and data extortion preparation, prevention, and mitigation best practices. Preparation best practices focus on backups, incident response plans, and implementing a Zero Trust architecture. For prevention and mitigation best practices, CISA groups by initial access vectors, such as phishing and compromised credentials. Highlights from CISA’s recommendations include:

Maintain Frequent Backups: Encrypt backups of critical data and store them on separate devices inaccessible from a network (a minimal encryption sketch follows this list).
Zero Trust Architecture (ZTA): Organizations should consider implementing a Zero Trust architecture to prevent unauthorized access to data and services.
Update and Patch Systems: Ensure applications and operating systems have the latest patches and updates. Updates should be obtained directly from vendor sites rather than by clicking on email links. Enable automatic software updates and do not use end-of-life software.
Email: If you are unsure whether an email is legitimate, CISA recommends verifying the email’s legitimacy by contacting the sender directly.
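
As a minimal illustration of the first recommendation, the sketch below encrypts a backup archive with authenticated encryption before it is moved offline. It assumes the Python `cryptography` package and a hypothetical backup.tar file; key storage, rotation, and distribution are deliberately out of scope.

```python
# Minimal sketch: encrypt a backup with AES-256-GCM before offline storage.
# Assumes `pip install cryptography` and a local backup.tar (hypothetical).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # store the key offline, separately
nonce = os.urandom(12)                     # must be unique per encryption

with open("backup.tar", "rb") as f:
    plaintext = f.read()

# The associated data (here, a label) is authenticated but not encrypted.
ciphertext = AESGCM(key).encrypt(nonce, plaintext, b"backup-2024-02")

with open("backup.tar.enc", "wb") as f:
    f.write(nonce + ciphertext)            # the nonce is not secret
```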

Secure Financial Futures with Entrust

From encryption to identity and access management, Entrust offers a range of solutions financial services firms are leveraging as part of their overall risk-mitigation best practices.

Entrust nShield Hardware Security Modules (HSMs) provide data security across devices, processes, platforms, and environments. Application-level encryption can be policy-based and geared to specific data protection requirements. Entrust nShield HSMs are available in a variety of hardware configurations as well as an nShield as a Service offering.
Entrust KeyControl, based on a strong root of trust delivered by nShield HSMs on-premises or as a service, ensures the secure and efficient management of cryptographic assets. An enterprise platform offering centralized visibility of keys and secrets, KeyControl facilitates decentralized vaults for managing keys and secrets throughout their lifecycle for a wide range of use cases, including enterprise backup and recovery.
Secure/Multipurpose Internet Mail Extensions (S/MIME) allows users to encrypt and send documents securely in real time without the need for zip files or passwords. Real estate agents, title companies, and lenders can prove where and when the message originated, as well as demonstrate that documents have not been tampered with in delivery. By retroactively protecting email, Entrust S/MIME certificates also help organizations mitigate the risk of data breaches.
Phishing-Resistant Identities ensure both the user and device are verified and authenticated using digital certificates to help protect against business email compromise (BEC) and account takeover (ATO) attacks. Compromised credentials are a common initial access vector in ransomware attacks. For more information, see the 2023 Gartner® Magic Quadrant™ for Access Management recognizing Entrust as a Challenger.
Verified Mark Certificates (VMCs) are digital certificates that enable organizations to display their registered trademark logo in the avatar slot alongside outgoing emails. A common delivery method for ransomware is phishing. Real estate brokerages, lenders, and title agencies leveraging VMCs can prove to their transaction participants that emails received are indeed from the sending organization and not spoofed emails. VMCs also help reduce the risk of wire transfer fraud and seller impersonation fraud.
Post-Quantum Readiness – Long-lived data, such as property ownership history and mortgage loan servicing information, is at greater risk from the “Harvest Now, Decrypt Later” threat. Within the decade, quantum computing capabilities powerful enough to break public key encryption protocols are expected. Organizations involved in the real estate financing industry should take steps now to protect sensitive data, applications, and transactions.

Contact us to learn how Entrust can help your organization protect data and mitigate the risk of ransomware.

The post Ransomware and Real Estate: An Eternal Spring of Personally Identifiable Information appeared first on Entrust Blog.


Extrimian

Identity and Digital Signature in Web 3.0

How are digital identity and digital signatures connected in Web 3.0?

This article explores how digital and electronic signatures are fundamental in the age of digital identity, with a special focus on decentralized technologies, advanced cryptography and Self-Sovereign Identity in the Web 3.0 world.

Disclaimer:

In this article, we will use the terms “decentralized identity” and “SSI” (Self-Sovereign Identity) interchangeably. It should be noted that the views and opinions expressed are based on our experience in the field of digital identity, but should not be interpreted as legal definitions.
What is an electronic signature?

An electronic signature, in essence, is your consent in the digital world. It can be an image of your signature, a PIN code, or even a click of a button. It is vital in managing digital identities and preventing cyber-attacks.

Digital signature: a specialization of electronic signature

The digital signature, a specific subset of the electronic signature, relies on asymmetric cryptography. This technology uniquely links each signature to the signer and the document, providing reliable verification of authenticity. A clear example is Verifiable Credentials (VCs).
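
A minimal sketch of that property, assuming Python's `cryptography` package: the private key produces a signature bound to the exact document bytes, and any verifier holding the public key can detect a single changed character.

```python
# Sketch: an Ed25519 digital signature binds the signer to the document.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # held only by the signer
public_key = private_key.public_key()       # shared with any verifier

document = b"I agree to the terms of this contract."
signature = private_key.sign(document)

public_key.verify(signature, document)      # passes: authentic and intact

try:
    public_key.verify(signature, b"I agree to OTHER terms.")
except InvalidSignature:
    print("tampered document detected")     # verification fails
```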

Digital signature and legal entities in different countries

Let’s look at two examples on this subject:

In Argentina: Digital signatures are restricted to individuals, which presents unique challenges for digital identities.
In Mexico: Legal entities can use digital signatures, advancing the management of digital identities.

Private key management and security

The security of a digital signature is based on proper management of private keys. The responsibility for protecting these keys varies by country and type of entity, and is crucial for online security.

Following the previous example, in countries where the digital signature is personal, as in Argentina, the responsibility for protecting the private key falls on the individual, which is fundamental for the prevention of cyber-attacks. In Mexico, on the other hand, the entity is responsible for its protection.
Identity in the legal and commercial sphere

Identity, both for individuals and entities, is built through actions, reputation and other attributes accumulated over time. For legal entities, this includes their track record, credentials and legal compliance. This identity is crucial in business and legal interactions and operations, and is at the core of what defines Self-Sovereign Identity (SSI) in the Web 3.0 era.

Digital signature in Web3

Web3, with its focus on decentralized identity, proposes a new paradigm in the management of digital signatures. This decentralized approach, exemplified by protocols such as QuarkID, significantly improves interoperability between different systems and entities, allowing greater flexibility and control by users over their digital identities.

In turn, these technologies enable selective disclosure, which allows users to share only the information that is essential for a specific transaction, protecting their privacy.

Simultaneously, the concept of non-repudiation ensures that users’ actions, backed by the immutability of the blockchain and advanced digital signatures, are indisputable, increasing the security and reliability of digital interactions.
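
To make selective disclosure concrete, here is a toy sketch of the salted-hash pattern used by schemes such as SD-JWT: the credential carries only digests of the claims, and the holder reveals the salt and value for just the claims a verifier needs. Standard library only; a real scheme additionally signs the digests so they are non-repudiable.

```python
# Toy sketch of salted-hash selective disclosure (the pattern behind SD-JWT).
import hashlib, json, os

claims = {"name": "Ana", "age_over_18": True, "address": "123 Main St"}

# Issuer: salt and hash every claim; the signed credential holds only digests.
salts = {k: os.urandom(16).hex() for k in claims}

def digest(salt: str, key: str, value) -> str:
    return hashlib.sha256(json.dumps([salt, key, value]).encode()).hexdigest()

credential_digests = {k: digest(salts[k], k, v) for k, v in claims.items()}

# Holder: disclose only "age_over_18", keeping the other claims private.
disclosed = {"age_over_18": (salts["age_over_18"], claims["age_over_18"])}

# Verifier: recompute the digest and compare it to the credential.
salt, value = disclosed["age_over_18"]
assert digest(salt, "age_over_18", value) == credential_digests["age_over_18"]
print("age_over_18 verified; name and address stay hidden")
```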

Conclusion

Digital and electronic signatures are essential in the era of Web 3.0 and digital identity. Understanding these concepts is crucial for their effective implementation. To learn more about how Extrimian integrates digital identity and cybersecurity solutions, visit Extrimian.

Glossary

Digital Identity:

Electronic representation of a person or entity in the digital world, composed of data and information that define its characteristics and credentials.

Electronic Signature:

Electronic method to indicate consent or approval in digital documents or online transactions. It can include anything from a scanned image of a handwritten signature to more sophisticated methods.

Digital Signature:

Advanced type of electronic signature based on asymmetric cryptography, which ensures the authenticity and integrity of a digital document.

Asymmetric cryptography:

Encryption system that uses a pair of keys, one public and one private, for data encryption and decryption.

Verifiable Credentials (VCs):

Digital credentials that can be verified electronically, such as a digital ID or a university degree.

Private Key Management:

The process of managing and protecting the private cryptographic keys used in digital signature and data encryption.

Self-Sovereign Identity (SSI):

Concept of digital identity that allows individuals to control and manage their own identity data autonomously.

Selective Disclosure:

Technique that allows users to selectively disclose certain information while keeping other personal details private.

Non-Repudiation:

Property that ensures that once a digital action (such as a digital signature) has been performed, the author cannot deny its authorship.

The post Identity and Digital Signature in Web 3.0 first appeared on Extrimian.


CADENA Project: an IDB and Extrimian solution

Innovation in Border Controls between IDB and Extrimian

Have you ever wondered how technology can transform customs? Well, here’s an answer: the CADENA Project, an IDB and Extrimian solution.

The Inter-American Development Bank (IDB), in collaboration with Extrimian, is doing exactly that. In this post, let’s dive into this fascinating world of innovation, security and cryptography.

What is the CADENA Project?

The CADENA Project seeks to revolutionize customs processes towards greater efficiency and transparency. And how does it do that? Through decentralized technology, facilitating trade and strengthening trust in global supply chains – it’s an innovative answer to the challenges of international trade!

Extrimian’s Role in the CADENA Project

Extrimian collaborates closely with the IDB and together they have recently made significant progress:

Creation and management of Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs); a DID sketch follows this list.
Development of an intuitive user interface to manage Authorized Economic Operators (AEOs) registries.
Implementation of different roles and permissions, ensuring proper access and configuration.
Automatic generation of cryptographic keys and secure synchronization between customs offices.
Efficient management of Mutual Recognition Agreements (MRAs).
Simplified processes for agile deployment, despite complex technological infrastructure.
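
As a hedged illustration of the first item in the list above, the sketch below derives a did:key-style Decentralized Identifier from a freshly generated Ed25519 public key (multicodec prefix 0xed 0x01, base58btc-encoded with the multibase "z" prefix). This shows the generic did:key encoding, not CADENA's actual implementation, and assumes the `cryptography` and `base58` packages.

```python
# Sketch: derive a did:key-style DID from an Ed25519 public key.
# Generic did:key encoding only; not CADENA's implementation.
import base58
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()
public_bytes = key.public_key().public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)

# did:key = "did:key:z" + base58btc(multicodec 0xed 0x01 || raw public key)
multicodec = bytes([0xED, 0x01]) + public_bytes
did = "did:key:z" + base58.b58encode(multicodec).decode()
print(did)  # e.g. did:key:z6Mk...
```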

Benefits and Potential Impact

The implementation of CADENA can be a game changer in international trade. It will facilitate the adherence of more customs to the network, promoting fairer, safer and more transparent trade.

Next Steps and Validation

The IDB and Extrimian are not stopping here. Next steps include:

Testing in real environments and gathering feedback.
Implementation for information exchange through Verifiable Credentials (VCs).
Ongoing pilot testing with selected customs and authorized economic operators.

Conclusion

The CADENA Project is a pioneering initiative that marks a turning point for the customs administrations of Latin America and the Caribbean. With the IDB and Extrimian, we are witnessing a true revolution in international trade. Stay tuned for more news!

Glossary

CADENA Project:

Initiative promoted by the Inter-American Development Bank (IDB) and Extrimian, designed to transform customs administrations in Latin America and the Caribbean through decentralized technologies.

Inter-American Development Bank (IDB):

International financial organization that supports projects in Latin America and the Caribbean, focusing on economic, social and institutional development.

Extrimian:

Technology company that collaborates with the IDB in the CADENA Project, specialized in cryptography and digital identity solutions.

Decentralized Technology:

Technology that operates in a distributed network, rather than being centralized in a single location or server. This can include blockchain and other forms of distributed technology.

Decentralized Identifiers (DIDs):

Digital identification systems that enable secure, decentralized online identity verification.

Verifiable Credentials (VCs):

Digital documents that can be used to verify the authenticity of an identity or related information in a secure and reliable manner.

Authorized Economic Operators (AEOs):

Commercial entities that have been certified by national or international authorities as secure and reliable in the supply chain.

Cryptographic Keys:

Tools used in cryptography to secure communication and information, allowing data encryption and decryption.

Mutual Recognition Agreements (MRAs):

Agreements between countries or regions that facilitate trade by recognizing each other’s security and conformity assessments.

Pilot Testing:

Experimental tests conducted to evaluate the feasibility, performance and effectiveness of a new project or system prior to its full implementation.

The post CADENA Project: an IDB and Extrimian solution first appeared on Extrimian.


Civic

Welcome, Base Builders!


Today, we’re thrilled to offer Civic Pass to Base builders, who have come to expect secure, low-cost, builder-friendly tools ready to bring the next billion users on-chain. We’ve been hard at work perfecting the Civic Pass user experience across chains and wallets, and we’re delighted to introduce Civic Pass for Base. Has your Web3 app […]

The post Welcome, Base Builders! appeared first on Civic Technologies, Inc..


This week in identity

E45 - Okta Layoffs / Tech Downturn / Market Consolidation

This week Simon and David take a look at the recent announcement that Okta are laying off 400 staff globally. Is this part of a broader tech slow down? They discuss some of the trends from 2023 with respect to staff attrition and the impact that has had. With funding still high for IAM and cyber what does 2024 have in store?



BlueSky

Join Bluesky Today (Bye, Invites!)

Sign up for Bluesky! No invite code required.

Bluesky is building an open social network where anyone can contribute, while still providing an easy-to-use experience for users. For the past year, we used invite codes to help us manage growth while we built features like moderation tooling, custom feeds, and more. Now, we’re ready for anyone to join.

Sign up for Bluesky

Join more than three million people discussing news, sharing art, and just posting.

What is Bluesky?

To mark the occasion, we teamed up with Davis Bickford, an artist on the network, to share why we’re excited about Bluesky.

To learn more about Bluesky and how to get started, read our user FAQ here.

And if deep dives are more your style, we worked with Martin Kleppmann, author of Designing Data-Intensive Applications and technical advisor to Bluesky, to write a paper that goes into more detail about the technical underpinnings of Bluesky.

Looking Forward

We’ve been working on more features that put you in control of your social media experience. Here’s what you can expect to see soon:

Stackable Moderation Services

Safety is core to social media. Bluesky moderates the app according to our community guidelines, and our vision for composable moderation allows users to stack more moderation services together, such as subscribable moderation lists.

In the coming weeks, we’re excited to release labeling services, which will allow users to stack more options on top of their existing moderation preferences. This will allow other organizations and people to run their own moderation services that can account for industry-specific knowledge or specific cultural norms, among other preferences.

One potential use case for labeling is fact-checking. For example, a fact-checking organization can run a labeling service and mark posts as “partially false,” “misleading,” or other categories. Then, users who trust this organization can subscribe to their labels. As the user scrolls through posts in the app, any labels that the fact-checking organization publishes will be visible on the post itself. This helps in the effective distribution of the fact-check and keeps users better informed.
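
As a rough sketch of how such a label could be represented and applied client-side (a deliberately simplified shape; the authoritative schema is atproto's com.atproto.label.defs, and the DIDs and URIs here are illustrative):

```python
# Simplified sketch of moderation labels and client-side filtering.
# The real schema is atproto's com.atproto.label.defs; names are illustrative.
from dataclasses import dataclass

@dataclass
class Label:
    src: str  # DID of the labeling service that issued the label
    uri: str  # the post (record) being labeled
    val: str  # label value, e.g. "misleading"

labels = [
    Label(src="did:example:factchecker",
          uri="at://alice.example/post/1", val="misleading"),
]

subscribed_labelers = {"did:example:factchecker"}  # services the user trusts

def labels_for(post_uri: str) -> list[str]:
    """Label values to show on a post, from subscribed services only."""
    return [l.val for l in labels
            if l.uri == post_uri and l.src in subscribed_labelers]

print(labels_for("at://alice.example/post/1"))  # ['misleading']
```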

We’ll be sharing more in the coming weeks. In the meantime, if you’re interested in partnering with Bluesky and setting up a labeling service, contact us at partnerships@blueskyweb.xyz.

An Open Social Network

When you log in to Bluesky, it might look and feel familiar — the user experience should be straightforward. But under the hood, we’ve designed the app in a way that puts control back in your hands. Here, your experience online isn’t controlled by a single company. Whether it's your timeline or content filters, on Bluesky, you can easily customize your social experience.

This month, we’ll be rolling out an experimental early version of “federation,” or the feature that makes the network so open and customizable. On Bluesky, you’ll have the freedom to choose (and the right to leave) instead of being held to the whims of private companies or black box algorithms. And wherever you go, your friends and relationships can go with you.

For developers: We’ve already federated the network among multiple servers internally, and later this month, you’ll be able to self-host a server that connects to the main production network. You’ll be part of the first batch of servers that federate with the network, so expect to experiment alongside us! We’ll share more information on how to join the production network with your own server soon. In the meantime, you can test out your server set up via our developer sandbox. Find instructions here.

Saturday, 03. February 2024

Radiant Logic

Reducing IAM Technical Debt with an Identity Data Fabric Approach 

Gartner lists 5 key challenges that result from IAM technical debt; get our four-step approach to a solution based on our Identity Data Fabric. The post Reducing IAM Technical Debt with an Identity Data Fabric Approach appeared first on Radiant Logic.

Indicio

Indicio Community Meetup: looking at digital travel in 2024

The first 2024 Indicio Identity Community Meetup saw Michael Zuriek of SITA and Heather Dahl of Indicio analyze the rapid digital transformation in travel and hospitality driven by the deployment of digital travel credentials. Watch the recording here.

By Tim Spring

The Indicio Identity Community Meetup kicked off its first discussion of 2024 with Michael Zuriek, Head of Innovation for Digital Travel at SITA, and Heather Dahl, CEO of Indicio, diving into the ways verifiable credential technology will transform the experience of travel for everyone — passengers, airlines, airports, and governments.

First, some context. Indicio and SITA are partners who successfully developed and deployed a Digital Travel Credential (DTC) following International Civil Aviation Organization (ICAO) standards.

Why is this a revolutionary technology for travel?

A paradigm shift is coming: Seamless travel

For years, countries have been storing traveler data in centralized databases and using complex agreements and policies to share that information to make travel possible. It’s why there are so many hoops travelers have to jump through when crossing borders.

Now, with a Digital Travel Credential (DTC) that follows ICAO’s global standard for storing identity data on personal mobile devices, there is no need for this complex, cumbersome system to facilitate border crossing. 

Instead, the DTC allows security to simply scan the traveler’s digital information, in a fraction of the time of using a manual, paper-based system. And it provides a way to pre-authorize travel and check in before the traveler leaves home.

We have been conditioned to wait, but automation can dramatically reduce the bottlenecks in airports.

Dahl described using a DTC to pass through Aruba’s immigration as “almost magical”: the system was so efficient and seamless that the border guard had waved her through before she was even aware of what was happening. The DTC allowed her paperwork and identity to be verified ahead of time, and a biometric scan at the airport checked that she was the correct individual tied to that travel information.

Every traveler to Aruba will soon be able to have the same experience as the island implements DTC-enabled technology.

By using a DTC for all the touchpoints at an airport where identity needs to be checked, passengers get a much quicker experience, while airports, airlines, and governments benefit from the verified data, simplified privacy compliance, and enhanced security that comes from decentralized identity.

Airports invested significantly in biometrics when Covid hit, so we could see rapid adoption of the DTC, as some of the hardware is already in place.

The EU, US, IATA, and other governments and industry organizations are all currently working on bringing biometrics and digital travel together. Because Indicio and SITA’s DTC works on any airport system, deployment of the DTC will be rapid. The DTC makes biometric technology work much more efficiently and effectively, while simplifying data privacy compliance. And, on a practical level, as airports are an industry of fragmented IT systems, ease of adoption and integration is a critical value proposition. As many airports have biometric investments, the DTC is an easy way to capitalize on these investments and improve processes. 

These are just a few of the takeaways from the conversation; if you have time, I highly recommend you watch the full recording, which can be found here. Get in touch with Indicio today to get the DTC for your business.

Join us for the next meeting of the Indicio Identity Community Meetup, where we’ll be speaking with Klaeri Schelhowe, Executive Director of Trust Alliance New Zealand, about their Digital Farm Wallet Pilot Project, a tool that enables farmers, farm enterprises, and industry organizations to manage critical farm data in an efficient and secure way.

The post Indicio Community Meetup: looking at digital travel in 2024 appeared first on Indicio.


Shyft Network

The Shyft Perspective: Global Crypto Regulatory Outlook 2024

The European Securities and Markets Authority has urged member states to decide whether to opt into MiCA’s grandfathering scheme or not by June 30th, 2024. The first part of South Korea’s planned two-part crypto regulation draft will come into force in July this year. 2024 is going to be a regulatory intense year for the crypto space in the US, with the Clarity for Payment Stablecoins Act setting the stage for stablecoin regulations in the country.

Just weeks into 2024, the crypto world is already seeing major regulatory changes, with the most notable one being the US SEC’s approval of 11 spot Bitcoin ETFs. That was just the tip of the iceberg, though, as a lot more is to come this year in areas like issuance, services, stablecoins, and oversight, with a strong focus on AML and KYC measures.

Emerging technologies such as DeFi, NFTs, and DAOs will also be under the regulatory lens, especially in Europe, Japan, and parts of Africa this year. So, what major crypto regulatory changes are we to expect from these areas in 2024?

Europe

2024 will bring a wave of sweeping changes across the crypto landscape. The rollout of the Markets in Crypto Asset Regulation (MiCA) in EU states is reshaping the scene, even influencing economic partners like Switzerland. As this unfolds, European regulators are joining forces, aiming to streamline how they categorize, monitor, and handle the tech side of crypto assets, creating a unified approach across the continent, with different types of crypto assets getting their own unique set of rules.

European Union

As part of the Markets in Crypto Assets regulation rollout, the European Securities and Markets Authority is set to issue guidelines on crypto asset categorization, market monitoring, and technological infrastructure standards. By June 30th, 2024, EU states must decide on opting into the grandfathering scheme under MiCA, which permits crypto asset service providers to operate under national laws until MiCA authorization is obtained or denied.

This scenario could lead to varied regulatory approaches across Europe, which ESMA aims to harmonize. Furthermore, the Securities and Markets Stakeholder Group is advising ESMA to ensure MiCA’s alignment with Decentralized Finance (DeFi) by the end of 2024.

The United Kingdom

Instead of creating a standalone regulatory framework like the EU’s MiCA, the UK plans to incorporate crypto activities into its existing financial services framework, necessitating crypto firms to comply with traditional banking regulations. The Financial Conduct Authority will offer guidance for crypto businesses requiring authorization under this regime.

Different types of crypto assets, such as unique NFTs and already-regulated assets like security tokens, will also be distinctly regulated this year. Moreover, legislation to formalize this framework is anticipated in 2024, with transitional arrangements for businesses. As for stablecoins, the UK’s immediate regulatory emphasis is on fiat-backed ones used in payment systems, with legislation targeted for early 2024. Meanwhile, the regulation of Decentralized Finance remains under consideration.

Switzerland

Switzerland is implementing the Crypto-Asset Reporting Framework (CARF) this year, bringing digital assets under tax transparency regulations and necessitating due diligence on users by crypto service providers. Moreover, the Federal Department of Finance also plans to prepare a consultation draft for implementing these rules, targeting completion by the end of June 2024.

Asia Pacific (APAC)

Across the Asia Pacific region, governments and regulatory authorities are placing a heightened focus on strengthening the regulatory frameworks to ensure the security of digital assets and protect consumers. This includes the implementation of clear taxation policies for crypto assets, where profits from transactions are subject to taxation, and strict reporting requirements are imposed on crypto investors.

Governments are also emphasizing the importance of security and compliance, with regulations targeting anti-money laundering (AML), know-your-customer (KYC) procedures, and robust cybersecurity standards. Additionally, emerging technologies like DeFi, NFTs, and stablecoins are receiving regulatory attention, leading to the development of comprehensive policies and guidelines.

Japan

In 2024, Japan will continue its detailed approach to cryptocurrency regulation, focusing on consumer protection. This is in response to past breaches at Japanese exchanges, emphasizing the need for stringent security measures. We can also expect Japan to initiate its much-anticipated tax policy reforms to support the growth of crypto startups this year.

India

India has yet to finalize its approach to cryptocurrencies, with the anticipated cryptocurrency bill pending since 2021. The Reserve Bank of India remains skeptical of cryptocurrencies and favors CBDCs over other digital currencies, so major regulatory changes in India are unlikely in 2024.

South Korea

South Korea is working on a two-part crypto regulation framework for 2024. The first part, focusing on structuring the crypto market, will take effect in July 2024, whereas the second part, still under development, will establish rules for the issuance, listing, and delisting of cryptocurrencies. The country has also decided not to support crypto ETFs and is considering banning credit card purchases of cryptocurrencies. In addition, both officials and companies will need to disclose their crypto holdings starting next year.

Singapore

In 2024, the Monetary Authority of Singapore is set to focus on reducing speculative trading in cryptocurrencies, including banning cryptocurrency platforms from offering trading incentives. Moreover, starting in mid-2024, Singapore credit cards can no longer be used to purchase digital payment tokens.

Hong Kong

The Hong Kong Monetary Authority and the Securities and Futures Commission are considering applications for spot crypto ETFs and other virtual asset funds this year. Hong Kong is also expected to make significant progress in stablecoin regulation, with a consultation paper already published to seek public feedback. The regulator also plans to unveil a “sandbox” framework in 2024, enabling market participants to engage directly with the regulatory environment.

China

The People’s Bank of China recently emphasized the need for global crypto rules, maintaining its tough stance on cryptocurrencies. However, the country is pushing for standardization in the metaverse sector, collaborating with tech giants and government representatives, and has proposed a ban on converting virtual gaming tokens to real-world assets this year.

Australia

Australia plans to advance its crypto licensing framework in 2024, introducing a regulatory framework covering licensing and custody rules for crypto asset providers. This legislation, once approved, will include a 12-month transitional period. The Treasury and Reserve Bank of Australia will also publish a joint report on CBDC research this year, setting a roadmap for future work.

The Americas

Governments throughout the Americas are actively enhancing cryptocurrency regulations to address common concerns. They’re taking steps to combat issues like money laundering and fraud and safeguard investors.

A prominent focus is on regulating stablecoins, with measures to ensure that issuers maintain sufficient reserves and adhere to financial rules gaining traction.

Additionally, tax regulations concerning cryptocurrencies are being clarified, encompassing reporting requirements, capital gains taxes, and income tax obligations for crypto-related activities.

The United States

2024 in the US kicked off with the launch of the country’s first spot Bitcoin ETFs. Yet the SEC is maintaining its firm stance, not relenting in its rigorous crypto scrutiny. Laws for stablecoin issuers are also likely to emerge in 2024 following the House Financial Services Committee’s passage of the Clarity for Payment Stablecoins Act, which is currently awaiting consideration in the House of Representatives.

Another key focus for this year is the Financial Innovation and Technology for the 21st Century Act, which could shift oversight of much of the crypto market from securities regulation to commodities regulation, with the CFTC taking the lead. Similarly, the bipartisan Responsible Financial Innovation Act aims to classify most cryptos as commodities, shifting more responsibility to the CFTC and establishing stablecoin regulations.

Moreover, provisions of the 2021 Infrastructure Investment and Jobs Act took effect in January 2024, making it mandatory for entities to report crypto transactions over $10,000 to the IRS or face serious consequences. Lastly, the outcome of the US elections in November 2024 could significantly impact the crypto industry, as a crypto-friendly administration could bring much-needed clarity and legitimacy to the sector.

Canada

The Canadian Securities Administrators announced new regulations for public investment funds holding crypto assets in January this year, open for public feedback until April 17th, 2024. The new rule would permit only non-redeemable and alternative investment funds to handle crypto transactions directly, requiring other mutual funds to invest through these entities. It also specifies that the crypto assets must be fungible, insured, stored offline, and listed on a recognized Canadian exchange, with an annual custodian review.

Brazil

In Brazil, policymakers are working to tighten crypto regulation and bring brokerages under stricter supervision in 2024. Meanwhile, the newly enacted income tax regulations, effective January 1st, 2024, impose up to 15% tax on individuals earning over $1,200 from foreign-based exchanges, aiming to generate $4 billion in tax revenue. Also, the Brazilian central bank’s digital currency, DREX, is in its testing phase until February 14th, 2024, with a public launch planned by year-end.

Colombia

Colombian lawmakers are drafting a bill for comprehensive crypto regulation, with collaboration between the crypto ecosystem and government bodies like the Central Bank and the Financial Superintendency. The bill, expected to be presented publicly this year, will undergo analysis by the Executive and then Congressional approval.

Argentina

This year, Argentinian President Javier Milei, a pro-Bitcoin advocate, aims to introduce crypto regulation aligned with the IMF’s anti-money laundering approach.

Milei’s draft bill, the Law of Bases and Starting Points for the Freedom of Argentines, proposes favorable tax rates for declared domestic and foreign crypto holdings and legalizes their use. Tax rates start at 5% until March 2024, increase to 10% through June, and rise to 15% thereafter until September 2024.

Middle East & Africa

In the Middle East and Africa, governments are attempting to strike a balance between nurturing the growth of digital finance and addressing regulatory challenges. This trend involves a focus on developing structured frameworks for cryptocurrencies, especially stablecoins, reflecting an awareness of their growing economic impact. However, there’s also an emphasis on enhancing anti-money laundering and combating the financing of terrorism measures within the crypto sector.

The United Arab Emirates

This year, the UAE is on the brink of finalizing its stablecoin laws, with Dubai’s Virtual Asset Regulatory Authority leading the charge. VARA, the first of its kind in crypto supervision, has revamped its virtual asset rulebook to integrate regulations for fiat-referenced virtual assets (FRVAs), or stablecoins. These rules categorize cryptocurrencies, placing FRVAs in ‘category 1’. For stablecoin issuers, this means getting authorization and a license from VARA and adhering to its stringent regulations and the newly minted FRVA Rules.

The UAE’s Financial Services Regulatory Authority has also recently tweaked its AML and sanctions rules in line with the FATF Travel Rule, specifically targeting provisions related to wire transfers in the digital asset realm.

Saudi Arabia

As of yet, Saudi Arabia does not have a regulatory framework for cryptocurrencies. The country has banned banks from processing crypto-related transactions, while the governor of the Saudi Central Bank (SAMA) has underscored the need for adequate supervision, regulation, and coordination in virtual currency activities. Amidst this, SAMA has been experimenting through Project Aber, a joint collaboration with the Central Bank of the United Arab Emirates (CBUAE) that explores CBDCs for cross-border payments. Beyond these ongoing initiatives, little else is happening on the regulatory front in Saudi Arabia.

South Africa

Last year, on October 22nd, South Africa began recognizing cryptocurrencies as financial products and made it mandatory for all crypto exchanges to secure operating licenses by the end of that year. While the Financial Advisory and Intermediary Services Act currently governs them, crypto-related services will soon fall under the Conduct of Financial Institutions bill, set to become law in the coming years. Additionally, with the Financial Action Task Force closely monitoring South Africa, 2024 will bring more robust anti-money laundering and counter-terrorism financing measures to the crypto sector, aiming to help the country exit the greylist by 2025.

Nigeria

Nigeria, after lifting its 2021 crypto ban, has set out new rules for VASPs, focusing on strong KYC and AML checks. However, Nigerian banks still can’t trade or hold cryptocurrencies themselves. At the same time, Nigeria’s Securities and Exchange Commission is starting to handle license applications from crypto custodians and exchanges, showing a careful move towards more involvement in cryptocurrencies.

Kenya

After passing the Finance Bill 2023 and signing it into law, Kenya is ready for a comprehensive digital asset regulatory framework covering tax integration and revenue guidelines. The local industry lobby group, the Blockchain Association of Kenya (BAK), is tasked with preparing the first draft of a VASP bill referred to as the Crypto Bill by the National Assembly’s Departmental Committee on Finance and National Planning. With that, Kenya might become the first country where the industry’s representatives develop the regulatory framework for crypto.

Concluding Thoughts

In 2024, we’re at a crossroads as far as global crypto regulations are concerned. Across the globe, from the European Union to the United States, governments are crafting new rules, influencing how the world perceives and uses cryptocurrencies in everyday life.

While each country’s approach varies, it’s crucial to ensure that the underlying intent remains consistent: safeguarding user experience and security. The decisions made this year will not only shape the immediate future of cryptocurrencies but will also lay the groundwork for their integration into the mainstream global financial ecosystem. So, this year will go down in history as a defining moment for the crypto industry, one that could determine its trajectory for the years to come.

______________________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

The Shyft Perspective: Global Crypto Regulatory Outlook 2024 was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Veriscope Regulatory Recap — 22nd January to 4th February


Welcome to the latest edition of Veriscope Regulatory Recap! In today’s issue, we will cover the latest regulatory updates from all over the world from the last two weeks as we enter into the second month of 2024.

As crypto adoption grows, regulatory developments are also ramping up, with the US and Europe aiming for clearer guidelines. Meanwhile, in another part of the world, China remains unfriendly towards crypto while its special administrative region, Hong Kong, continues to issue regulatory guidelines in an attempt to become a crypto hub.

So, let’s take a deeper look into these regulatory advances being made around the globe and their impact on the crypto world.

The Global Landscape of Crypto Regulations

One of the major developments on the crypto regulatory front comes from the US, where lawmakers are trying to repeal the controversial accounting bulletin that imposes restrictions on companies holding their customers’ crypto assets.

In Asia, Hong Kong is making major crypto regulatory moves, as it is planning to consult on the regulatory framework for over-the-counter crypto venues “very soon.” China, meanwhile, is amending its anti-money laundering regulations for the first time since 2007 to cover crypto-related transactions.

Similarly, in Europe, the European Securities and Markets Authority, the securities regulator of the 27-member union, is seeking public feedback until the end of April to assess the criteria for classifying crypto assets as financial instruments.

With the proposed guidelines, the financial regulator aims to provide authorities and market participants with flexible conditions to determine whether a crypto asset can be categorized as a financial instrument, avoid having a one-size-fits-all approach, and ensure consistency across the region.

US Lawmakers Challenge the SEC on Crypto Custody Rule

Members of Congress, including Sen. Cynthia Lummis (R-Wyo.) and Reps. Mike Flood (R-Neb.) and Wiley Nickel (D-N.C.), have introduced resolutions to repeal the Securities and Exchange Commission’s 2022 Staff Accounting Bulletin No. 121.

The custody rule requires a company holding a client’s crypto assets to record them on its own balance sheet. This forces custodians holding crypto to maintain capital reserves to offset the risk, which could deter institutions and regulated banks from offering crypto custody options.

According to lawmakers behind the resolution, the legally binding directive was issued by the SEC without the required process of notice-and-comment, and hence, it must be repealed.

China to Introduce Revised Crypto AML Rules

Amid calls for more scrutiny, China is set to update its AML rules to cover cryptocurrency transactions. The revised rules are expected to go into effect in 2025.

Late last month, Premier Li Qiang chaired an executive meeting of the State Council to discuss the new draft of its AML regulations, including crypto provisions.

While the AML rule aims to curb illicit financial flows, it also acknowledges the ongoing engagement with cryptocurrencies despite official bans.

China won’t be the first to implement crypto-focused AML regulations; other countries have already adopted such rules in the form of the FATF Travel Rule. Recently, for example, the EU expanded its AML laws to cryptocurrencies, requiring obliged entities to report crypto transactions exceeding €1,000.

Interesting Reads

The Shyft Perspective: Global Crypto Regulatory Outlook 2024

EBA’s Amended Money Laundering & Terrorist Financing Guidelines Explained

__________________________

VASPs need a Travel Rule Solution to comply with the FATF Travel Rule. Have you zeroed in on it yet? Check out Veriscope, the only frictionless crypto Travel Rule compliance solution.

Visit our website to read more, and contact our team for a discussion.

Follow us on X (Formerly Twitter), LinkedIn, Telegram, and Medium for up-to-date news from the world of crypto regulations. To keep up-to-date on all things crypto regulations, sign up for our newsletter.

Veriscope Regulatory Recap — 22nd January to 4th February was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto regulatory affairs: UAE continues innovation push with digital dirham transfer

The United Arab Emirates demonstrated its commitment to financial sector innovation last week when it undertook the first cross-border transfer of its central bank digital currency (CBDC), the digital dirham. 



Entrust

Zero Trust 1975 Style


There’s been quite a buzz about Zero Trust in the last couple of years. Its genesis dates back to 2009, when the then Forrester Research analyst John Kindervag popularized the term when presenting the idea that an organization should not extend trust to anything inside or outside its perimeters. However, Zero Trust is still not a theme that resonates with all IT security experts. There are some who find it too fuzzy and non-deterministic, given that it is more akin to a philosophy or organizational strategy than a prescriptive certification that can be followed to the letter.

Zero Trust Principles – A Look Back in Time

We concluded that Zero Trust is firmly rooted in traditional well-established security design principles. We reached out to our Director of Product Security, Pali Surdhar, to get his view on Zero Trust. Pali forwarded a vintage document for our edification. The document “The Protection of Information in Computer Systems,” authored by Jerome H. Saltzer and Michael D. Schroeder, was published a mere 49 years ago in 1975. This was a time when computers were the preserve of universities, industry, and governments, and usually so big that they needed a large room to accommodate them. It was a time when Gates, Allen, Jobs, and Wozniak were just fledgling computer enthusiasts about to start their transformational journey in personal computing.

A quote from the abstract section of the paper gives a flavor of the content. “[this paper] examines in depth the principles of modern protection architectures and the relation between capability systems and access control list systems, and ends with a brief analysis of protected subsystems and protected objects.”

Security Principles Circa 1975

The paper then introduces three security categories:

Unauthorized information release
Unauthorized information modification
Unauthorized denial of use

These terms still resonate, although today we’d probably describe them as data breach, code hacking, and finally, Denial of Service, or perhaps a ransomware attack, where a bad actor has taken measures to lock out the authorized user.

The paper then provides examples of security techniques sometimes applied to computer systems. We thought it would be worthwhile translating them into today’s cybersecurity terminology.

1975: Labeling files with lists of authorized users
Today: Access Control List – defines who has access to a system or application. Authorization and authentication could apply here too.
Aligned with a Zero Trust principle? Yes

1975: Verifying the identity of a prospective user by demanding a password
Today: Password verification or even passwordless MFA.
Aligned with a Zero Trust principle? Yes

1975: Shielding the computer to prevent interception and subsequent interpretation of electromagnetic radiation
Today: Side-channel attacks. These typically require an antenna or probe within close proximity of a device to eavesdrop by analyzing electromagnetic signals.
Aligned with a Zero Trust principle? N/A

1975: Enciphering information sent over telephone lines
Today: Encryption, probably using TLS or similar. Refer to our last blog post Harvest Now, Decrypt Later – Fact or Fiction for more information.
Aligned with a Zero Trust principle? Yes

1975: Locking the room containing the computer
Today: Access control, defense in depth, secure room/vault, etc. These security measures are all still in use today.
Aligned with a Zero Trust principle? Yes

1975: Controlling who is allowed to make changes to the computer system (both its hardware and software)
Today: Least privilege principle – only giving access to those who have the correct clearance and need access to the system.
Aligned with a Zero Trust principle? Yes

1975: Using redundant circuits or programmed cross-checks that maintain security in the face of hardware or software failures
Today: High availability, with hardware devices having a backup device ready to continue service should a failure occur. Software can be designed with similar fault tolerance.
Aligned with a Zero Trust principle? N/A

1975: Certifying that the hardware and software are actually implemented as intended
Today: A combination of internal quality assurance, security testing, and approval of hardware or software design. In addition, the code or hardware could be submitted to a third party for penetration testing and product certification against industry standards such as FIPS 140-2/3 or PCI DSS. Given today’s complex software and hardware systems, knowing the provenance of the components is essential; this is achieved through bills of materials, code signing, and security processors in hardware.
Aligned with a Zero Trust principle? Yes
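To make the first row above concrete, here is a minimal Python sketch of a file "labeled with its list of authorized users", expressed as a modern access control list with a default-deny check. The resource, users, and permissions are hypothetical illustrations, not an Entrust API.

from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    acl: dict = field(default_factory=dict)  # maps user -> set of granted permissions

def is_authorized(resource: Resource, user: str, permission: str) -> bool:
    # Default-deny: without an explicit grant, access is refused.
    return permission in resource.acl.get(user, set())

# A 1975-style file labeled with its authorized users:
payroll = Resource("payroll.dat", acl={"alice": {"read", "write"}, "bob": {"read"}})

print(is_authorized(payroll, "alice", "write"))   # True
print(is_authorized(payroll, "mallory", "read"))  # False: never trust by default

The default-deny return in is_authorized is the same "never trust, always verify" posture that Zero Trust later formalized.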

So while the language used is reflective of the time it was written, we think you’ll agree that most of the ideas conveyed are security principles that equally apply in today’s modern distributed architectures.

NIST, CSA, and the CIA Triad

For those not inclined to read the 59-page NIST SP 800-207 document, here are the seven core tenets of Zero Trust:

1. All data sources and computing services are considered resources.
2. All communication is secured regardless of network location.
3. Access to individual enterprise resources is granted on a per-session basis.
4. Access to resources is determined by dynamic policy, including the observable state of client identity, application/service, and the requesting asset, and may include other behavioral and environmental attributes.
5. The enterprise monitors and measures the integrity and security posture of all owned and associated assets.
6. All resource authentication and authorization are dynamic and strictly enforced before access is allowed.
7. The enterprise collects as much information as possible about the current state of assets, network infrastructure, and communications, and uses it to improve its security posture.

Building on the NIST publication, the Cloud Security Alliance (CSA) has recently published its Zero Trust Guiding Principles. For anyone who wants to get up the learning curve on Zero Trust without digging too much into the weeds, it is an excellent read. It is also reassuring that the CSA Zero Trust Working Group are of the same opinion as us:

“Zero Trust is a collection of long-standing principles applied in a way that aligns the security architecture with the way we work and live.”

The CSA document then provides a list of the security principles critical to the success of your Zero Trust effort:

Concept of least-privilege access controls (e.g., preventative)
Separation of duties (e.g., preventative)
Segmentation/micro-segmentation (e.g., preventative)
Logging and monitoring (e.g., detective)
Configuration drift remediation (e.g., corrective/reactive)

While the term Zero Trust was coined by Kindervag 34 years after the Saltzer & Schroeder paper, we think you’ll agree that many of those principles align with the 1975 document.

And while things have certainly changed since 1975, when computers were monolithic and had limited networking capabilities (think remote users connecting in from terminals), the concerns about security were the same: protecting confidentiality, integrity, and availability of data. Hopefully most of you will be familiar with the CIA triad, another cybersecurity fundamental.

The Unchanging Need for Robust Security

Today, while the concerns remain the same, the complexity of compute environments has greatly increased. For example, in current environments, systems can be spread out across multiple clouds, software systems consist of many microservices, and data resides everywhere, just to name a few of the complexities. Additionally, today’s users not only consist of internal computer users sitting at their desks with directly connected terminals, but also potentially anyone, anywhere on the planet, accessing these complex systems via the internet.

The problem, and the strategy to mitigate it, have remained largely unchanged over the years; it is the scale of the problem that has changed. Saltzer and Schroeder knew that the identity of a user needed to be established, and they specified using a password for this. At the time, with limited access (no wide area network), this was sufficient. Today, users (and adversaries) can access systems from anywhere, so establishing identity requires something more than a strong password.

Furthermore, while Saltzer and Schroeder talked about locking their 1970s computer room (i.e., protecting the perimeter), they recognized that wasn’t in itself sufficient. In today’s world of distributed users and a multitude of devices wanting access, those physical walls have been replaced by a virtual perimeter, making it even harder to protect and further emphasizing the need for alternative approaches. Zero Trust therefore does not focus on perimeter defense, but instead advocates for strong identity (never trust, always verify) and micro-segmentation (to avoid lateral movement through the network). Sometimes what’s old is new, and that is because we have had a good thing all along.

Whether your organization is embarking on a Zero Trust journey or looking to strengthen its overall security posture by implementing more robust measures based on well-established security principles, consider reaching out to Entrust. Discover how we can assist you in securing your environment.

The post Zero Trust 1975 Style appeared first on Entrust Blog.


auth0

What are Verifiable Credentials and Why You Should Care About Them

Verifiable Credentials can be stored on digital devices, and you can use cryptography to verify their data and authorship. Let's learn more about them and why you should care about them.

KuppingerCole

Feb 29, 2024: Cloud Alphabet Soup - CNAPP

Organizations are using cloud services to develop and deploy new and existing applications. However, the responsibilities for security and compliance are shared between the CSP (Cloud Service Providers) and the cloud customer. The cloud user is responsible for implementing controls to meet their security and compliance obligations.

auth0

Customizations for Sign up and Login Now in Open Early Access

Customers and trial customers on Enterprise and Professional plans can now customize signup and login flows to address unique needs.

Sunday, 04. February 2024

KuppingerCole

Analyst Chat #200: 200 Episodes - A Cyberspace Odyssey


What a milestone! 200 episodes of the Analyst Chat. Matthias and Martin celebrate the journey of the three years of the podcast. Dive into the evolution of cybersecurity and identity management, industry impact, memorable moments, and key insights.

From this episode, we will come back to a weekly schedule, so stay tuned for the episode next Monday.



Saturday, 03. February 2024

Dark Matter Labs

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve…

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve currency

In this third blog of a 4-part series we are sharing a speculative future scenario based around a network of distributed bioregional banks. The blog series is centred on the following enquiries and this reflection aims to address question three.

1. What are the issues that make money (and our dominant monetary systems) so problematic? (Please see Blog 1)
2. Can we use design principles to help us imagine desirable future scenarios? (Please see Blog 2)
3. What could a desirable future scenario actually look like?
4. What can we start building and testing now to begin scaffolding a parallel system? (Please see Blog 4)

Part 3 (of 4): Imagining the distributed, bioregional banks of the future

In the previous two blogs we have been exploring monetary challenges and proposing some design principles that could underpin a response. In this post we are shifting to a more creative and imaginative space. The future is unlikely to unfold in a form that we can cognitively envision. If we accept that position then it might feel pointless to engage with imaginative scenarios or foresight exercises. On the other hand, it can be interesting to form a series of working hypotheses (or perhaps stories of change) that we can use as testing grounds for our current thinking. From there, we hope to move to a space of creative vulnerability where we dare to bring some of those thought experiments to life. It is highly likely that some of the ideas presented below won’t work. It is also likely that we will learn something important from each iteration, as we continually loop between what we can understand now and the edges of our potential.

This speculative proposition is centred on the idea of a series of distributed, bioregional banks. However, these next generation institutions will bear little resemblance to our current conception of a bank. In essence they will be stewardship interfaces that exist in service of the regenerative potential of their base region. When linked together via contextually respectful exchange rates they will realign our economic systems in service of planetary vitality. Ultimately the aspiration would be to create a new global reserve currency which itself is regenerative and decolonised by design.

What if a distributed bioregional bank actually existed?

It’s easy to find fault with our current banking system but what would we propose as an alternative? What would these distributed institutions actually do? Who would be involved and in what capacity? As a starting point to open this conversation up, we have been thinking about three stewardship functions that could be part of the remit of these next generation banks. We are framing these ideas as a story that is unfolding to try and illustrate how it might feel to interact with these future banks.

Image: Dark Matter Labs¹
1. Multivalent currencies

The future banks will be part of a vibrant ecosystem of decentralised multivalent currencies. The issuance and use of these diverse, infinite and resource-backed tokens will build a dynamic understanding of the integrity of the ecosystem as a whole. The key functions of the tokens will be fourfold:

1. To demonstrate invisible or overlooked value flows;
2. To gather sensory inputs from different elements of the ecosystem;
3. To account for individual and collective contributions to the ecosystem’s health;
4. To form the basis for the overall RegenCoin for the bioregion.

To illustrate how this might work we can use the example of a river printing its own solidarity tokens. People will receive river tokens in recognition of acts of care provided to the river. This might take the form of collecting rubbish from the river banks, introducing children to the joy of wild swimming or reducing the level of pesticides flowing into the water. Initially the system might be overseen by human guardians who vouch for such actions and communicate them to the Bank. As technologies are refined for this new way of living, a time will come where the river will be able to issue and destroy its own tokens via multi-sensory inputs.

The motivations for people to value the river tokens will be diverse. In the early days, it might be linked to respect and peer approval. Later, the tokens might become part of a blended measure of contribution that forms the overall currency for the region.

2. Contextually respectful exchange

The bank will provide a monetary stewardship function that will restrict the conversion of base assets into convertible tokens so that they align with the regenerative rhythms of the bioregion. For example:

The Bank could regulate the sales of finite elements both within and outside of the bioregion.

The Bank could restrict the issuance of new debt to align with the underlying regenerative health of the region. Different types of debt could be issued that could only be used to buy or invest into specific classes of living agents (‘assets’ in today’s terminology). For example, on aggregate the Bank would only issue debt for lumber processing activities that were forecast to be at a regenerative level for the region’s forests over the term of the loan.

The Bank could designate taxes to be paid in a mix of specified tokens. This mix could be rebalanced periodically to incentivise the behaviours which are optimal for the fluid health of the ecosystem.
3. Probabilistic, participatory governance models

The Bank’s governance systems will be based on Bayesian inference models. This is a form of dynamic statistical modelling that allows an initial set of assumptions to continually update depending on the information it receives. Critically, the output will be a blended result combining automatic sensory data inputs with the emotional values of the living community. For example, if a community upregulates the importance of river health then the overall value of the RegenCoin would fall if other activities were negatively impacting it.
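To make the updating mechanics concrete, here is a minimal, illustrative Python sketch of one Bayesian update step, assuming a simple Beta-Binomial model over binary "healthy/degraded" sensor readings. Every name and number here (the prior, the community weight, the RegenCoin index formula) is a hypothetical assumption for illustration, not Dark Matter Labs' design.

# Prior belief that the river is healthy: Beta(alpha, beta).
alpha, beta = 2.0, 2.0  # weakly informative prior

# Each sensory input is a binary "healthy" (1) or "degraded" (0) reading.
sensor_readings = [1, 0, 0, 1, 0, 0, 0]

# Bayesian update: the posterior of a Beta prior with Bernoulli
# observations is again a Beta distribution.
alpha += sum(sensor_readings)
beta += len(sensor_readings) - sum(sensor_readings)

posterior_health = alpha / (alpha + beta)  # posterior mean

# The community "upregulates" how much river health matters (0..1).
community_weight = 0.8

# Blend the ecological signal into a coin value index: low posterior
# health combined with high community weight pulls the index down.
regencoin_index = 1.0 - community_weight * (1.0 - posterior_health)

print(f"posterior river health: {posterior_health:.2f}")
print(f"RegenCoin index: {regencoin_index:.2f}")

Each new batch of readings simply repeats the update, so the initial assumptions are continually revised as information arrives, which is the property the governance model relies on.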

These future bank activities can feel a bit flat and detached without a contextual grounding. One idea to bring them to life is to imagine how they could play out under different scenarios. We are suggesting two examples below and would be very curious to hear other ideas or interpretations:

A house owner gifts their property back to nature: in this example a resident of the Bioregion decides that the area where their house has been built should be re-wilded in the interests of the bioregion’s health. When the resident nullifies their previous ownership rights, it will trigger a series of sensory credits to their RegenWallet. The regenerative generosity of the act will be upregulated by human and more-than-human voices. The result will be an amount of newly created RegenCoin being available to build a house in a more sensitive location.

A company tries to game the system to create rights over carbon credits: in this more negative example, we can imagine that a profit-seeking company might want to manipulate the system for private gains. Their strategy might be to plant a dense area of trees to maximise their carbon credibility. However, as the sensory inputs are invisible, they would not know how their choice of tree species or location would be valued by the soil, pollinators or the trees themselves. In seeking to generate profits rather than regenerative potential they might in fact generate a negative return.

Sitting in front of a computer in 2024, the scenario that we have outlined above might seem far-fetched. If you take a deep breath, soften your gaze and meditate for a second on the extraordinary, complex beauty of life, perhaps it feels a little less fanciful. So much of the living world is beyond our cognitive understanding but entirely open to our sensory perception. To us this feels like both an invitation and a call to action. We already have the technological power of gods and we can choose to direct it in service of life. Grounding this aspiration in a practical context is the next important step in this journey. In our final blog we will therefore begin exploring tangible elements of the scenario that we can start to build and test now.

We are incredibly heartened and grateful for all the messages that we have received so far in response to these blogs. We are still reading through some of the material and will aim to incorporate (and credit) these generous contributions wherever possible. One suggestion is to convene some online studios where we can begin sharing and exploring ideas — we are fully open to others.

This blog was written by Emily Harris (emily@darkmatterlabs.org). The visual design was developed by Sofia Valentini (sofia@darkmatterlabs.org) and Madelyn Capozzi (maddy@darkmatterlabs.org).

The initiative sits within Dark Matter’s Next Economics LAB.

References:

This scenario was partly inspired by some thoughts shared by Gregory Landua on episode 63 of the Planetary Regeneration Podcast.

Towards multivalent currencies, bioregional monetary stewardship and a distributed global reserve… was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 02. February 2024

1Kosmos BlockID

Vlog: 1Kosmos Adds Passwordless Authentication to Amazon Cognito


Join Rob MacDonald, VP of Product Marketing, and Huzefa Olia, COO of 1Kosmos, as they unveil the exciting integration news with Amazon Cognito. Learn about the significance of 1Kosmos becoming an AWS advanced technology partner and the seamless deployment options for developers. Explore how this collaboration enhances user experience, reinforces identity management, and sets the stage for the future of secure digital identity.

Robert MacDonald:

Hi, I am Rob MacDonald, vice president of product marketing here at 1Kosmos, and today I’m joined by Huzefa, co-founder and COO of 1Kosmos. How are you doing today Huzefa?

Huzefa Olia:

I am doing great, Rob. It’s a Friday, so never better.

Robert MacDonald:

It’s Friday, you’re in New Jersey, I heard it snowed a little bit this week, I hope you’re keeping warm.

Huzefa Olia:

When you talk about snow, you’re in Canada, so I cannot talk to you about snow. We got a little bit of a sprinkling, that’s it.

Robert MacDonald:

Fair enough. All right, well listen, today, thanks for joining us, or joining me. We just made a recent announcement of a new integration partner, Amazon Cognito. So, before we talk about all the goodness of what we’re doing with Cognito, why don’t you tell me a little bit about what it is? What is Cognito?

Huzefa Olia:

So, Cognito is a single sign-on platform, an IdP that Amazon Web Services provides. Especially if you are building your applications on the AWS platform, this is the native integration that AWS provides out of the box. So, Cognito is widely used across individual applications that may be developed and have their own ecosystems within the AWS network.

Robert MacDonald:

Cool. So, interesting. Where is AWS deployed within an organization? So, you said it’s an IdP, I think everybody that’s probably listening to these understands what an IdP is, but where do you typically find it? Because, when we think workforce in particular, there’s Okta and Microsoft and ForgeRock and Ping. Where’s Cognito, where do you find that?

Huzefa Olia:

They primarily operate in the customer identity and access management, or CIAM, space today. They have not published their deployment numbers, but it’s pretty vastly adopted. For any kind of public-facing application hosted on AWS that doesn’t have a custom integration with Okta or Microsoft, Amazon says: here you go, you can use AWS Cognito.

Robert MacDonald:

So, the primary use case for this would be a CIAM type of initiative, somebody’s put some application on AWS, and then this is the identity management portion of that application, is that what I understood?

Huzefa Olia:

Absolutely.

Robert MacDonald:

Makes sense. So, what motivated either 1Kosmos to partner with AWS or AWS to partner with 1Kosmos? How did that come to be?

Huzefa Olia:

So, identity is our motivation. We’ve made some significant deployments in the CIAM space recently. 2023 has been a great year for us in that particular sense. And our customers were essentially looking for, hey, do you have these integrations with AWS Cognito that you provide out of box? Their footprint is pretty vast, which I got to know. And that has led to us building this integration with them and providing a more out of box solution with them as well.

Robert MacDonald:

So, that’s amazing. Everybody knows AWS, everybody knows Amazon, and it’s cool to know that they’re doing this part of their business as well. So, we are now considered an AWS advanced technology partner. So, what specifically does that mean?

Huzefa Olia:

So, our integration today with AWS is around us being a strong authentication provider for them. Meaning, if you need to do multifactor authentication, if you want to go passwordless, if you want to have passkeys that you can deploy on your application supported by AWS Cognito, here you go: we are the partner of choice that you can go to. It’s pretty significant, because most applications are now being mandated from a compliance standpoint to have either strong MFA or passwordless passkeys in their application deployment stack, and that’s where our integration with AWS Cognito becomes extremely important. I just wanted to add that we want to continue this particular integration and develop further stories around the other pieces that we have, such as identity proofing, but that is more in the future for us.

Robert MacDonald:

So, based on what I’ve read on the Amazon Cognito site, they have more of these advanced technology partners. It looks like organizations that are aligned to that have gone through some sort of testing, is that correct? By Amazon?

Huzefa Olia:

Yeah, integration as well as testing. So, while anything that we’ve done with AWS, it goes through significant scrutiny and review. So, even this particular integration that we put together, we had to document it. The entire GitHub of this particular technical integration is available. It is referenced in a blog that is published by the AWS community that goes out to anybody, the AWS customer. You can essentially go back and look at it. There are references to the GitHub link as well and what that integration is. But, anyway, all of this was not put together by us. We provided the integration story, but it was reviewed through multiple different channels and reviews. I cannot even recall how many by AWS and published by them.

Robert MacDonald:

Well, that’s amazing. So, somebody looking at adding our technology into that stack can be pretty much rest assured that it’s been tested by Amazon to make sure that it works with their technology, and that the integration and implementation of us with Cognito would be seamless.

Huzefa Olia:

Absolutely. So, if you’re a developer, if you’re watching this, if you have AWS Cognito, feel free to look at the blog. Maybe we can reference it into the link as well when we post this. But, it’s pretty simple, straightforward. You would essentially put a custom auth plugin with 1Kosmos, and there you go. You’ll be able to provide strong MFA or passwordless authentication to your end customers.

Robert MacDonald:

Awesome. We take user experience pretty seriously here at 1Kosmos. I guess, I’m sure you would agree with me on that. Everything we do revolves around that and basically privacy. But, it’s commonly known that users will go elsewhere if it’s too hard, if what they came to do wasn’t easy. How does 1Kosmos help Cognito users provide a better experience when they’re dealing with their end users?

Huzefa Olia:

Absolutely. I always want to highlight that there are two experiences. One is strong MFA, because we believe you don’t have to take your organization from password to passwordless from day one to day two. You may have a transition period, so strong MFA becomes important. We have factored that experience into the entire channel: you authenticate in Cognito, and then a dialogue box opens up, which surfaces whatever strong MFA options 1Kosmos provides to you, push notifications, OTP, TOTP, et cetera. For our passwordless and passkeys, we provide an SDK, which again seems seamless from an end user standpoint: all they’re doing is entering their username, backend API calls contact the 1Kosmos console or gateway, we check whether this particular authentication is valid, and we give the signal back to AWS.

Robert MacDonald:

Listen, I think it’s amazing. So, when we look at Amazon Cognito customers, we touched on it earlier, but how easy, or a developer, how easy is it for developers to deploy 1Kosmos into a Cognito instance?

Huzefa Olia:

So, like I said, you can sign up for a 1Kosmos instance by going to a developer environment. If you have a Cognito instance, you can sign up for a free account for Cognito as well. That is available on the AWS site. There are instructions to develop a custom auth plugin. And on that particular blog we also highlight, and the GitHub repository, what are the steps that you need to essentially use to integrate AWS Cognito with 1Kosmos.
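For readers who want a more concrete picture of the mechanics described here: Amazon Cognito’s custom authentication flow is driven by Lambda triggers, and the sketch below shows a hypothetical VerifyAuthChallengeResponse trigger in Python that delegates verification to an external IdP. The endpoint, payload, and verify_with_idp helper are illustrative placeholders, not the actual 1Kosmos integration; see the GitHub repository mentioned above for that.

import json
import urllib.request

IDP_VERIFY_URL = "https://idp.example.com/api/verify"  # hypothetical endpoint

def verify_with_idp(username: str, challenge_answer: str) -> bool:
    # Ask the external IdP whether this authentication attempt is valid.
    payload = json.dumps({"username": username,
                          "assertion": challenge_answer}).encode()
    req = urllib.request.Request(IDP_VERIFY_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("verified", False)

def lambda_handler(event, context):
    # Cognito invokes this trigger after the user answers the custom
    # challenge; setting answerCorrect signals success or failure back
    # to the Cognito custom auth flow.
    username = event["userName"]
    answer = event["request"]["challengeAnswer"]
    event["response"]["answerCorrect"] = verify_with_idp(username, answer)
    return event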

Robert MacDonald:

So, then as a follow on to that question, that obviously sounds relatively easy. How would customers acquire an instance of BlockID? So, it’s like, I like it, I installed it at work, and the testing developer did all the great work. How do I then pay for the instance that I chose?

Huzefa Olia:

We want to make the entire experience easy. So, if you’re a customer in the AWS ecosystem, we are listed on the AWS Marketplace. We have three distinct product lines that are available. So, you can choose any one of them. Most likely it will be around BlockID customer or verify. And you can sign up for the product through AWS Marketplace. You would get your instance, your license keys, et cetera, through that particular portal and you’ll be ready to go.

Robert MacDonald:

That’s cool. Listen, Huzefa, I know that you’re super busy as always, and I wanted to thank you for taking the time today to sit down and tell us about this exciting announcement that we made this week. And I wish you the best, and I hope you come back and talk to me again on a vlog soon.

Huzefa Olia:

Look forward to it, Robert. Thank you.

Robert MacDonald:

So, listen, if you want to find out more about this integration, you can check out our website. We’ve got a data sheet and a press release there that you can read to learn more about this Cognito integration. And then, plus, as Huzefa mentioned earlier, there’s an Amazon Cognito blog that you can read as well. We’ll put the links to all of those down below. Thanks again everybody. We’ll see you again shortly.

The post Vlog: 1Kosmos Adds Passwordless Authentication to Amazon Cognito appeared first on 1Kosmos.


Tokeny Solutions

Tokeny’s 2024 Products: Building the Distribution Rails of the Tokenized Economy


Product Focus

Tokeny’s 2024 Products: Building the Distribution Rails of the Tokenized Economy

This content is taken from the monthly Product Focus newsletter in January 2024.

2023 was transformative for Tokeny: our business was booming, and we received a strategic investment from Apex Group, one of the biggest fund administrators and servicers. From a product perspective, it was also an incredible year for us, as we made significant strides across critical dimensions, focusing on scalability, connectivity, security, and composability.

We achieved notable milestones, including the launch of the Dynamic NAV feature, the Multi-party approval transfer feature, and the Unified Investor App, ensuring scalability to meet evolving market needs. We also enabled centralized exchange connectivity and delivered a complete custody solution supporting any connected and integrated wallets, to enhance both connectivity and scalability. Regarding security, we conducted regular tests and achieved SOC2 Type I Compliance. Lastly, our network-agnostic tokenization platform supporting multi-chains further emphasizes our dedication to composability.

In 2024, our product vision is to continue delivering solutions that empower not only issuers, but all ecosystem participants in the tokenization journey to cultivate liquidity and accumulate more users. In our view, to achieve the liquidity endgame, we have to build it up through a few phases: standardization, digitalization, collateralization, distribution, and then reach the liquid secondary market.

Below, I’ll explain how we are turning our product vision into actionable steps to address your technical and operational pain points, smoothly transitioning to new phases, and adding value and services, all while improving liquidity for your investors.

Accelerating standardization phase: The industry now embraces the official token standard ERC-3643 for compliant tokenization. We’re working closely with the ERC3643 Association to swiftly advance standardization. Our focus includes developing key tools like the ERC-3643 DApp, streamlining the effortless deployment and management of ERC3643 tokens, and setting a new benchmark for simplicity in compliant tokenization.

Digitizing real-world assets and securities en masse: 

1. Speeding up time to market: While our contributions to standardization are open-source, the value of Tokeny’s solutions extends beyond smart contract deployment (more details here). We are committed to enhancing our APIs and white-label SaaS solutions. Tailored products for diverse segments, ranging from asset managers to Web3 RWA projects, are in development to catalyze the RWA wave and advance the asset digitization phase.

2. Delivering immediate benefits: Accelerating towards liquidity begins with addressing immediate pain points for institutions. Why tokenize now? It’s all about enhancing operational efficiency and streamlining processes with third-party applications to slash costs. By solving these pain points, we can set the stage for bringing enormous volumes of high-quality assets on-chain, laying the foundation for the critical mass needed for advanced services, and emphasizing transferability, reach, and liquidity.

Improved operational efficiency: Manual, human-based processes are not just time-consuming but also costly, a common challenge across the entire private market. To tackle this, we’re developing features and smart contracts for various business cases, automating workflows from onboarding and corporate actions to secondary trading. Automation increases operational capability, empowering institutions to reach a larger investor base.

Bridging third-party digital applications: Another notable challenge arises from the fragmented collaboration between various applications, each operated by different third parties. The multitude of digital applications involved in operations can create gaps when assets are brought on-chain. To address this, our products will continue to connect these disparate solutions, ensuring a cohesive operational workflow between tokenized assets and the involved providers.

3. The powerhouse for major corporations: Our strategy revolves around seamless integration into legacy systems and forming partnerships with tech providers, such as custody solutions, integrators, and financial platforms. Through strategic ecosystem integrations, we empower partners to seamlessly utilize our engine, allowing them to concentrate on their core business and sales while we manage all technical aspects.

Enabling DeFi by building a common data set for private markets: To accelerate permissioned DeFi, this month we co-developed an open-source UI component with Dev.Pro. With this tool, DeFi is now equipped for regulated assets through ERC-3643, enabling any decentralized exchange (DEX), automated market maker (AMM), or lending protocol to provide compliant liquidity or collateralization for tokenized securities and RWAs. Regulated entities can even launch their own lending smart contracts. Our next move involves building a universal data set for tokenized securities through a product named AssetID, facilitating easy access to verifiable asset data for DeFi applications.
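As a rough illustration of the kind of eligibility check a DeFi protocol could run against an ERC-3643 token, here is a Python sketch using web3.py. The RPC endpoint and addresses are placeholders, and the two-function ABI fragment reflects our reading of the ERC-3643 interfaces (the token's identityRegistry() accessor and the registry's isVerified() check); consult the standard itself for the authoritative definitions.

from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC

TOKEN_ABI = [{
    "name": "identityRegistry", "type": "function", "stateMutability": "view",
    "inputs": [], "outputs": [{"name": "", "type": "address"}],
}]
REGISTRY_ABI = [{
    "name": "isVerified", "type": "function", "stateMutability": "view",
    "inputs": [{"name": "_userAddress", "type": "address"}],
    "outputs": [{"name": "", "type": "bool"}],
}]

def investor_is_eligible(token_address: str, investor: str) -> bool:
    # Look up the identity registry attached to the ERC-3643 token...
    token = w3.eth.contract(address=token_address, abi=TOKEN_ABI)
    registry_address = token.functions.identityRegistry().call()
    # ...and ask whether the investor's on-chain identity passes the
    # token's compliance rules before quoting them liquidity.
    registry = w3.eth.contract(address=registry_address, abi=REGISTRY_ABI)
    return registry.functions.isVerified(investor).call()

# Placeholder, checksummed addresses for illustration only:
# investor_is_eligible("0x0000000000000000000000000000000000000001",
#                      "0x0000000000000000000000000000000000000002")

Because the check is a read-only contract call, a DEX or lending protocol can run it before a trade rather than discovering a compliance failure when the token transfer reverts.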

Building the distribution rails: DINO is an interoperable distribution network for digital securities that leverages the ERC-3643 standard. Our commitment to growth extends to integrating additional distribution channels into the DINO ecosystem. Compatibility with centralized exchanges (CEX) and AMMs is seamlessly established on our platform. As we eagerly await more DeFi protocols, the open-source UI component stands ready to welcome them. Think of us as a connector, ensuring a seamless convergence of all distribution rails on our platform.

Secondary market innovation: We will unveil an upgraded version of the Billboard, serving as a license-exempted P2P secondary market solution, enabling investors to seamlessly publish buy or sell intentions, just like placing an ad. Taking it a step further, we will introduce a fully decentralized version of the Billboard, entirely powered by smart contracts. This will allow counterparts to accept an offer directly on-chain, facilitating a peer-to-peer transaction. The offer can circulate across the entire distribution network, visible to all, but only eligible investors for the concerned tokens can accept it.

Exciting times lie ahead, and we look forward to embarking on this journey with you. Stay tuned for more developments throughout the year.

Subscribe Newsletter

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Other Product Focus Blogs:

Tokeny’s 2024 Products: Building the Distribution Rails of the Tokenized Economy (2 February 2024)
ERC-3643 Validated As The De Facto Standard For Enterprise-Ready Tokenization (29 December 2023)
Introducing Multi-Party Approval for On-chain Agreements (5 December 2023)
The Unified Investor App is Coming… (31 October 2023)
Introducing WalletConnect V2: Discover the New Upgrades (29 September 2023)
Tokeny becomes the 1st tokenization platform to achieve SOC2 Type I Compliance (1 September 2023)
Permissioned Tokens: The Key to Interoperable Distribution (28 July 2023)
A Complete Custody Solution for Tokenized Securities (28 June 2023)
Network-agnostic tokenization platform for enterprises (26 May 2023)
Introducing Dynamic NAV Updates For Open-Ended Subscriptions (25 April 2023)

Tokenize securities with us

Our experts with decades of experience across capital markets will help you to digitize assets on the decentralized infrastructure. 

Contact us

The post Tokeny’s 2024 Products: Building the Distribution Rails of the Tokenized Economy appeared first on Tokeny.


KuppingerCole

Embracing Decentralized Identity at EIC 2024


by Marius Goeddert

This week, we are thrilled to unveil six distinguished speakers who are at the forefront of the decentralized identity revolution. Their insights and expertise will shape discussions at the European Identity and Cloud Conference, paving the way for a new era of digital identity management.

Why Decentralized Identity?

Decentralized identity represents a paradigm shift, empowering individuals with control over their digital personas. No longer confined to centralized databases, identity becomes user-centric, enhancing privacy and security. Join us in exploring how this transformative approach is reshaping the way we authenticate, authorize, and interact online.

At this year's conference, we explore the practical applications of decentralized identity. Our carefully curated sessions will explore real-world use cases, showcasing successful implementations across industries. From secure authentication processes to streamlined access control, each presentation will offer valuable insights into harnessing the power of decentralized identity.

Experts in Decentralized Identity at EIC 2024

Misha Deville is a co-founder of Mailchain, enabling secure communication between decentralized identifiers. She works with companies that are driving innovation in web3 and in traditional organizations to collaboratively build the necessary tools to drive adoption of SSI technologies.

Dominik Beron is the founder/CEO of walt.id, an open-source company that is building decentralized identity solutions for businesses and governments. Over the last years, he also served as an identity expert to the European Commission and EU member states.

Kim Hamilton Duffy is the Executive Director at the Decentralized Identity Foundation. As an expert and leader in the emerging decentralized identity space, she is driving standards and implementations through direct technical contributions and her leadership roles in the decentralized identity communities, including the W3C Credentials Community Group and the Decentralized Identity Foundation.

Rintaro Okamoto is the Head of Business Development for Decentralized Identity at DNP, working on developing use cases and business/technology validation. In 2023, the 'Trust Collaboration Service across Mutual Aid Apps' was selected for the Digital Agency's 'Use Case Demonstration Project for the Realization of the Trusted Web'.

Dr. Carsten Stöcker is co-founder and CEO of Spherity where he is building decentralized digital identity management solutions to power the fourth industrial revolution. He serves as a Council Member of Global Future Network for the World Economic Forum.

Kristina Yasuda is an Identity Standards Architect at Microsoft, known for her work on standards in decentralized identity ecosystems: as an editor of OpenID for Verifiable Credentials specifications in OIDF, Selective Disclosure for JWTs draft in IETF, JWT-VC Presentation Profile in DIF; as a chair of Verifiable Credentials Working Group in W3C; and as a member of ISO/IEC JTC1/SC17 working on mobile driving license.

Don't miss the chance to connect with industry leaders, engage in thought-provoking discussions, and gain a comprehensive understanding of the role decentralized identity plays in shaping the future of digital interactions.

Secure your spot now and be part of the conversation that is shaping the future of identity in the digital age!


Ocean Protocol

DF74 Completes and DF75 Launches

Stakers can claim DF74 rewards. DF75 runs Feb 1 — Feb 8, 2024

1. Overview

Ocean Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by locking OCEAN, curating data, and making predictions (in Predictoor). Here are DF docs.

Data Farming Round 74 (DF74) has completed. 150K OCEAN + 20K ROSE was budgeted for rewards. Rewards counting started at 12:01 am Jan 25, 2024 and ended at 12:01 am Feb 1. You can claim rewards at the DF dapp Claim Portal.

DF75 is live today, Feb 1. It concludes on Feb 8. 150K OCEAN and 20K ROSE are budgeted in total for rewards.

This post is organized as follows:

Section 2: DF structure
Section 3: How to earn rewards, and claim them
Section 4: Specific parameters for DF75

2. DF structure

Passive DF. As a veOCEAN holder, you get passive rewards by default.

Active DF has two substreams.
– Volume DF. Actively curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), to earn more.
– Predictoor DF. Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

There are three ways to earn and claim rewards: Passive DF (like before), Active DF: Volume DF (like before), and Active DF: Predictoor DF (new).

Passive DF. To earn: lock OCEAN for veOCEAN, via the DF webapp’s veOCEAN page. To claim: go to the DF Webapp’s Rewards page; within the “Passive Rewards” panel, click the “claim” button. The Ocean docs have more details.

Active DF
– Volume DF substream. To earn: allocate veOCEAN towards data assets, via the DF webapp’s Volume DF page. To claim: go to the DF Webapp’s Rewards page; within the “Active Rewards” panel, click the “claim” button (it claims across all Active DF substreams at once). The Ocean docs have more details.
– Predictoor DF substream. To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors. To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs. To claim ROSE rewards: see instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF75

This round is part of DF Main, phase 1.

Budget. This round has 150,000 OCEAN + 20,000 ROSE rewards total. That OCEAN and ROSE is allocated as follows:

Passive DF: 50% of rewards = 75,000 OCEAN
Active DF: 50% of rewards
– Predictoor DF. 50% = 37,500 OCEAN + 20,000 ROSE
– Volume DF. 50% = 37,500 OCEAN

Networks. Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

Volume DF rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: highest-DCV asset gets most OCEAN, etc. Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.
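
Expressed as a sketch in Python (illustrative only: the 1/rank weighting, the weekly slice of the 125% APY cap, and the per-holder application of the DCV bound are simplifying assumptions; the authoritative logic lives in Ocean's Data Farming repositories):

    def volume_df_rewards(assets, total_ocean):
        """assets: list of dicts with 'dcv', 'publisher', and 'stakes' {holder: veOCEAN}."""
        # Distribute OCEAN across assets by DCV rank (1/rank weighting is an assumption).
        ranked = sorted(assets, key=lambda a: a["dcv"], reverse=True)
        weights = [1 / (i + 1) for i in range(len(ranked))]
        pool = [total_ocean * w / sum(weights) for w in weights]

        payouts = {}
        for asset, ocean_for_asset in zip(ranked, pool):
            # Publishers get 2x effective stake.
            eff = {h: s * (2 if h == asset["publisher"] else 1)
                   for h, s in asset["stakes"].items()}
            total_stake = sum(eff.values())
            for holder, stake in eff.items():
                baseline = stake / total_stake * ocean_for_asset   # % stake * OCEAN for asset
                apy_cap = asset["stakes"][holder] * 1.25 / 52      # weekly slice of 125% APY
                dcv_cap = asset["dcv"] * 0.001                     # 0.1% of DCV (anti wash-consume)
                payouts[holder] = payouts.get(holder, 0) + min(baseline, apy_cap, dcv_cap)
        return payouts

    example = [
        {"dcv": 10_000, "publisher": "alice", "stakes": {"alice": 500, "bob": 1_500}},
        {"dcv": 4_000, "publisher": "carol", "stakes": {"carol": 2_000}},
    ]
    print(volume_df_rewards(example, total_ocean=37_500))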

Predictoor DF rewards are calculated as follows:

First, DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in Active DF: tuning substreams and budget adjustments among substreams. What remains constant is passive DF, and the total OCEAN rewards emission schedule.

Updates are always announced at the beginning of a round, if not sooner.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF74 Completes and DF75 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 01. February 2024

KuppingerCole

Zero Trust Unveiled: Securing Critical Data in SAP, CAD, and PLM Systems

John Tolbert, Director of Cybersecurity Research and Lead Analyst at KuppingerCole Analysts, will provide an overview of the challenges and regulatory drivers of protecting sensitive unstructured and structured data such as found in CAD, PLM, and SAP systems. He will also discuss the Information Protection Lifecycle, methods for identifying and classifying information objects, access control architectures, and enforcement methods. 

Markus Nüsseler-Polke, Head of Consulting & Projects at Secude, will provide an overview of Secude’s HaloCORE and HaloCAD solutions and how they implement zero-trust principles, extending Microsoft Purview Information Protection (MPIP) to safeguard these key systems.




liminal (was OWI)

Facial Biometrics: Trends and Outlook

Facial biometrics technology is shaping the future of identity verification, access control, and public safety. As this market grows, poised to reach a staggering $42.1 billion by 2028, it’s essential to understand the drivers, challenges, and opportunities in this dynamic field. Key insights reveal a technology in flux – widely adopted across various sectors yet contending with regulatory hurdles and the rise of AI-generated deepfake fraud. 

The growth of facial biometrics is driven by its effectiveness in streamlining and securing processes across finance, healthcare, and government sectors. This growth is primarily attributed to its ability to improve user experience and security across various domains. 75% of US consumers aged 18-34 reported adopting facial biometrics in 2022. However, this growth has its challenges. Heightened regulatory scrutiny poses significant compliance burdens, particularly from BIPA and the EU. Lawsuits under BIPA have surged, with substantial settlements highlighting the legal risks of non-compliance. Additionally, the rapid advancement of AI technologies, notably deepfake capabilities, presents a new frontier of threats, demanding continuous innovation and investment in defensive strategies.

This backdrop raises a critical question: How can facial biometrics vendors navigate the evolving regulatory landscape and technological threats while capitalizing on the growing market demand?

The answer lies in a dual strategy focusing on compliance and innovation. Vendors must adapt swiftly to regulatory changes, ensuring their solutions align with laws like the Biometric Information Privacy Act (BIPA), California Consumer Privacy Act (CCPA), and the European Union Artificial Intelligence Act. Those who quickly achieve compliance can avoid costly legal pitfalls and carve a niche in this competitive market.

Simultaneously, vendors must invest in differentiating their offerings. As the market matures, the emphasis shifts towards advanced features like liveness detection, anti-spoofing, and protection against deepfake attacks. These capabilities will not only offer a competitive edge but also address the burgeoning security concerns associated with AI technologies.

Despite the regulatory and technological challenges, the facial biometrics market is ripe with opportunities. The increasing demand for seamless, secure user experiences drives market growth, especially in public-facing industries like transportation and healthcare. Moreover, public sector adoption, particularly in border control and law enforcement, underlines the technology’s indispensability.

Vendors positioned to navigate the regulatory and technological landscape are set to thrive. The future of facial biometrics hinges on balancing innovation with compliance, ensuring security without sacrificing user convenience. As we look towards this future, one thing is clear: facial biometrics will continue to be a pivotal technology in shaping our digital identities and security paradigms.

For more details on the impact of these trends, log in or sign up at Link to access the Outside-In Report: Facial Biometrics.

Related Content: Market and Buyer’s Guide for Customer Authentication · Can Age Assurance Technologies Help Build Digital Ecosystems of Trust? · Navigating Biometric Data Regulations

What is Facial Biometrics?

Facial biometrics is an advanced technology that utilizes unique facial features to verify and authenticate users. It is widely used for customer authentication, access control, and public safety purposes. With the ability to navigate through regulatory challenges and AI threats, facial biometric technology is revolutionizing security and verification processes across multiple sectors. It is indispensable in enhancing government identity schemes through national eID verification, securing borders with rapid identity checks, and aiding law enforcement in criminal identification with unmatched precision.

In addition to public safety, the private sector is rapidly adopting facial biometrics technology. It is transforming the customer onboarding process in banking and gaming and securing mobile payments. Emerging providers are capitalizing on the growing demand for passwordless security and the consumer demand for secure, convenient experiences.

However, solution providers face many challenges as they navigate the complexities of legal battles, technological vulnerabilities, and the societal shift toward widespread acceptance of physical biometrics. Despite these challenges, the future of facial biometric technology looks promising as it continues to enhance security and verification processes across multiple sectors.

The post Facial Biometrics: Trends and Outlook appeared first on Liminal.co.


Elliptic

Three individuals implicated in the $477 million FTX heist?

Update: The US Department of Justice has now confirmed that FTX was the victim of the SIM swap attack that resulted in the theft of $400 million in cryptoassets.

 

On November 11, 2022, $477 million was stolen from cryptoasset exchange FTX, just as it was collapsing into bankruptcy. Elliptic has been following these stolen assets as they have been laundered in the time since. A report from Brian Krebs now suggests that three individuals have been indicted in relation to this heist.


KuppingerCole

SAP Security Solutions — SecurityBridge

by Martin Kuppinger

SAP is the leading vendor of Line of Business (LoB) applications. Their applications covering ERP (Enterprise Resource Planning) and many other use cases form the backbone of business operations in many organizations, ranging from medium-sized businesses to the world’s largest enterprises. Given such a central role, SAP systems of different types, operated in different deployment models, need special care from security teams that understand the specifics of SAP environments and are equipped with specialized tools. SAP Security Solutions are essential for increasing the security posture in SAP environments and mitigating risks to these core LoB applications.

Commissioned by SecurityBridge

Indicio

Indicio’s new Software Developer Kit (SDK) makes creating and customizing digital wallets for any application quick and easy

The Indicio Proven® Mobile SDK unlocks the power of verifiable identity and data by giving developers an easy way to build apps with digital wallets and add the features they need to deploy decentralized identity solutions.

SEATTLE, January 31, 2024/ — Mobile app developers now have a simple and powerful way to add  digital wallet functionality to their applications. The new Indicio Proven® Mobile SDK provides packages for Swift, Kotlin, and React Native that are all compiled from a single code base, making it easy to develop apps for both iOS and Android. 

Digital wallets support receiving, holding, and sharing verifiable credentials, which enable people, organizations, and IoT devices to prove their identity and share verifiable data. 

The Indicio Proven Mobile SDK solves a major blocker to deploying decentralized and verifiable identity: the lack of a single source for all the code needed to build customized apps with interoperable digital wallets. Your developers can now bring decentralized identity to your applications without having to learn new programming languages or practices, or having to develop complicated, multi-app workflows.

At launch, the open-source protocols and standards the Indicio Proven Mobile SDK supports include DIDComm V1.0, DID:Peer:1, DID:Peer:2, and AnonCreds. Future updates will add decentralized ecosystem governance (DEGov), JSON-LD and SD-JWT credential types, and OID4VC, providing interoperability with the European Union’s Digital Identity Wallet standards. Indicio is committed to enabling interoperability with all major cross-border projects.

“Our SDK is the biggest advancement in digital identity wallet technology to date,” said Heather Dahl, CEO, Indicio. “We’ve eliminated the complexity of having to cobble together code written in different languages from multiple libraries and delivered a single source for integrated innovation with comprehensive functionality and interoperability — all based on well-tested code. Everything’s in one place for you to start building, save time, save money, and deliver the killer features your customers want in your products.”

Apps created using the Indicio Proven Mobile SDK also work with Indicio Proven®, Indicio’s complete out-of-the-box product for issuing and verifying wallet data. Indicio Proven accelerates the speed and accuracy for making vital business decisions with its simple software that is easy to implement and can be added onto existing systems and infrastructure, removing inefficient, frustrating paper-based processes and creating better digital experiences for end users and customers. 

Contact Indicio today to get the Indicio Proven® Mobile SDK, Indicio Proven or any of Indicio’s products. 

The post Indicio’s new Software Developer Kit (SDK) makes creating and customizing digital wallets for any application quick and easy appeared first on Indicio.


Ocean Protocol

New Data Challenge: Aviation Weather Forecasting Using METAR Data

METAR scores are made in every airport, every hour of the day. Can you predict the next few hours with the highest accuracy?

2024 marks the 3rd year of the Ocean Protocol Data Challenge Program initiative. ‘Aviation Weather Forecasting Using METAR Data’ is the second data challenge in 2024, and the second opportunity to score points in the Championship Leaderboard for this season. The challenge launches today Feb 1, 2024, with a deadline to participate ending Feb 20, 2024, 23:59:59 UTC. Access to the challenge description and submission guidelines can be found on the Desights platform.

The dataset used for the ‘Aviation Weather Forecasting Using METAR Data’ challenge holds METAR reports that update every hour of every day at KMIA, Miami International Airport. We will only use one airport for this data challenge, though METAR is a standard report produced at every airport. The data we use for this challenge is Miami's historical METAR logs from 2014–2023. Because METAR has a consistent, standardized structure, models developed in this challenge can be run in real time to predict METAR conditions 1–12 hours ahead of the present. This is a unique opportunity for data scientists to dive into real-world data and uncover insights that could shape the future of aviation safety, understanding, airline efficiency, and the pilots flying planes.

Challenge Overview

Objective: Building upon the insights gained from Exploratory Data Analysis (EDA), participants in this data science competition will venture into hands-on, real-world artificial intelligence (AI) & machine learning (ML). Their primary objective is to develop advanced models that accurately predict future weather conditions at KMIA (Miami Airport). These AI/ML models become invaluable tools for aviation operations and safety by harnessing the extensive historical METAR data.

Data Set: Access to the dataset of historical METAR data points is available to download from the Ocean Market via the Mumbai Test Network (Polygon Testnet), and via Polygon Mainnet. You can download the dataset directly through Desights.

Machine Learning Model: Participants have free rein in their model selection, based on their own preferences. Some ideas for Python-native ML include ARIMA/SARIMA using statsmodels, LSTM (Long Short-Term Memory) networks using TensorFlow or Keras, Random Forest regression using scikit-learn, or gradient boosting machines using XGBoost. When implementing these models, you’ll typically start by preprocessing your time series data (e.g., normalization, choosing model weights, etc.), followed by feature engineering (like extracting time components and creating lag features). After that, you can train your model, tune its parameters, and validate its performance using metrics like RMSE, MAE, or MAPE in pursuit of the most accurate METAR prediction.

It’s also good practice to perform cross-validation to assess the robustness of your model. Given the complexity and variability of METAR data, it may also be beneficial to ensemble different models to improve on the predictions explored in prior portions of the data challenge.
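
As a concrete starting point, here is a minimal end-to-end sketch of one such pipeline, using a Random Forest with lag features (the CSV path and column names, such as 'valid' and 'tmpf', are assumptions about how the export might look):

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error

    # Load hourly METAR-derived observations (hypothetical file/columns:
    # 'valid' = observation time, 'tmpf' = temperature in Fahrenheit).
    df = pd.read_csv("kmia_metar_2014_2023.csv", parse_dates=["valid"])
    df = df.set_index("valid").sort_index()

    # Feature engineering: time components plus lagged observations.
    df["hour"] = df.index.hour
    df["month"] = df.index.month
    for lag in (1, 2, 3, 6, 12):
        df[f"tmpf_lag_{lag}"] = df["tmpf"].shift(lag)

    horizon = 6                      # predict 6 hours ahead
    df["target"] = df["tmpf"].shift(-horizon)
    df = df.dropna()

    features = [c for c in df.columns if c not in ("tmpf", "target")]
    split = int(len(df) * 0.8)       # chronological split avoids look-ahead leakage
    X_train, X_test = df[features][:split], df[features][split:]
    y_train, y_test = df["target"][:split], df["target"][split:]

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"RMSE at t+{horizon}h: {rmse:.2f}")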

METAR

METAR is a format for reporting weather information. A METAR weather report is predominantly used by aircraft pilots, airlines, observation stations, and meteorologists who use aggregated METAR information to assist in current general weather forecasting.

Raw METAR is the most common format in the world for the transmission of observational weather data. It is highly standardized through the International Civil Aviation Organization (ICAO), which allows it to be understood throughout most of the world.

This is how you read a METAR

Additional Clarity: METAR/TAF LIST OF ABBREVIATIONS AND ACRONYMS
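
To make the format concrete, the following sketch decodes a few fields from a sample report with a regular expression (illustrative only; production parsing should use a dedicated METAR library, since many groups are optional):

    import re

    # Sample report: station, day/time (UTC), wind, visibility, clouds,
    # temperature/dew point, altimeter.
    raw = "KMIA 011253Z 10008KT 10SM FEW025 24/19 A3012"

    pattern = (r"^(?P<station>\w{4}) (?P<day>\d{2})(?P<time>\d{4})Z "
               r"(?P<wind_dir>\d{3})(?P<wind_kt>\d{2,3})KT .* "
               r"(?P<temp_c>M?\d{2})/(?P<dew_c>M?\d{2}) A(?P<altimeter>\d{4})")

    m = re.match(pattern, raw)
    if m:
        print(m.groupdict())
        # {'station': 'KMIA', 'day': '01', 'time': '1253', 'wind_dir': '100', ...}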

Prizes

In 2024, we’ve increased the prize pool to $10,000 USDC (or receivable in OCEAN) per challenge. This is distributed among the top 10 finishers, ensuring more participants can win. Additionally, each Data Challenge cycle offers the opportunity to score points in the Data Challenge Championship Season Leaderboard. The prize breakdown for every cycle in monetary value + leaderboard points for End-Of-Season awards is as follows:

Points scored in each Data Challenge compound to leaderboard standings for the season championship. Leaderboard updates 1 week after each DC concludes: https://oceanprotocol.com/earn/data-challenges/

How to Participate

Sign Up to Desights: Create a Web3-style profile on Desights to join this data challenge and future data challenges.

Timeline: The challenge runs from February 1, 2024, to February 20, 2024.

Submission Guidelines: Please follow the ‘Evaluation Criteria’ and ‘Report Guidelines’ sections of the challenge overview in Desights for proper submission guidelines.

For questions, comments, and community data science dialogue, reach out in our Discord #data-science-hub channel (https://discord.gg/yFRPH9PCN4) for updates and new challenges. Stay tuned for updates and discussions on our blog at blog.oceanprotocol.com throughout the year!

To see past, current, and future data challenges sponsored by Ocean, please visit https://oceanprotocol.com/earn/data-challenges.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress now on GitHub.

New Data Challenge: Aviation Weather Forecasting Using METAR Data was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Mar 06, 2024: Road to EIC: Exploring the Power of Decentralized Identity Solutions

Decentralized Identity (DID) is pivotal in the digital era, offering a secure, privacy-centric identity solution. In contrast to traditional systems, which are centralized and prone to breaches, DID is built on blockchain technology. With DID, users have control over their data, reducing the risk of identity theft. Interoperability is enhanced, enabling secure authentication across platforms without a central authority. DID aligns with privacy regulations like the EU’s GDPR, emphasizing user consent and data control in a self-sovereign environment. As digital interactions surge, decentralized identity becomes a crucial, user-centric tool for secure and seamless identity management in our interconnected, digital world. In this second edition of the “Road to EIC” series, we will discuss questions such as: Is it too early for DID's widespread implementation or is the advent of yet another disruptive technology breakthrough imminent?

Ayan Works

Decentralized Identity: A Game Changer for Digital Citizenship

In an age defined by relentless technological evolution, our lives have seamlessly integrated into the complex web of the digital world. From social interactions to financial transactions, our activities are increasingly finding their home online. Yet, this digital transformation is not without its challenges, particularly concerning identity and privacy. As our lives migrate into this digital sphere, ensuring the security and authenticity of our identities becomes crucial. This is exactly where decentralized identity emerges as a game changer for the landscape of digital citizenship.

As reported, nine Bangladeshi nationals were recently apprehended in Mumbai, India, for illegal stay and fraudulent activities, including obtaining Aadhaar and PAN cards with fake documents and illegally transferring money to Bangladesh.

This incident underscores the critical need for a robust and secure identity management system. Breaches in such systems can lead to identity theft, financial loss, and various other problems.

Enter decentralized identity, a game changer for the landscape of digital citizenship.

Revolutionizing Identity Management: A Shield Against Fraud and Illegal Activities

Traditionally, digital identities have been stored and controlled by centralized structures. This centralized approach poses significant risks, where breaches can lead to identity theft, financial loss, and an array of other problems. Decentralized identity, on the other hand, operates on the bedrock of blockchain technology. It allows individuals to reclaim control over their digital identities, free from the clutches of a central authority. Blockchain ensures security and transparency by storing data across a distributed network of computers. This decentralized approach to identity management has the transformative potential to revolutionize the way we perceive and protect our digital selves.

Decentralized identity extends a profound empowerment to individuals. In our current digital landscape, personal information is often surrendered to various platforms, with trust placed in them to safeguard it. Decentralized identity alters this paradigm fundamentally. It empowers individuals to own, manage, and authenticate their identities without relying on intermediaries. This not only grants people greater control but also mitigates risks associated with centralized databases. Your personal data retains its essence — personal.

Fortifying Digital Fortresses

In the digital age, security, privacy, and efficiency stand as non-negotiable pillars. Decentralized identity, fortified by advanced cryptographic techniques, exemplifies an impregnable fortress of security. Each user, armed with a unique digital signature, safeguards their identity without exposing sensitive data. Breaching this network proves an insurmountable challenge, thanks to blockchain’s robust security measures. Decentralized identity not only repels identity theft and fraud but also champions privacy. By enabling users to share only the minimal, essential information for a transaction, it revolutionizes the digital landscape. No longer subjected to constant surveillance, individuals navigate online spaces with newfound confidentiality. Additionally, this paradigm shift simplifies digital engagement. Picture a world where juggling dozens of passwords feels obsolete: this new way of managing identity consolidates access, giving users a single key to many different services. It is a mix of safety, privacy, and speed, pointing to a future where people easily and securely control their online lives.

A Glimpse from Bhutan: A Beacon of Digital Identity

Bhutan has embraced decentralized identity as a foundation of its digital initiatives. Through Self-Sovereign Identity (SSI), Bhutanese citizens now have a secure and private way to verify their identities online. This innovation not only enhances the efficiency of public services but also preserves the sanctity of individual privacy, aligning seamlessly with the nation’s Gross National Happiness philosophy.

The Road Ahead

While the potential of decentralized identity is vast, challenges persist. Further development and refinement are necessary to ensure seamless integration into existing systems. Moreover, legal and ethical considerations surrounding identity management need to be thoughtfully addressed. However, amid the complexities of digital citizenship, decentralized identity emerges as a beacon of hope.

In conclusion, decentralized identity is more than a mere technological innovation; it signifies a paradigm shift. It embodies a future where individuals reclaim control over their digital lives, privacy becomes a tangible reality, and the digital world follows the values of autonomy and security. As we move deeper into the digital age, embracing a decentralized identity isn’t just a choice but becomes a necessity. It is the foundation upon which a safer, more private, and more efficient digital citizenship can be meticulously built.

In the world of decentralized identity, AyanWorks is committed to delivering innovative services that empower both individuals and institutions to confidently navigate the digital landscape.

You can contact us here. You can follow us to get updates in the future.

Join the Conversation: Your Thoughts?

What are your thoughts on decentralized identity and its potential to revolutionize digital citizenship, refugee management, and the fight against illegal migration? Share your ideas in the comments below. Let’s brainstorm solutions together!


PingTalk

Online Fraud: Defined, Types, Methods, & Prevention | Ping Identity

The era of a global pandemic led to more people staying at home. That meant a substantial growth in online business, particularly in banking and retail sectors, along with food delivery, education, streaming services, pharmacy sales, telemedicine and others.

 

Identity fraud is a growing problem for organizations today, with losses due to identity theft totaling over $635 billion in 2023 and account takeover attacks up 354% year-over-year [1]. Account fraud is getting more brazen, as attempted fraud transactions reportedly increased 92% and attempted fraud amounts have jumped by 146% [2]. Fraud will continue to grow in volume and sophistication as more organizations – and individuals – choose online channels to conduct business. The world’s governments are scrambling to catch up with needed changes to cyber laws to hold those committing fraud accountable, but the best option is to prevent fraud from happening in the first place.


Holochain

Facilitating Meaningful Connections From a Dating App Founder

#HolochainChats with Joe Feminella

Between the pings, swipes, and endless scrolls of the modern world, genuine human connection often feels out of reach. For Joe Feminella, the problem of fostering meaningful interactions in the digital era led him to found the dating app First Round’s on Me.

Sparked by his own fatigue trying to find deep relationships in a system obsessed with scale over substance, Joe’s journey reveals three pivotal insights on coordinating impactful collaboration.

Joe's experience in developing a dating app offers a unique perspective on how innovative technology can be used to create more meaningful connections.

How the Design of Coordination Systems Affect Outcomes

While technology has the power to connect people globally, design choices dictate impacts on relationships. App founder Joe Feminella observes, “I think technology is just moving so quickly that we often don't know what to do with it.” Rapid digitization makes it hard to accommodate the reality of everyday life.

Consequently, Joe watched social platforms overwhelm users with information rather than facilitating meaningful connections. Engineered to maximize matches, the barrage of stimulus short-circuited genuine intimacy.

Joe suggests that “progress requires transcending entrenched, engineering-only perspectives.” Bridging system gaps in belonging allows innovation to elevate, not exploit, understanding.

This involves recalibrating metrics to empower care-centered coordination. As Joe concludes, “user-informed design interweaves logic with healthy social values.” By grasping technology’s influence, we gain a conviction for writing social code that finally facilitates the connections we want. 

Using Feedback to Build a Better App

User frustration often signals when tools hamper rather than help healthy relating. For app founder Joe Feminella, the frustration he and his dates shared about the platforms they were meeting on revealed recurrent unmet needs.

Across demographics and identities, people voiced the desire for more intentional interactions cultivating genuine intimacy. They felt starved of courtship customs signaling care, not just matches indicating cursory interest. Stark contrasts emerged between the claims of these dating apps, which promised meaningful connections, and the actual experiences of emotional depletion they provided.

Application users expressed feeling reduced to a number, valued for quantification over quality of character. As the number of dating apps grew, environments that should encourage vulnerability lacked sufficient moderation protections, leaving many feeling objectified by digitally amplified attention patterns.

Joe suggests exploitation causes anxiety, stating “onslaughts of stimulus overload sensitivities crafted over millennia.” By focusing computing on growth before grounded human needs, innovation routinely agitates users.

This taught Joe the need for cross-discipline solutions. He felt compelled to bridge the gaps hindering healthy bonds and to leverage technology for good. As he concludes, “Progress means transcending entrenched perspectives across disciplines.” Collectively authored improvements weave care back into digital social fabrics.

Bringing More Heart Into Tech to Bridge Understanding

Holochain's framework is made for peer-to-peer applications that are secure, reliable, and fast. Moreover, Holochain is made for social applications that bring people together, centering the user. This technology aligns perfectly with Joe's vision of building authentic connections. 

His experience developing a dating app that emphasizes genuine human relationships is paralleled by Holochain's goal to connect user devices directly in secure networks, giving users the autonomy of locally installed software that still benefits from the power and redundancy of cloud software. 

Joe's insight into user experience and cross-disciplinary thinking are essential. New, innovative technology still needs to be user-friendly and accessible to help enable human connection. Hope persists for redeeming technology to elevate healthy bonds, not degrade them. Joe suggests progress stems from “transcending entrenched perspectives” — blending wisdom across domains to braid ethical values back into digital spaces.

A More Connected Future

The tools either dividing or uniting us remain unwritten, awaiting our choice. While scripts concentrating on control and surveillance surround us, we retain the power to edit the tale toward care.

Now more than ever, we require rewriting the social code — infusing creativity with conscience, and computation with compassion. Though challenges persist in balancing scalability with situated nuance, a more beautiful technology beckons just beyond reach when we link hands across domains.

Wednesday, 31. January 2024

1Kosmos BlockID

Mitigating Midnight Blizzard’s Password Spraying Cyber Attack: Insights and Solutions by 1Kosmos

Less than a month into 2024, password spraying has been named as the origin of our first eye-opening cyber-attack of the year. In a blog post, Microsoft identified Midnight Blizzard, the Russian state-sponsored actor also known as NOBELIUM, as responsible for the attack on its corporate systems.

This is another entry in the ever-evolving landscape of cybersecurity threats. As organizations continue to grapple with the reality of sophisticated cyber-attacks, it becomes imperative to look for additional safeguards to bolster defenses when a breach stems from a wider security vulnerability.

Considering the recent guidance provided by Microsoft on the “Midnight Blizzard” nation-state attack, we at 1Kosmos delve into the nuances of this threat and offer insights into how organizations can navigate through the latest attack vector.

Understanding the Midnight Blizzard Cyber-Attack

What was the Midnight Blizzard cyber-attack? As covered by Microsoft Threat Intelligence, it represents a formidable challenge for organizations worldwide. Characterized by the group’s stealthy infiltration and persistent nature, this sophisticated attack exploited network vulnerabilities and breached Microsoft’s Outlook by using a test account to authorize a custom-built malicious application.

The group utilized a password spraying attack that successfully compromised a legacy, non-production test tenant account that did not have multifactor authentication (MFA) enabled. The account inadvertently gave hackers access to the inboxes of various executives, including those in cybersecurity and legal functions.

This allowed them to steal copies of their emails and attachments through a legacy OAuth application that had elevated access to the Microsoft corporate environment. The group had access for approximately six weeks before they were discovered.

The vulnerability was not limited to Microsoft’s own environment. As part of their forensic analysis, Microsoft determined that the same group of attackers used identical tactics to target the inboxes of an unspecified number of Microsoft’s customers.

1Kosmos Perspective: Building Security Resilience

The nature of this attack illustrates the need to eliminate passwords, strengthen multi-factor authentication, prevent lateral movement, and simplify the IT (Information Technology) stack.

How 1Kosmos can help:

Prevent Password Spraying Attacks: 1Kosmos can eliminate passwords wherever possible, and where not possible, enforce regular password reset intervals with a password reset workflow that proves the user’s identity to ensure the reset request’s validity.

Zero Trust Methodology: 1Kosmos authentication methods exceed Zero Trust guidelines and mitigate the risk posed by attackers, by verifying every user and device attempting to access resources. The result can limit the lateral movement of attackers within their networks, thwarting sophisticated infiltration attempts.

Phish-Proof MFA: 1Kosmos identity-based authentication protocols deliver strong authentication while significantly improving the user experience. 1Kosmos LiveID is a phishing-proof MFA that proves identity at every access request.

Risk-Based Authentication: One-size-fits-all authentication solutions prove ineffective, underscoring the imperative for risk-based authentication to become the standard rather than the exception. 1Kosmos can use contextual factors like device details and location to assess the risk level and apply the right level of authentication (a simple illustration of this idea follows this list).

Audit and Reporting: 1Kosmos is built on a private and permissioned blockchain. This private, permissioned ledger retains a detailed, immutable audit trail of all events, enabling visibility to all logins, access attempts, information updates, and shared information related to the digital identity. These logs can be shared with SOC teams and other platforms to ensure quick digital forensic analysis.

Collaborative Defenses: Collaboration and integration are key to a successful security infrastructure. 1Kosmos has built out-of-the-box integrations with industry peers (including Microsoft) to enhance the resilience of organizations against sophisticated attacks. Relying on a duct-tape infrastructure and the gaps it injects is exactly what these hackers are looking to exploit. A tightly coupled integration reduces gaps and improves the overall security posture.

Industry Certifications: 1Kosmos has taken steps to ensure the safety and security of our platform. For instance, 1Kosmos is certified to many standards, including FIDO2, NIST 800-63-3, UK DIATF, and iBeta DEA EPCS. The combination of these standards prevents identity impersonation, account takeover, and fraud while delivering frictionless user experiences that preserve user privacy.
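
As a rough illustration of the risk-based authentication idea referenced above (a generic sketch, not 1Kosmos code; every factor, weight, and threshold here is an invented assumption):

    # Generic risk-scoring sketch: contextual signals raise a score, and
    # the score decides how much authentication to demand.
    def risk_score(ctx: dict) -> int:
        score = 0
        if ctx.get("new_device"):
            score += 40
        if ctx.get("country") != ctx.get("usual_country"):
            score += 30
        if ctx.get("failed_attempts", 0) >= 3:    # possible spraying attempt
            score += 30
        return score

    def required_auth(score: int) -> str:
        if score >= 70:
            return "deny / step-up with live identity verification"
        if score >= 40:
            return "require phishing-resistant MFA"
        return "allow with standard passwordless login"

    ctx = {"new_device": True, "country": "US", "usual_country": "US",
           "failed_attempts": 0}
    print(required_auth(risk_score(ctx)))   # -> require phishing-resistant MFA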

1Kosmos Prevents Identity Based Attacks

As the threat landscape continues to evolve, we must remain focused and adaptive in our cybersecurity strategies. The Midnight Blizzard attack serves as a stark reminder of the persistent threat posed by nation-state adversaries. By embracing a proactive, collaborative, and identity-centric approach to security, organizations can bolster their resilience and effectively navigate through the digital storm unleashed by sophisticated attackers.

At 1Kosmos, we empower remote identity verification and passwordless multi-factor authentication, facilitating secure transactions with digital services for employees, customers, and residents alike. Through the integration of identity proofing, credential verification, and strong authentication, 1Kosmos effectively combats identity-based attacks, empowering organizations with the tools, insights, and expertise needed to combat threats such as this one and safeguard their digital assets.

The post Mitigating Midnight Blizzard’s Password Spraying Cyber Attack: Insights and Solutions by 1Kosmos appeared first on 1Kosmos.


auth0

What to Expect When Your Auth0 Startup Plan Expires?

After your startup plan expires, here’s what to keep in mind and what to do next.

Shyft Network

EBA’s Amended Money Laundering & Terrorist Financing Guidelines Explained

On March 1st, 2021, the European Banking Authority published final revised guidelines on money laundering and terrorist financing risk factors, offering directions on how it intends to lead, coordinate, and monitor the fight against money laundering and terrorist financing.

Nearly three years later, on January 16th, 2024, the EBA published guidelines amending the earlier ones.

So, what do these amendments imply? How do they impact CASPs, the crypto asset service providers? What do the CASP/VASPs need to do to comply with these amendments? These are the aspects we will discuss in the segments below.

How Do the EBA’s Final Guidelines Impact CASPs?

The amendments outline what CASPs must do to handle potential money laundering and terrorist financing risks in their overall business and each transaction.

The Assumption That Drives EBA’s Guideline Amendments for CASPs/VASPs

The new guidelines recognize that Crypto Asset Service Providers operate differently from traditional banks. They point out that CASPs often deal with transfers to self-managed wallets and various decentralized platforms, which aren’t uniformly regulated worldwide. This might make them more vulnerable to money laundering and terrorist financing. The guidelines also note that certain features in crypto assets, which can keep transactions anonymous, could increase these risks.

What Could CASPs Do to Reduce Risk?

The EBA is advising Crypto Asset Service Providers to increase vigilance against the risks of money laundering and terrorist financing, especially due to the ease of anonymous international transfers.

Their advice includes practical steps like limiting product features to control where money can be sent. This means allowing transfers only to certain approved parties, such as crypto-asset accounts or bank accounts in the customer’s name, which are already following strict anti-money laundering and counter-terrorist financing regulations.

Additionally, the EBA suggests CASPs could use more controlled payment systems, like those used for small payments or transactions with the government. Another idea is to offer services specifically to certain groups, like a company’s employees, to better manage risk.

The EBA also notes that CASPs can lower their risk by ensuring customers meet regulatory standards and have a clean history of crypto transactions. Transactions that involve converting to or from official currency, especially through bank accounts in low-risk areas, are seen as safer. The same goes for low-value transactions for goods and services, as long as there’s no negative information about the involved crypto accounts.

CASPs are also encouraged to think about the risks linked to how and where they offer their services.

While these suggestions can help minimize risks, the EBA emphasizes the need for CASPs to actively implement measures that make their operations more secure.

The Measures that the EBA’s Amending Guidelines Recommend for CASPs

The EBA wants CASPs to have suitable and effective monitoring tools, including transaction monitoring tools and advanced analytics tools. The CASPs must also train relevant employees so that they have a thorough understanding of crypto assets and the ML/TF risks to which these assets may expose the provider.

The CASPs Must Ensure Enhanced Customer Due Diligence

According to the European Banking Authority’s published final revised guidelines,

CASPs must verify the identities of customers and beneficial owners using multiple trusted and independent sources. They also need to identify and verify majority shareholders who are not yet compliant.

EBA underlines that to understand customer relationships better, gathering more information about the customer and the business’s nature and purpose is essential, including tracing the origins of their funds and wealth.

CASPs are also advised to increase the frequency of monitoring crypto-asset transactions. And upon any triggering event, they need to rigorously review, update, and document relevant customer information. Conducting business relationship reviews more regularly is part of this protocol.

According to EBA, CASPs should use investigation tools more extensively for a deeper investigation into crypto assets. This includes examining all distributed ledger addresses a customer might use, particularly if they have several. They also need to monitor customer IP addresses more frequently.

Understanding a customer’s knowledge about crypto assets is another focus area of the amended guidelines. CASPs should take additional steps when withdrawal or redemption patterns do not align with the customer’s usual profile and determine whether the customer or a third party initiates these transactions.

Lastly, CASPs have the responsibility to verify that a customer truly controls and owns their self-hosted wallet address. To aid in this process, CASPs can now utilize Veriscope’s User Signing feature. It allows them to directly request cryptographic proof from users’ self-hosted wallets.
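
The underlying mechanism is a standard challenge-response signature proof. A minimal sketch for an EVM-style wallet follows (generic, not Veriscope's actual API; the challenge text and the demo key are invented):

    from eth_account import Account
    from eth_account.messages import encode_defunct
    import secrets

    # CASP side: issue a one-time challenge tied to this verification.
    challenge = f"Prove wallet ownership, nonce={secrets.token_hex(16)}"

    # User side (normally performed inside the self-hosted wallet).
    demo_key = "0x" + "11" * 32      # demo key only; never hard-code real keys
    signature = Account.sign_message(encode_defunct(text=challenge),
                                     private_key=demo_key).signature

    # CASP side: recover the signer and compare to the claimed address.
    claimed = Account.from_key(demo_key).address
    recovered = Account.recover_message(encode_defunct(text=challenge),
                                        signature=signature)
    print("ownership proven:", recovered == claimed)   # -> True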

Analytics and Record-Keeping Requirements for CASPs

The EBA guidelines recommend applying advanced analytics tools to transactions on a risk-sensitive basis, as a supplement to standard monitoring tools. This is particularly important for transactions involving self-hosted wallets.

These advanced tools are designed to help CASPs trace a wallet’s transaction history and check for any possible connections to criminal activities or suspicious persons and entities.
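
To illustrate the idea, tracing can be thought of as a graph walk from a customer's wallet toward known-bad addresses (a toy sketch; real analytics products rely on much richer heuristics, clustering, and curated watchlists, and the addresses and edges below are invented):

    from collections import deque

    transfers = {                      # address -> addresses it sent funds to
        "wallet_customer": ["exchange_a", "wallet_x"],
        "wallet_x": ["mixer_1"],
        "exchange_a": [],
    }
    flagged = {"mixer_1", "sanctioned_entity"}

    def trace(start: str, max_hops: int = 3) -> set:
        # Breadth-first search up to max_hops, collecting flagged addresses.
        hits, seen = set(), {start}
        queue = deque([(start, 0)])
        while queue:
            addr, hops = queue.popleft()
            if hops == max_hops:
                continue
            for nxt in transfers.get(addr, []):
                if nxt in flagged:
                    hits.add(nxt)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, hops + 1))
        return hits

    print(trace("wallet_customer"))    # -> {'mixer_1'}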

Additionally, the guidelines highlight the need for CASPs to go beyond relying solely on distributed ledgers for record-keeping. They should have procedures to link each distributed ledger address with its corresponding private key, controlled by an individual or a legal entity.

Overall, these updated guidelines from the EBA are directing CASPs to adopt a more thorough approach to tackling money laundering and the use of crypto in terrorism financing.

This will likely mean that CASPs have to allocate more resources, including manpower and training, to comply with these guidelines and enhance their monitoring capabilities.

Click here to read the latest global crypto regulatory outlook 2024 report.

Key Takeaways

– The European Banking Authority (EBA) published amended guidelines for Crypto Asset Service Providers (CASPs) to address money laundering and terrorist financing risks.
– The guidelines recognize that CASPs operate differently than traditional banks and may be more vulnerable to risks due to features like anonymity.
– The EBA advises CASPs to take practical steps like limiting high-risk product features and transactions, verifying customer identities rigorously, and monitoring transactions more closely.
– CASPs need enhanced customer due diligence, including gathering more customer information, investigating crypto addresses, and understanding customer knowledge and behavior.
– The guidelines recommend CASPs use advanced analytics tools to supplement standard monitoring and maintain comprehensive records linking crypto addresses to individuals/entities.

______________________________________

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

EBA’s Amended Money Laundering & Terrorist Financing Guidelines Explained was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


KYC Chain

Regulation Focus Series | Article 11: Germany and BaFin

As Europe's largest economy and a major global financial powerhouse, Germany is unsurprisingly a major target for illicit financial activity. In this installment of our Regulatory Focus Series, we take a look at Germany's key AML regulator BaFin and the KYC compliance rules it puts in place to curb money laundering and other financial crimes in the country.

Tuesday, 30. January 2024

KuppingerCole

The Impact of Expanding Attack Surfaces on Enterprise Cybersecurity and Why You Need a Strong IAM Posture

Join us for a compelling webinar on "Mastering Identity Security in the Age of Evolving Attack Surfaces." This session will address how to minimize human error, eliminate gaps, and overlaps within your different IAM tools, and align your security strategy with the constantly evolving challenges of today and tomorrow's cyber threats.

Larry Chinski, Vice President of Global IAM Strategy at One Identity, and Martin Kuppinger, Principal Analyst at KuppingerCole, will guide you on aligning your security strategy with the ever-changing challenges posed by today's and tomorrow's cyber threats. Don't miss this opportunity to enhance your IAM posture and safeguard your enterprise from evolving risks.




Indicio

You already know who you do business with: time to leverage that trust

The post You already know who you do business with: time to leverage that trust appeared first on Indicio.
The foundations of trust within your ecosystem already exist. Leverage existing relationships to fast-track the benefits of verifiable credentials: digital identity is something you can act on now. To stay updated on all things governance, sign up for the Indicio Governance Newsletter.

By Sam Curren

Moving to digital trust and verifiable credentials is the goal of many organizations today. Doing so carries substantial benefits, including fraud reduction, increased data sharing with reduced integration costs, and even privacy law compliance.

Large-scale efforts by governments and organizations like GLEIF to create organizational identity are often championed as the foundations for this new digital world, and many are under the impression that we must wait for them before acting.

In reality, the relationships you already have are the key to building a digital ecosystem. Ecosystems need to identify the parties involved in the issuance and verification of credentials — who to trust for what. By contrast, large global identity projects will actually provide very little value to established ecosystems where the parties have established relationships with each other. Moreover, as ecosystems turn digital, newly created digital identifiers from ecosystem participants can be passed directly between participants and the ecosystem governance authority using existing communication channels and relationships; there is no need for that to be orchestrated by another party.

Here are a few examples:

Employees within a company — companies already know their employees and have existing HR processes for interacting and coordinating with employees. Employees know which company they work for. Issuing those employees verifiable employment credentials requires no outside authority.

Trade associations — trade associations already know their member companies and have been working with them for a long time, in some cases, years or decades. Verifiable credentials for member companies, including certifications, proof of membership, and other benefit qualifications are all based on existing relationships.

Business partner networks — insurance companies are a great example of how to build on established trust. They know the providers in their network, and they know the insured individuals. They can provide both directories and introductions between newly connecting parties, all leveraging these existing relationships. Patient VCs can ease onboarding at a new medical provider, while providers can provide qualification credentials to patients. Data entry errors are minimized, efficiency is gained, etc.
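To make the employee example tangible, here is a minimal, unsigned W3C-style Verifiable Credential an employer might issue. All DIDs and attribute values are hypothetical, and a production credential would carry a cryptographic proof and be delivered over an agent connection.

    # Minimal, unsigned W3C-style employment credential (illustrative only;
    # all identifiers and values are hypothetical).
    import json
    from datetime import datetime, timezone

    employment_vc = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "EmploymentCredential"],
        "issuer": "did:example:acme-hr",        # the employer's DID
        "issuanceDate": datetime.now(timezone.utc).isoformat(),
        "credentialSubject": {
            "id": "did:example:employee-1234",  # the employee's DID
            "role": "Field Technician",
            "employer": "Acme Corp",
        },
        # A real credential carries a cryptographic proof here.
    }
    print(json.dumps(employment_vc, indent=2))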

None of these examples need anybody’s permission to start innovating with VCs and gaining immediate benefits.

So, what do these large government efforts and global identity programs provide? We’ve made it clear that existing relationships don’t need external parties. What these large identity systems do provide is a trusted foundation for new relationships: they simplify the process of identifying new ecosystem partners and participants, which can otherwise be a major source of fraud. By streamlining the correct identification of individuals and organizations, ecosystems can grow rapidly and more efficiently.

So what’s the holdup on these large programs? It should be no surprise that policy creation, technology development, and often regulation take time to get right.

As these programs mature and deliver the right technologies, they will assist in the growth of already established identity and trust ecosystems. They’ll improve upon existing methods of vetting new partners and participants, and that’ll be great.

In the meantime, the technologies needed to enhance trust and create digital ecosystems from your existing relationships already exist. We at Indicio help our customers do just that, and we’d like to help you as well.

Questions? Contact us to learn more about Indicio Proven®, our complete verifiable credential solution, or, to stay updated on all things governance, sign up for the Indicio Governance Newsletter.

The post You already know who you do business with: time to leverage that trust appeared first on Indicio.


Microsoft Entra (Azure AD) Blog

Introducing More Granular Certificate-Based Authentication Configuration in Conditional Access


I’m thrilled to announce the public preview of advanced certificate-based authentication (CBA) options in Conditional Access, which provides the ability to allow access to specific resources based on the certificate Issuer or Policy Object Identifiers (OIDs) properties. 

 

Our customers, particularly those in highly regulated industries and government, have expressed the need for more flexibility in their CBA configurations. Using the same certificate for all Entra ID federated applications is not always sufficient. Some resources may require access with a certificate issued by specific issuers, while others require access based on specific policy OIDs. 

 

For instance, a company like Contoso may issue three different types of multifactor certificates via Smart Cards to employees, each distinguished by properties such as Policy OID or issuer. These certificates may correspond to different levels of security clearance, such as Confidential, Secret, or Top Secret. Contoso needs to ensure that only users with the appropriate multifactor certificate can access data of the corresponding classification. 

 

Figure 1: Authentication strength - advanced CBA options

 

With the authentication strength capability in Conditional Access, customers can now create a custom authentication strength policy, with advanced CBA options to allow access based on certificate issuer or policy OIDs. For external users whose multifactor authentication (MFA) is trusted from partners' Entra ID tenant, access can also be restricted based on these properties. 
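For readers who automate tenant configuration, the sketch below shows roughly what creating such a restriction could look like through Microsoft Graph. The endpoint and property names reflect the preview API as we understand it at the time of writing and should be verified against current Microsoft documentation; all values are placeholders.

    # Sketch: add an x509 certificate combination configuration to an existing
    # authentication strength policy via Microsoft Graph (beta, preview API).
    # Verify endpoint and property names against current documentation.
    import requests

    TOKEN = "<access token with Policy.ReadWrite.ConditionalAccess>"
    POLICY_ID = "<authentication strength policy id>"
    url = (
        "https://graph.microsoft.com/beta/policies/"
        f"authenticationStrengthPolicies/{POLICY_ID}/combinationConfigurations"
    )
    payload = {
        "@odata.type": "#microsoft.graph.x509CertificateCombinationConfiguration",
        "appliesToCombinations": ["x509CertificateMultiFactor"],
        # Placeholder issuer subject key identifier and policy OID:
        "allowedIssuerSkis": ["9A4248C6AC8C2931AB2A86D1B0741347D0B70C8F"],
        "allowedPolicyOIDs": ["1.2.3.4.5"],
    }
    resp = requests.post(url, json=payload,
                         headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
    resp.raise_for_status()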

 

This adds flexibility to CBA, in addition to the recent updates we shared in December. We remain committed to enhancing phishing-resistant authentication to all our customers and helping US Gov customers meet Executive Order 14028 on Improving the Nation's Cybersecurity. 

 

To learn more about this new capability, see the authentication strength advanced options documentation.

 

Thanks, and let us know what you think! 

 

Alex Weinert

 

 

Learn more about Microsoft Entra: 

See recent Microsoft Entra blogs
Dive into Microsoft Entra technical documentation
Learn more at Azure Active Directory (Azure AD) rename to Microsoft Entra ID
Join the conversation on the Microsoft Entra discussion space and Twitter
Learn more about Microsoft Security

Civic

Optimism Builders: Welcome!


Guess what? We’re thrilled to let you know we’ve joined the Optimism community!  If you’re tired of dealing with dApp security challenges, Civic Pass may help solve some of your issues.  Whether you’re battling botting during live events, NFT mints, or airdrops, or grappling with fraud and spam, securing your blockchain community can feel like […]

The post Optimism Builders: Welcome! appeared first on Civic Technologies, Inc..


Innopay

INNOPAY at UP 2 Consultancy Case Day l UniPartners Eindhoven


Hello students!

Exciting news from INNOPAY! We're thrilled to announce our participation in UP 2 Consultancy Case Day on February 7, 2024, hosted by UniPartners Eindhoven. We can't wait to meet all of you and delve into discussions about our work.

This event is a fantastic opportunity for you to explore your career options post-master's degree, and we're honored to be a part of it. Join us for two engaging case presentations where you can witness how we apply our expertise to real-world scenarios. Stick around afterward for some relaxed networking over drinks.

We're looking forward to connecting with you and sharing more about what makes INNOPAY stand out. Don't miss out on this chance to meet us and gain insights into our organisation. 

See you on February 7, 2024, at UP 2 Consultancy Case Day!

For more information and registration, go to UniPartners website.


IDnow

IDnow and netID cooperate: Identity wallet for up to 40 million web users

IDnow to become an active part of the European netID Foundation’s ecosystem

Munich, January 30, 2024 – IDnow, a leading identity verification platform provider in Europe, announces its cooperation with the European netID Foundation. The goal: The introduction of an identity wallet for netID users in Germany as an alternative to the offerings of US platforms.

Developed by IDnow, the netID Wallet app will be available for users to download later this year and will allow them to reuse and share stored digital identities.

Self-determined and even simpler reuse of identities

netID is an open standard that enables users to access online services of netID partners (among others Süddeutsche Zeitung, Mediengruppe RTL Deutschland, ProSiebenSat.1 and United Internet with the WEB.DE and GMX brands) via the same login details, all in compliance with data protection regulations. The introduction of the identity wallet will make storing and retrieving identities even easier for around 40 million potential users.

Bundled in one digital location, users can access and verify proof of identity such as ID cards, driving licenses, social security data and much more on their smartphone at any time – making everyday processes more efficient. netID partners, in turn, can significantly simplify their processes for verifying customer data, for example for invoice purchases, and thus make them more cost-effective.

Wallet can be used by the partners of the netID ecosystem

The initial verification of the user’s identity to create the wallet is carried out using the proven IDnow solutions. Once the identity has been confirmed and the digital wallet has been successfully created, the user decides with whom they want to share what data. Anyone who uses the wallet once to register with a mobile phone provider, for example, can use it again at any time with one of the partners in the netID ecosystem for verified identification.

The wallet is another step toward the future: the European Commission is about to publish a new version of the eIDAS regulation (electronic IDentification, Authentication and trust Services), which is expected to come into force in 2024. All EU citizens should have a digital identity by 2030. The launch of the netID Wallet supports this goal.

“netID has been simplifying digital life in Germany since it was founded. Transparency and data protection have always been at the heart of what we do. We see the netID Wallet as the logical next step in the further development of the netID offering. Thanks to the cooperation with IDnow, we can offer our users secure and user-friendly identity verification that will even be able to serve regulated use cases in compliance with eIDAS in the future,” explains Uli Hegge, CEO of the European netID Foundation.

“We are delighted to have gained a trustworthy partner for our identity wallet in Germany in the form of the European netID Foundation and to be an active part of the netID ecosystem in the future,” says Uwe Stelzig, Managing Director DACH at IDnow.

“IDnow and netID pursue similar goals in terms of data security and user-friendliness, which we are now also pursuing together in identity verification. With the introduction of the wallet, users no longer have to identify themselves again and again, but can save their verified identities and simply reuse them the next time. In this way, netID partners can offer their users even faster and more secure access to their services,” continues Stelzig.


KuppingerCole

Digital ID, Web3, and Metaverses


by Martin Kuppinger

A while ago, I was asked how to deal with cybersecurity for Web3 and the metaverses. There is not a simple answer to this. I talked about this last year at KuppingerCole Analyst’s cyberevolution 2023, our cybersecurity conference. It’ll be a hot topic again at the upcoming EIC, KuppingerCole’s European Identity and Cloud Conference, from June the 4th to the 7th in Berlin.

Going back to the question: The easiest way to solve complex problems is by deconstructing them into smaller pieces, solving these, and putting everything together again.

In the graphic, foundational elements of Web3 converge with metaverse-related technologies. It’s tough to try to solve security holistically, of course. But there is NFT Security as a smaller, yet complex challenge. There is DeFi (Decentralized Finance) Security — some of which we could deconstruct even further.

But, more importantly, there is a common universal element: Decentralized Identity. This is the glue. For solving security in a decentralized world, we need decentralized identity. Then we can solve the various challenges and construct a unified security approach.

Decentralized Identity, which is part of the broader theme of Digital IDs (aka Digital Identity), is one of the main themes of EIC 2024. Digital IDs are not just elements of Identity and Access Management (IAM); they are something much bigger. They relate to IAM, but in cross-organizational use (decentralized, reusable), and supply approaches that enable and leverage business models.

When we think about an avatar (or agent, or whatever the industry term du jour is) in a metaverse, this is a sort of “digital double” of us. The term “digital twin,” unfortunately, is already used in another context (namely, manufacturing and simulation).

This digital double has an identity which relates to our digital identity. It acts on our behalf. This imposes challenges such as control, liability, and other concerns. Digital identity is the core of the solution, because we can define that identity, relate it to our identity and others, and keep control. Without this, we will fail in our journey towards new models, especially everything including autonomous or semi-autonomous (that is how I would name an avatar in a metaverse) systems.

While this field is still evolving, one thing is clear: a central foundational pillar of Web3 and the Metaverse (and every modern digital service and business) is Digital IDs, particularly decentralized identity. Also, don’t miss the recent blog post by my colleague Alejandro Leal on eIDAS and the EUDI wallet, another key topic at EIC 2024. There is also a lot of research on this topic available from KuppingerCole Analysts, such as the Advisory Note How Enterprises Will Learn to Love Decentralized IDs: The Roles of Distributed and Sovereign Identities in Our Private Metaverse written by Mike Neuenschwander.

Look at our research, talk to the analysts of KuppingerCole, and don’t miss attending EIC 2024. See you in Berlin in June!


TBD

Shifting Loan Power to Borrowers

How FormFree uses Web5 to shift loan power to borrowers

"Our current credit system says, 'Saddle yourself, so we can give you more to saddle.'" Eric Lapin, President of FormFree, stated during a recent TBD live stream. His comment sheds light on a fundamental paradox of the U.S. credit system: it often requires you to accumulate debt as a prerequisite for obtaining loans, leading to more debt. This systemic issue is underscored by recent data from the Federal Reserve Bank of New York, which reveals that 1 in 5 applicants for mortgages, car loans, or other types of loans were rejected — the highest rate in five years. This trend disproportionately impacts Black and Latino applicants. Guided by their slogan, 'Democratizing lending for all people without bias,' FormFree developed a platform built with Web5 and AI to address these issues.

Residual Income Knowledge Index (RIKI)

The Fair Isaac Corporation score, better known as a FICO score, is a three-digit score that helps lenders in the U.S. evaluate the likelihood of a borrower repaying a loan. A higher score indicates greater reliability. However, achieving a high FICO score — or any credit score — largely depends on accruing and gradually repaying debt. This may come as a disadvantage for various groups of people who may not have a credit score, including veterans, formerly incarcerated individuals, immigrants, young adults, and folks who prefer cash transactions.

To provide a more holistic view of a person's creditworthiness, FormFree developed a new system called Residual Income Knowledge Index™ (RIKI™). Instead of relying on debt history, RIKI analyzes monthly income and spending. Integrating this method into our existing credit system provides greater opportunities for home ownership, car purchases, and other essential loans.
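RIKI's actual model is proprietary; purely as a toy illustration of residual-income analysis, here is a sketch that scores a borrower by the share of income left unspent each month (all numbers invented).

    # Toy residual-income score: average fraction of income left unspent.
    # Not FormFree's actual RIKI model, which is proprietary.
    def residual_income_ratio(monthly_income, monthly_spending):
        residuals = [i - s for i, s in zip(monthly_income, monthly_spending)]
        total_income = sum(monthly_income)
        return sum(residuals) / total_income if total_income else 0.0

    # A borrower who reliably keeps about 20% of income unspent:
    print(residual_income_ratio([4000, 4100, 3900], [3200, 3300, 3100]))  # 0.2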

"It's like Tinder for banks!"

During the live stream, FormFree Software Engineer Jon West showcased their Passport product. The demo highlighted Passport's ability to match borrowers with lenders. This feature inspired an audience member to comment, “It's like Tinder for banks!”

Here's how it works:

1. Borrowers securely connect to their bank account.
2. FormFree analyzes the account records and creates an anonymized credit profile for the borrower.
3. Lenders review these anonymized profiles and decide which borrowers to extend offers to.
4. Borrowers review offers from lenders and choose their preferred option.
5. All necessary borrower data is shared with the lender once a match is made.
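As a rough sketch of that flow (all types and fields invented for illustration, not FormFree's actual API), the key point is that lenders only ever see an anonymized profile until the borrower accepts an offer:

    # Illustrative only: lenders see an anonymized profile; identifying data
    # is released only after the borrower accepts an offer.
    from dataclasses import dataclass

    @dataclass
    class AnonymizedProfile:
        profile_id: str
        residual_income_ratio: float  # derived from bank data, no PII

    @dataclass
    class Offer:
        lender: str
        profile_id: str
        apr: float

    def accept(offer: Offer, full_borrower_record: dict) -> dict:
        # Only on a match does the lender receive the borrower's data.
        return {"lender": offer.lender, "borrower": full_borrower_record}

    offer = Offer("First Example Bank", "p-42", apr=6.1)
    print(accept(offer, {"name": "A. Borrower", "profile_id": "p-42"}))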

Typically, borrowers endure aggressively pitched products from dozens of lending institutions. FormFree shifts control to users for more tailored loan consideration.

How FormFree Uses Web5

FormFree uses the following Web5 technologies:

Decentralized Identifiers (DIDs) - After sign-up, the application associates a user's account with a DID.
Verifiable Credentials (VCs) - A borrower's credit profile is secured as a Verifiable Credential, enabling the borrower's information to stay anonymous and tamper-proof.

Coming Soon - tbLend and Credential Selector

As part of our incubation program, organizations such as FormFree, Sophtron, and MX are working in partnership to build Web5 protocols that will help their products evolve.

tbLend - A Web5 protocol that connects the lender and borrower to exchange money securely.
Credential Selector - A Web5 protocol that enables developers to embed Verifiable Credentials (VCs) into their applications and allows users to select and retrieve VCs.

Rewatch the full episode

Learn more about what FormFree is building and watch the full live stream here.

This post highlights key moments from our collaborative live stream with leaders and builders at FormFree. Join our weekly live streams every Friday at 12pm ET/9 am PT. We're learning how innovators and community members are advancing the Self-Sovereign Identity and Global Payments industry. Tune in next time via Twitch or in our community Discord.

Monday, 29. January 2024

Entrust

Public Trust Certificates: A 2023 Recap and Projections for 2024

Looking Back at 2023

2023 was a year of change for Public Trust Certificates – discussions on reduced validity periods, approval of new requirements, and significant progress in access to Verified Mark Certificates. Let’s review what happened in 2023 along with a few predictions for what’s to come in 2024.

TLS Certificates

The Google Chrome team was quite active in the TLS ecosystem. They began promoting modernizing PKI infrastructures, including reducing the TLS certificate validity period to 90 days. Entrust provided our response to the 90-day proposal. The Chrome team also provided the market with information to help secure TLS PKIs through automation for a safer, more reliable internet.

The CA/Browser Forum (CABF) published ballots which updated the TLS Baseline Requirements (BRs):

SC61 – New Certificate Revocation List (CRL) entries must have a revocation reason code
SC62 – Updates the requirements and restrictions of the TLS certificate profiles
SC63 – Makes OCSP optional, requires CRLs, incentivizes automation, and finally approves short-lived certificates, which do not require certificate status via CRL or OCSP

The IETF has published RFC 9345 to support delegated credentials for TLS and DTLS. The delegated credential mechanism allows a server operator to use the private key from their server to issue a delegated credential. The new credential will have a different private key and a validity period of no greater than seven days. Since the delegated credential is only valid for a short period of time, there is no status protocol required to be checked, that is, no CRL or OCSP response.
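To illustrate the seven-day window, the sketch below issues a self-signed certificate with the pyca/cryptography package and checks that its validity period qualifies as short-lived in the sense used here (the *_utc accessors require cryptography 42+). It is a toy, not how a CA or server operator would actually mint credentials.

    # Issue a 7-day self-signed certificate and confirm it is "short-lived".
    from datetime import datetime, timedelta, timezone
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    key = ec.generate_private_key(ec.SECP256R1())
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.test")])
    now = datetime.now(timezone.utc)
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + timedelta(days=7))
        .sign(key, hashes.SHA256())
    )
    validity = cert.not_valid_after_utc - cert.not_valid_before_utc
    print("short-lived:", validity <= timedelta(days=7))  # True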

S/MIME Certificates

The CABF approved the S/MIME Baseline Requirements in November 2022, which provide an S/MIME standard for the public trust CAs. The S/MIME BRs have been effective since September 1, 2023. ETSI also released standard ETSI TS 119 411-6 to comply with the EU regulation on electronic signatures in email messages and to support the S/MIME BRs. This allows EU CAs meeting the S/MIME BRs to be audited using the ETSI audit criteria. The S/MIME BRs are also required by the Apple and Mozilla root program policies.

DigiCert provided an S/MIME certificate factory with samples of certificates which meet the S/MIME BRs. They followed up with the pkilint tool, which can be used for S/MIME certificate linting to find errors when a certificate does not meet the S/MIME BRs. Entrust fully supports digital certificate linting and its deployment.

Code Signing Certificates

The CABF began requiring code signing certificate keys to be generated and managed in a crypto device effective as of June 1, 2023. The goal is to reduce key compromise by removing the option to generate key pairs in software. The CAs must also verify that the keys were generated in a crypto device, which would include server HSMs, USB tokens, and cloud HSMs.

Ballot CSC68 was issued to update the code signing certificate revocation requirements. The purpose was to tighten up the procedure when a key is compromised, or when a subscriber signs suspect code. If the CA finds out that either of these occurred, then the certificate must be revoked within 24 hours. At a later time, the CA may backdate the revocation to before the key was compromised or the suspect code was signed, which will cause signatures made after the revocation date to fail.

Verified Mark Certificates

Government organizations can now get Verified Mark Certificates (VMCs) for their outgoing emails. This is equivalent to VMCs issued to companies which have registered marks. Google also strengthened VMCs by adding a blue checkmark which proactively indicates the sender of the email has verified that it owns the domain and the logo displayed in the avatar slot.

VMCR 1.5 was released to create Common Mark Certificates. This will allow a certificate applicant to submit either a modified registered mark or a prior use mark to be added to the certificate. The modified registered mark option will allow an applicant with a registered mark to make a change such as removing or moving words or removing a portion of the mark. This may allow the applicant to modify the mark to fit in place with different dimensions or perhaps provide a seasonal change to the mark. For a prior use mark, the applicant must have historically displayed the mark for at least 12 months, and the mark must be found in an archived webpage source. The Common Mark Certificate beta program started in December 2023. Note: we are not sure if both Google and Apple will fully support common marks, and emails using them will not get the blue checkmark.

Mozilla sets Root CA Lifecycles

To support cryptographic agility, which Mozilla defines as the ability to replace cryptographic primitives, algorithms, and protocols efficiently at reasonable cost with limited impact on operations, Mozilla has set a lifecycle schedule for trusted root certificates. There is a migration plan for existing roots, but under the ongoing schedule, measured from the date of key generation, a TLS root will be removed after 15 years and an S/MIME root after 18 years. This lifetime allows the CA time to get the root embedded plus at least 10 years of use. The S/MIME roots have a longer lifecycle because S/MIME certificates are issued with longer validity periods.
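In other words, given a root's key-generation date, the removal dates fall out directly (a back-of-the-envelope sketch; naive year arithmetic that ignores leap-day edge cases):

    # Mozilla root lifecycle: 15 years for TLS roots, 18 years for S/MIME,
    # measured from key generation.
    from datetime import date

    def removal_dates(key_generation: date) -> dict:
        return {
            "tls_removal": key_generation.replace(year=key_generation.year + 15),
            "smime_removal": key_generation.replace(year=key_generation.year + 18),
        }

    print(removal_dates(date(2024, 6, 1)))
    # {'tls_removal': datetime.date(2039, 6, 1),
    #  'smime_removal': datetime.date(2042, 6, 1)}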

To 2024 and Beyond 

Here are some expected changes for 2024.

TLS Certificates

The CABF TLS validation subcommittee is working on multi-perspective issuance corroboration to ensure, when verifying a domain name, that the determinations made by the primary network perspective are correct before a TLS certificate is issued. Multi-perspective determination will be applied to both domain names and Certification Authority Authorization (CAA) record compliance. We anticipate multi-perspective will migrate to a “must” requirement in 2025.
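The idea can be approximated in a few lines: query the same CAA record from several vantage points and require agreement. Note that real multi-perspective issuance corroboration uses vantage points in distinct networks and regions; querying different public resolvers, as this dnspython sketch does, only gestures at the concept.

    # Rough illustration of multi-perspective corroboration using dnspython:
    # fetch CAA records via several resolvers and require identical answers.
    import dns.resolver

    def caa_from(resolver_ip: str, domain: str) -> frozenset:
        r = dns.resolver.Resolver(configure=False)
        r.nameservers = [resolver_ip]
        try:
            return frozenset(rr.to_text() for rr in r.resolve(domain, "CAA"))
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            return frozenset()

    perspectives = ["8.8.8.8", "1.1.1.1", "9.9.9.9"]
    answers = {ip: caa_from(ip, "example.com") for ip in perspectives}
    print("corroborated:", len(set(answers.values())) == 1)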

Code Signing Certificates

The CABF code signing working group is working on updates for Signing Services and high-risk certificate applications. Signing Services will help certificate subscribers be compliant by ensuring their key pairs are generated in a cryptographic module, the private key is protected, and only the subscriber can activate the private key for signing. The high-risk certificate application requirements have changed, as all key pairs must now be generated in hardware, which mitigates some of the risks that were addressed in the CSBRs.

Key attestation is a method for the CA to verify that the private key was generated in a cryptographic module. There is no standard for key attestation, so this is difficult to implement. Members of the IETF are working on addressing this issue by generating an RFC for CSR attestation and for X.509-based attestation evidence. CSR attestation can be implemented with existing crypto modules, and the RFC may be available in 2024. X.509-based attestation will take longer to define and would need to be implemented in new cryptographic modules; as such, we do not expect to benefit from this effort for many years.

S/MIME Certificates

The IETF published RFC 9485, which supports Certification Authority Authorization (CAA) for S/MIME certificates. This allows domain name owners to publish the CAA record “issuemail” listing their authorized CAs. The CABF S/MIME working group is developing an update to make CAA a recommendation in late 2024 and a “must” requirement in 2025.
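In zone-file form, such a record might look like the line below; the issuemail tag comes from RFC 9485, while the domain and CA values are placeholders.

    example.com.  IN  CAA  0 issuemail "ca.example.net"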

And there you have it – a brief look at an active year in the public trust certificate ecosystem. Find out more about how Entrust can help you manage your digital certificates here.

The post Public Trust Certificates: A 2023 Recap and Projections for 2024 appeared first on Entrust Blog.


liminal (was OWI)

Navigating the AML Transaction Monitoring Landscape: Trends and Solutions for Financial Institutions

The landscape of Anti-Money Laundering (AML) transaction monitoring is rapidly evolving, driven by increasingly complex global regulatory demands and sophisticated money laundering tactics. Financial institutions face the daunting task of adapting to this changing landscape, where technological advancements like AI and machine learning are reshaping traditional monitoring methods. The pressure to comply with stringent regulations while managing operational complexities is higher than ever.

Financial institutions must effectively identify, analyze, and report suspicious activities within a landscape marked by technological evolution. The dual challenge of staying compliant with rigorous regulations and handling the operational complexities of transaction monitoring is a significant burden, compounded by the need to integrate advanced technology solutions that are both compliant and efficient in managing fraud risks. Despite these challenges, the demand for effective anti-money laundering (AML) solutions is on the rise, with the total addressable market projected to grow from $3.6 billion in 2024 to $6.8 billion by 2028 at a 17.5% compound annual growth rate (CAGR).
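As a quick back-of-the-envelope check on those figures (ours, not the report's): $3.6 billion compounding at 17.5% for the four years from 2024 to 2028 lands at roughly $6.9 billion, consistent with the projection.

    # Sanity check: 3.6 * 1.175**4 is approximately 6.86
    start, cagr, years = 3.6, 0.175, 4
    print(round(start * (1 + cagr) ** years, 2))  # 6.86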

Download the full report for a detailed analysis of key findings:

78% of buyers adopt AI/ML for Anti-Money Laundering Transaction Monitoring for quick risk detection and self-learning but face integration and regulatory compliance challenges.
AML systems generate costly false positives, with 44% of alerts deemed incorrect, highlighting the need for more efficient solutions.
A shift towards volume-based pricing in AML solutions is evident, as 55% of providers move from license-based models, enhancing AI/ML transaction analysis.
Rising fines and stricter AML regulations are escalating demand for advanced monitoring solutions among 88% of buyers.
Despite legal and privacy challenges, 74% of European financial institutions support increased data sharing for improved AML compliance.

Related content: Market and Buyer’s Guide for Transaction Fraud Prevention; Link Index for Account Opening; Q3 Briefing Report – Rise of the Machines (for members)

What is Anti-Money Laundering Transaction Monitoring?

Anti-Money Laundering Transaction Monitoring is a process driven by compliance regulations to ensure that financial transactions conducted by institutions are not used for money laundering or terrorist financing. The process is mandated by regulations such as the EU’s AML Directives and the USA PATRIOT Act, which set international benchmarks for companies that engage with the US and Europe. Financial institutions use rule-based or AI-enabled tools to identify and report suspicious transactions, and analysts are responsible for maintaining compliance and ensuring secure financial environments.

These solutions play a critical role in detecting money laundering and other illicit activities, thereby ensuring the integrity of the financial system. Compliance with AML regulations is not only necessary for regulatory purposes but also vital to avoid severe legal and financial consequences. In fact, recent statistics indicate that global fines for AML violations have exceeded $5 billion in 2022 alone. By adhering to regulatory compliance measures, financial institutions can foster trust and contribute to the transparency of the financial system. Furthermore, compliance standards have led to significant advancements in AML solutions, with regulations such as the Bank Secrecy Act, USA PATRIOT Act, EU’s AML Directives, and FATF recommendations acting as catalysts for innovation in Anti-money laundering transaction monitoring technology.

The range of financial institutions that must comply with regulations has expanded to include fintech and cryptocurrency organizations, given the higher risk of money laundering associated with digital currencies and technologies. To address these new financial services, emerging regulations such as the “Travel Rule” from the Financial Action Task Force (FATF) and the 5th and 6th Money Laundering Directives from the European Union have been introduced. The industry is also shifting towards the use of AI/ML technologies to detect suspicious activities more quickly and make more accurate decisions. Currently, 66% of financial services are investing more in AI to accomplish this.

The post Navigating the AML Transaction Monitoring Landscape: Trends and Solutions for Financial Institutions appeared first on Liminal.co.


Tangle

Vira Wallet v.0.3.0 - Major Release


Tangle Labs are happy to share the release of Vira Wallet v0.3.0, one of the most advanced self-sovereign identity wallets on the market and just one step away from a full 1.0 release.

After rigorous testing over the past couple of months, and the impressive demonstration of interoperability with other projects, we have received significant positive feedback and ensured that the wallet has matured to a level where it is now being used for ticketing and open badge experiences in the real world. Learnings from such events and experiences have given the team valuable feedback that has allowed Vira to develop into such an advanced application.

Vira is currently being used across a number of projects including the UNESCO Lifelong Learning project NGDIL, various real world event ticketing and open badge experiences, and has recently been declared a DIIP compliant wallet by the Dutch Blockchain Coalition, alongside the UniMe wallet from Impierce Technologies.

Feature Updates

Whilst a lot has been done behind the scenes to streamline the app in the backend and to ensure its security, stability, and feature performance, the big feature updates include:

Improved UX/UI

The Vira UI has developed into a seamlessly intuitive and easy-to-use experience. The balance between user experience and security in the world of crypto and decentralised identity is very difficult to strike, and products often sacrifice one for the other. Having worked alongside partners to build out seamless experiences, we have streamlined the Vira experience to be a smooth one, without sacrificing the importance of security and privacy for users. This important area of the development process places Vira in the top ranks of user experiences in the identity wallet space.

Open Badge Support

Credentials can now be issued as standardised open badges, supporting the Open Badge Standard v3.0. Separating credentials by standardised types provides a versatile approach to how credentials are received, shared, and stored. The Open Badge standard, by 1EdTech, is a leading standard used all over the world. Having implemented the standard in its entirety in the latest release, Vira is now an open badge compatible wallet that may be used by millions of badge holders in open badge ecosystems around the world.

Self-Signed Credentials

User-signed credentials allow self-declaration of information such as name and email. Having verified credentials issued by a third party is one thing; being able to verify that you have personally self-declared information is another. Implementing self-signing within the Vira wallet opens up a world of opportunity, allowing users to assign self-signed data such as name, telephone number, and email address to profiles at a granular level. Whether it's a business card, a user login, or another profile, Vira users can now declare the information they want within their individual profiles.

Internationalisation and RTL Support

Fully integrated support for translation into any language allows Vira to go global. Cross-border use is one of the huge value propositions of decentralised identity. With Vira now fully integrated with global internationalisation standards, it can support any language translation as it begins to be used in multiple countries around the world. So whether it's an open-badge event in the Netherlands, a travel ticket in Germany, or a business document in the UAE, Vira is poised to become a truly global app.

Login with DID

Login with DID support via open-source OpenID standards. Vira is 100% ready to allow its users to log in with DID, and any project can integrate Login with DID into its apps and websites using the open-source libraries from Tangle Labs, giving users the opportunity to connect privately to services without the need for third-party identity providers such as Google, Facebook, or X.
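For a feel of what such a flow builds on, here is a hedged sketch of a SIOPv2-style authorization request, in which the user's wallet, rather than a hosted provider, returns the signed ID token. Parameter values are placeholders, and the exact profile and URI scheme should be taken from the relevant OpenID specifications and Tangle Labs' libraries.

    # Sketch of a SIOPv2-style request (placeholder values; consult the
    # OpenID specifications for the exact profile).
    from urllib.parse import urlencode

    params = {
        "response_type": "id_token",  # self-issued token, signed with the user's DID key
        "scope": "openid",
        "client_id": "https://rp.example/cb",
        "redirect_uri": "https://rp.example/cb",
        "nonce": "n-0S6_WzA2Mj",
    }
    print("openid://?" + urlencode(params))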

What’s next?

These updates alone have put us one step away from a final public v1.0 with a feature complete release scheduled for later in the year. With Vira already live in the real world supporting and showcasing decentralised identity in its many forms, the next release of v0.4.0 will add the final pieces of the puzzle to ensure the complete package is ready for adoption.

The next features to be updated and integrated into the 0.4.0 release for final testing include:

Full Multichain Support - to allow users to operate in different ecosystems by selecting which networks they wish to engage with.

Document Signing - allowing users to legally sign documents with digital signatures from within their Vira wallet

Translations - full translations to allow Vira to be used in various languages such as ES, NL, AR, DE, etc.

Verified Account Support - allowing users to verify their identity for reusable credentials


Northern Block

Utilising Digital Credentials to Promote Responsible Business Conduct

By creating a supply of high integrity digital credentials, we can now start unlocking tremendous value for global supply chains. The post Utilising Digital Credentials to Promote Responsible Business Conduct appeared first on Northern Block | Self Sovereign Identity Solution Provider.

Background

At Northern Block, our journey toward enhancing Canadian critical mineral supply chain transparency and traceability collaboration began with an innovative pilot in 2022, in partnership with the Government of British Columbia, Copper Mountain Mining Corporation, and PricewaterhouseCoopers.

During this initial pilot, we successfully demonstrated that greenhouse gas emissions reporting could be converted into digital credentials, making this sustainability data verifiable and highly available for various systems within the supply chain. This demonstration effectively showcased how, by utilizing common standards and protocols, sustainability data can be seamlessly transferred between different organizations’ systems while maintaining its integrity (note: we cover more of ‘how this works’ in the section below ‘Why do digital credentials help create supply chain transparency in the mining industry?’)

In early 2023, we expanded our efforts by adding support for two new digital credentials:

Mines Act Permit: Easily prove your permit status (for major mine operators in B.C.)
Towards Sustainable Mining (TSM): Submit TSM scores and share verified environmental, social, and governance (ESG) data securely

Both credentials are rooted in authoritative governance frameworks and utilise OCA bundles for proper credential branding.

To support mining operators in receiving their Mines Act Permit in the form of a digital credential, we continued our ongoing partnership with the Government of B.C.

To transform TSM reporting data into digital credentials, we formed a key partnership with the Mining Association of Canada (MAC), whose 100+ members account for most of Canada’s production of critical minerals. Canada already produces more than 60 minerals and metals and is a leading global producer of many critical minerals, including nickel, copper, potash, aluminum, and uranium. MAC is also behind the TSM initiative, a globally recognized ESG standard implemented not only in Canada, but in countries such as Australia, Brazil, Spain, Argentina, and Colombia. TSM, first established in 2004, allows mining companies to turn high-level environmental and social commitments into action on the ground. Additionally, TSM was the first mining standard in the world to require site-level reporting with external verification. We anticipated that facilitating the creation of digital credentials for mines, based on a recognized and important reporting standard, should generate significant demand and value in downstream supply chains.

We worked closely with MAC to develop a set of rules and standards for this new type of digital certificate to mobilize the data generated through the TSM standard. This digital certificate, or ‘credential,’ is designed for organisations involved in extracting important resources. Our goal was to make this TSM digital credential as accurate and trustworthy as possible, similar to what we had previously achieved with the greenhouse gas credential in 2022. We aimed to base the TSM credential on direct, original data (source data) and combine technical methods with strong surrounding governance. This approach ensured that TSM data transmitted through this credential could maintain its reliability and integrity, increasing its value. It can then be used effectively by other companies and organisations further along in the supply chain.

TSM’s comprehensive approach, utilizing various environmental, social and climate change indicators, has been a great mechanism for promoting responsible production practices. Our work in this domain reflects our commitment to making TSM data not only more accessible but also a trusted element in global supply chains, reinforcing responsible mining practices worldwide.

In late 2023, we achieved a significant milestone by enabling mines to self-issue their TSM reports for the years 2021, 2022, and 2023 in the form of digital credentials, which were then digitally presented to MAC using this innovative method. This accomplishment marked a huge milestone for our objective of creating trusted data inputs for supply chains. In this process, we also involved PricewaterhouseCoopers and Envirochem, two recognised TSM verification firms. The two verification firms created externally verified TSM reports as digital credentials for the two mines in question. Essentially, this means that when a mining company receives one of these digital reports from their verifier, it comes with the verifier’s digital signature. This signature proves that the report is genuine and hasn’t been altered. When the mining company shares this report with other supply chain participants, everyone can trust the data accuracy and provenance.

Here is an overview of the ecosystem which Northern Block is supporting with both the Mines Act Permit and TSM credentials:

Note that in this image, the Government of British Columbia has developed its own digital credential system for government ministries, independent of Northern Block’s system. However, both systems are capable of communicating with each other due to their use of interoperable technology. It’s important to recognize that a digital trust ecosystem can involve many different actors, each potentially using services from various providers. A key advantage of this decentralized approach is the flexibility it offers participants, allowing them to choose their preferred platforms without being confined to a single, centralized system.

This advancement was a substantial step in making the existing international TSM standard more accessible for supply chain participants. Our next focus is on broadening its use and creating a demand generation strategy. Given the increasing demand for high-integrity data in supply chains, we believe that our efforts have established crucial foundational groundwork.

Why do digital credentials help create supply chain transparency in the mining industry?

Digital credentials are based on internationally accepted technical and governance standards that are meant to facilitate digital trust. A digital trust architecture enables any two entities to share data between themselves in a trusted manner, independent of the system or application they’re using. This can be compared to email clients, where emails are reliably sent and received regardless of the different clients being used (e.g., Microsoft Outlook and Google’s Gmail). This reliability is due to the use of common underlying protocols (a common language). Digital trust solutions leverage a common set of protocols, enabling them to exchange trusted data between themselves, with Security, Privacy, Authenticity and Confidentiality in mind.

This allowed us to conduct trusted data exchanges between different systems. For example, we facilitated a live exchange between the Government of British Columbia and Copper Mountain Mining (CMM), where CMM was able to use Northern Block’s ‘Orbit Enterprise Wallet’ to receive a digital credential representation of their Mines Permit. This was issued by the Government of British Columbia from their credential issuance system, and CMM was able to receive the offer and, upon accepting it, store the credential within their digital wallet, hosted by Northern Block. This demonstrates that by speaking a common set of standards and protocols, technical interoperability is attainable without forcing any supply chain participant to use a specific solution, similar to how email works!

A key aspect of digital trust stems from digital signatures using cryptography. Whenever a piece of data (a digital credential) is sent from an authoritative party to another, that data is signed, becoming tamper-proof and backed by verifiable governance processes. For instance, when a supply chain participant consumes verifiable TSM data as a digital credential proof, they can be confident about the data’s provenance, knowing it came from a mine or the mine’s auditor, and that it followed the correct governance established by MAC. This creates a root of trust for ecosystems that wish to consume that data.
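The mechanism at the bottom of this is ordinary public-key signing. Here is a minimal sketch with Ed25519 from the pyca/cryptography package, with an invented report payload; real verifiable credentials layer standardized proof formats and governance on top of this primitive.

    # Sign a report payload and detect tampering (illustrative payload only).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    verifier_key = ed25519.Ed25519PrivateKey.generate()
    report = b'{"mine": "Copper Mountain", "tsm_year": 2023}'
    signature = verifier_key.sign(report)

    public_key = verifier_key.public_key()
    public_key.verify(signature, report)  # passes silently
    try:
        public_key.verify(signature, report + b"!")  # tampered payload
    except InvalidSignature:
        print("tampering detected")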

Once these digital credentials are available, they can be shared with any participant in an ecosystem at the discretion of the credential holder. For example, a mining company can easily share this data with investors, buyers, or ESG marketplaces with just a click of a button. They can also selectively share different credentials, controlling what data they share and with whom. Being able to mix and match various types of verifiable data can unlock new efficiencies and value creation opportunities. For example, CMM may share with an investor a proof that they’re a recognised mine in British Columbia, along with selective ESG data from their TSM credential. In this case, they were able to share data that came from two separate, but verifiable sources. The investor will know that all data proofs shared are authentic. This example starts to make us think about some of the efficiencies and new opportunities when being able to mix and match data from various sources.

By converting this data into digital credentials and storing them in wallets owned by the data owners, it becomes easier to share within supply chains, thereby making it more available. This is particularly important in the context of ESG reporting, where fraud is a concern. Digital trust is crucial because it allows for the creation of supply chain data inputs that are authentic and verifiable.

Ultimately, digital trust technologies will elevate the level of trust within critical mineral supply chains. Creating supply chain transparency is only as effective as the data that flows into it. Our goal is to ensure that the data entering supply chains is of high integrity and value, and our work to-date aligns well with the Canadian Critical Minerals Strategy.

Northern Block’s Mining Credentialing Solutions

Northern Block has been dedicated to developing enterprise digital credentialing technology suitable for a variety of applications. In early 2020, we launched the first commercial-grade digital credential management platform in Canada, called NB Orbit Enterprise: a no-code, web-based digital trust platform that facilitates the storage, issuance, and verification of digital credentials that are held and owned by organisations in digital wallets. The platform contains a collection of components based on identity management, distributed and edge computing, distributed ledger technologies, and cryptography. Importantly, it is compatible with global digital trust standards and protocols.

Here are some of the components available within Orbit Enterprise that we leveraged for this particular critical minerals digital trust project:

Organisational Wallet: a component that allows organisations to receive digital credential offers from their peers, such as governments and auditors, and securely store them in the organisational wallet (e.g., Mines Permit credential, TSM credential).
Secure Connection Manager: This tool helps organisations set up secure, two-way communication channels with each other. Once these channels are established, they can exchange information safely and privately, ensuring that all communications are genuine and confidential.
Digital Credential Creation and Sharing Module: This is a service that allows organisations to create and issue digital credentials, like digital IDs or certificates, either to themselves or to other organisations. It also includes features for requesting credentials, negotiating terms, and sending secure messages. This is particularly useful in business-to-business (B2B) situations, where two organisations need to collaborate on issuing these digital credentials.
Digital Credential Verification and Exchange Toolkit: This toolkit is used by organisations to either ask for proof of digital credentials from others or to provide proof of their own credentials. It supports collaboration and communication, similar to the credential issuance service, making it easier for organisations to verify each other’s credentials.
Credential Governance Toolkit: This toolkit is designed for creating and managing the rules and designs for digital credentials. It’s used by authoritative bodies to set standards and publish these credentials to trusted data registries. Organisations can use this toolkit to follow these standards when issuing credentials. It also helps ensure that the branding and legitimacy of these credentials are consistent, especially in supply chains, so that everyone recognizes and trusts them.

Call to Action

If you are interested in contributing to the growth of this ecosystem, we would love to hear from you. Our initiative welcomes participation from a diverse range of stakeholders, including natural resource companies, data consumers in need of high-integrity sustainability data, government regulators, and technology solution providers. It will require the collective effort of many strong participants to elevate this ecosystem to the next level and generate a network effect. We are confident in the foundational elements already in place, especially our credentials rooted in authoritative trust. This foundation paves the way for an array of future use cases and broader applications.

The post Utilising Digital Credentials to Promote Responsible Business Conduct appeared first on Northern Block | Self Sovereign Identity Solution Provider.


KuppingerCole

Cloud-Native Application Protection Platforms (CNAPP)


by Mike Small

This report provides an overview of the Cloud-Native Application Protection Platforms (CNAPP) market and a compass to help you find a solution that best meets your needs. It examines solutions that provide an integrated set of security and compliance capabilities designed to protect cloud-native applications across the development and production lifecycle. It provides an assessment of the capabilities of these solutions to meet the needs of all organizations to monitor, assess, and manage these risks.

liminal (was OWI)

Market and Buyer’s Guide for Customer Authentication

The post Market and Buyer’s Guide for Customer Authentication appeared first on Liminal.co.

Elliptic

Regulatory Outlook 2024: Stablecoins will be atop the regulatory and policy agenda


Recently, we outlined five key issues that we think will drive the crypto regulatory and policy landscape this year. In this blog post, we zoom in on a topic that we think will dominate the regulatory agenda in 2024 like never before: stablecoins.


TBD

Top 8 TBD Hackathon Winners!

TBD Hackathon Winners! 🥳

TBD Hackathon! ✨

As we wrap up our very first self-hosted hackathon, we're thrilled to announce that out of 1,681 participants we have our top 8. These three hackathons have been a great learning experience and an amazing opportunity to see what unique projects developers can build with Web5. In hopes of keeping the hackathon open, we came up with six categories: FinTech, Health, Music and Arts, Personal Data, and Empowerment & Enablement applications. Join us in celebrating our top 3 and the best application per category. 👏

🥇 Turtle Shell

Turtle Shell is a portable personal data management tool that combines your digital life under one Shell, allowing users to own, manage, and view all their data. It was developed by Moises E Jaramillo and Courtney Chan.

Try it Out:

Learn More

🥈 RideFair

RideFair was developed by Rebecca Chan, Jennifer To and Brian Lam. It's a decentralized ride-sharing app that gives users control over their ride-sharing experience, prioritizing safety and privacy.

Try it Out:

RideFair GitHub Repo
Learn More

🥉 Talent Token

Talent Token, developed by Marco Boschetti, is a decentralized professional networking platform where users can manage their profile and share and receive endorsements for their skills, languages, and work experiences without compromising privacy.

Try it Out:

Talent Token Bitbucket
Learn more

🏅 MoneySavvy

MoneySavvy, winner of Best FinTech Inspired App, was developed by Ahmad Nurfadilah. It's a decentralized financial management application that empowers users to take control of their finances in a privacy-focused environment.

Try it Out:

MoneySavvy
Learn more

🏅 Rapha

Rapha, winner of Best Health Inspired App, was developed by Festus Idowu. It allows individuals to own and carry their health history with them, utilizing decentralized web nodes (DWNs) to securely store medical records.

Try it Out:

Rapha GitHub
Learn more

🏅 Musive

Musive, winner of Best Music and Arts Inspired App, was developed by Denis Riush. It's a decentralized music distribution application that allows artists to share their music while maintaining ownership.

Try it Out:

Musive GitHub
Learn more

🏅 Tracks

Tracks, winner of Best Personal Data Inspired App, was developed by Jack Watson. It's a decentralized digital travel diary that lets users pin places they've visited, with photos and notes, on a customizable world map that can be shared.

Try it Out:

Tracks
Learn more

🏅 FifthPoll

FifthPoll, winner of Best Empowerment & Enablement Inspired App, was developed by Spandan Barve and Riya Jain. It's a decentralized community-driven voting platform where every individual's voice is heard.

Try it Out:

FifthPoll GitHub
Learn More

💖 Thank You

Thank you to every participant, Devpost, our engineers, and our amazing community that constantly supported one another the entire time. We look forward to hosting our next one!

For more amazing projects check out our hackathon page.

Sunday, 28. January 2024

Entrust

Navigating the Digital Seas: An Infosec Expert’s Perspective on Data Privacy Day

As we reflect on Data Privacy Day 2024, the significance of safeguarding digital identities takes... The post Navigating the Digital Seas: An Infosec Expert’s Perspective on Data Privacy Day appeared first on Entrust Blog.

As we reflect on Data Privacy Day 2024, the significance of safeguarding digital identities takes center stage in our increasingly interconnected world. The so-called conflict between “seamless user experience” and security (aka “friction”) is over — the only answer is that security has to be welcomed as part of the experience. Breaches affect our livelihoods, reputations, and families, so a little friction is not just a necessary evil, but an inherent part of the trust ecosystem.

Over the past few years, data breaches and privacy scandals have become alarmingly commonplace, raising concerns about the security of our personal information. From large-scale corporate breaches to individual cases of identity theft, the threats to our digital privacy persist and continue to evolve.

As technology continues to advance at a rapid pace, so too do the tools and techniques employed by malicious actors. From sophisticated hacking attempts to more subtle forms of data mining, our personal information is constantly under siege. Even the most highly trained security professionals may miss increasingly realistic AI-generated phishing scams, across text, voice, and video.

To counter this, CISOs across industries are grappling with the daunting task of securing these digital identities against an ever-evolving threat landscape. As we observe this year’s Data Privacy Week, the need for organizations to prioritize the protection of digital identities has never been more pronounced.

In the face of these challenges, CISOs must adopt a proactive and multi-faceted approach to fortify digital identity security.

Here are some key strategies that organizations can implement:

Zero Trust Architecture: Embracing a Zero Trust model involves assuming that no user or system is inherently trustworthy. By implementing strict access controls, continuous monitoring, and robust authentication mechanisms, organizations can significantly reduce the risk of unauthorized access.

Biometric Authentication: As passwords continue to be a weak link in the security chain, biometric authentication is gaining prominence. Fingerprint scans, facial recognition, and other biometric identifiers provide an additional layer of security, making it harder for malicious actors to compromise digital identities.

User Education and Awareness: Human error remains a leading cause of security breaches. CISOs should prioritize ongoing cybersecurity training programs to educate employees and users about the importance of protecting their digital identities. Recognizing phishing attempts and practicing good cyber hygiene can go a long way in preventing identity-related incidents.

Comprehensive Data Encryption: Implementing end-to-end encryption for sensitive data ensures that even if unauthorized access occurs, the intercepted information remains indecipherable. This is particularly crucial for protecting communication channels and data in transit.

Phishing-Resistant MFA: Bad actors are finding ways to bypass multi-factor authentication, but phishing-resistant MFA can help address these new attacks by incorporating multiple layers of protection, requiring more authentication as well as proximity.
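To make the first of these strategies concrete, here is a minimal, illustrative sketch of Zero Trust-style request handling in Python: every request must present a verifiable, unexpired token, regardless of where on the network it originates. It assumes the third-party PyJWT package; the names verify_request, handle_request, and SECRET_KEY are hypothetical and not part of any product discussed above.

import time

import jwt  # PyJWT: pip install PyJWT

SECRET_KEY = "replace-with-a-managed-key"  # illustrative only; use a real key store

def verify_request(token: str) -> dict:
    # Zero Trust: authenticate every request, even from "internal" callers.
    # Raises jwt.InvalidTokenError if the signature fails, the token has
    # expired, or a required claim is missing.
    return jwt.decode(
        token,
        SECRET_KEY,
        algorithms=["HS256"],  # pin the algorithm; never trust the token header
        options={"require": ["exp", "sub"]},  # expiry and subject are mandatory
    )

def handle_request(token: str) -> str:
    try:
        claims = verify_request(token)
    except jwt.InvalidTokenError as err:
        return f"403 Forbidden: {err}"  # deny by default
    return f"200 OK for subject {claims['sub']}"

if __name__ == "__main__":
    good = jwt.encode(
        {"sub": "alice", "exp": int(time.time()) + 60}, SECRET_KEY, algorithm="HS256"
    )
    print(handle_request(good))       # 200 OK for subject alice
    print(handle_request("garbage"))  # 403 Forbidden: ...

The point is the default-deny posture: nothing is trusted because of its network location, and each request is re-authenticated.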

The imperative to protect digital identities has never been more crucial. CISOs play a pivotal role in shaping the cybersecurity landscape of their organizations, and their strategies must adapt to the evolving challenges posed by cyber threats. By embracing innovative technologies, fostering a culture of security awareness, and adhering to robust privacy practices, organizations can navigate the digital seas with confidence, safeguarding the identities of individuals in an interconnected world.

The post Navigating the Digital Seas: An Infosec Expert’s Perspective on Data Privacy Day appeared first on Entrust Blog.

Saturday, 27. January 2024

Dark Matter Labs

7 Structural Shifts:

Reconfiguring Transition Landscapes Entering 2024, it becomes increasingly important to shift the focus from immediate technological and demographic trends to the broader systemic tendencies that are likely to shape the future and options landscape. The seven drivers outlined below must be understood against a systemic backdrop of steadily declining foundational systems of civilisation — from dec

Reconfiguring Transition Landscapes

Entering 2024, it becomes increasingly important to shift the focus from immediate technological and demographic trends to the broader systemic tendencies that are likely to shape the future and options landscape. The seven drivers outlined below must be understood against a systemic backdrop of steadily declining foundational systems of civilisation — from decreasing available arable land along with structural decline in quality at a planetary scale, a quantum decline in energy returned for energy invested, to decreasing global well-being and a decline in life expectancy.

These fundamental shifts place us in a new paradigm where foundational resources are diminishing relative to demand, while climate breakdown introduces increased volatility and shocks into the system. This volatility, coupled with systemic decline, suggests we will face more shocks and an accelerating shortfall in nutrients, energy, and well-being.

As these factors converge, we should expect not just fluctuating availability but also significant price spikes. For instance, a 3–4% drop in food system availability, or 5% in aggregate, could lead to price increases of up to 400%. Such price volatility will drive systemic inequality within and between nations, potentially triggering social tipping points. It is within this context and transition that we need to consider the following trends.

These structural tendencies, which demand both adaptation and mitigation, are key factors in determining how we evolve and adapt in the years (if not decades) ahead. Some of these tendencies will naturally allow for evolutionary progress, while others will require significant adaptation. Understanding these tendencies from a structural perspective is crucial because they form the undercurrents that will influence long-term trajectories. Recognizing and responding to these systemic tendencies and propensities will be essential in navigating the complex landscape of our future, and shaping our decisions, innovations, and strategies for the decades to come.

The essence of the “propensities” defined below lies in a profound transformation of our constraint space, shifting our approach towards technology, society, and future planning. Furthermore, it is important to recognize that these propensities have always existed historically. What is now being manifested is that they are no longer just a functional reality for the “ignorable many”, but are becoming the dominant propensity at a planetary scale. This transformation is reshaping the very notions of progress, the integration of technology into daily existence, and the structuring of institutions.

These tendencies and propensities transcend the simple adoption of new technologies or the following of emerging trends. They represent a more profound, structural evolution in our collective thought processes and approaches. They signify a departure from linear, predictable models derived from extensions of ‘the adjacent possible’ socio-technical landscape based on present-day comprehensions.

In the following sections, 7 of these pivotal propensities are presented. Readers and participants are encouraged to critically engage with these ideas, offering their perspectives and insights. These shifts appear increasingly essential in framing and supporting the magnitude of the transition and, more critically, the innovation landscapes necessary to provide pathways for it.

1. Multi-Perspectival Pathways for Tomorrow

The transformation towards a multipolar world is one of the most significant structural shifts the world is facing, and it is driving profound changes in our pathways of transition. This multipolar worldview increasingly extends beyond geopolitics, permeating into foundational shifts and divergences in transition pathways for societies.

The multiperspectival approach to climate transition underscores the growing portfolio of strategies being implemented globally — either passively or actively, each reflecting unique civilizational insights, lock-ins and interests. COP28 started to crystallize this emerging reality, with a breakdown in the hegemony of a singular transition pathway.

In this context, it is important to recognise that at least three principal structural transition perspectives appear to be emerging:

Supply Side — Asset & Investment Driven Transformation
Predominantly observed in the U.S., this strategy emphasizes the creation and enhancement of assets to revolutionize supply chains and technologies. It entails substantial investments in new technologies and infrastructure, nurturing innovation ecosystems, enacting supportive policies and regulations, fostering public-private collaborations, and integrating global supply chains. The goal is to transform supply and technology landscapes while promoting economic growth and competitiveness. It will necessitate new market design and financial innovations.

Integrated Transition Strategies:
This method aims to fundamentally alter supply-side technologies while simultaneously managing or reducing demand. Its complexity arises from the dual focus on high-cost supply-side innovation and transitioning demand, necessitating changes in consumer behavior and infrastructure adaptation. The central challenge is to harmonize advancing supply technologies with evolving demand trends, targeting resilience and sustainability. This strategy requires robust policy and regulatory frameworks for successful implementation but, perhaps more critically, the capacity for societies to make legitimate decisions at the speed and scale necessary.

Offset Systems
In regions like Saudi Arabia, where hydrocarbon energy base costs are low, investing in offset technologies becomes a strategic focus. This includes carbon capture and storage, efficiency enhancements, alternative hydrocarbon applications, and geo-engineering. This approach, contrasting with European or American strategies, leans towards maintaining the hydrocarbon economy while seeking to reduce or mitigate its future environmental impacts. This strategy necessitates novel methods for managing planetary risks and innovations. Moreover, the unproven technologies on which it relies will most certainly accelerate climate breakdown risks in the short term.

These divergent transition pathways will require adopting shared, dialogic and negotiation frameworks for handling different planetary risks and allocations and making provisions for them, not just within national contexts but also in relation to the planet as a whole (i.e. if we are to avoid a race to the bottom and collective self-termination). The holding and management of these risks on a global scale calls for an innovative form of systemic-level innovation from countries. It means rethinking and redesigning the way that risks are assessed (including historically accrued risks), addressed, and mitigated, integrating both national priorities and planetary responsibilities. Such an approach is crucial for navigating the complex, interlinked challenges that define the current era. It requires a shift towards more collaborative, integrated risk management strategies at the planetary level.

The multiperspectival nature of the climate transition calls for diverse innovation economies, national and metanational alliances, and a reimagined approach to negotiation and collaboration. Navigating this multipolar landscape requires recognizing varied transition pathways, managing geopolitical effects, balancing risks and trade-offs, and adapting to a substantial shift in global transitions. This multipolar, multiperspectival scenario necessitates sophisticated economic diplomacy to forge a viable, non-self-terminating, and equitable future.

2. Planetary justice as a precursor for a planetary transition

The emergence and embedding of a multipolar world is also revealing long-standing injustices and accelerating future ones, previously obscured by asymmetric and extractive power dynamics. Addressing these injustices is not just an ethical imperative, but also a practical necessity for any large-scale planetary agreement on the necessary transitions ahead. The danger is not just that equitable transitions might be impeded by deeply rooted systemic injustices, but that any transition at all will be impeded, progressing us down a path of mutually assured destruction. This stalemate response when confronting structural barriers has hitherto been the status quo. Tackling such injustices on a planetary scale demands substantial and concerted effort, as they block a viable pathway towards the magnitude of change that will be required.

3. High Interest High Inflation Macroeconomics

The macroeconomic landscape is also shifting alongside the unfurling implications of climate change intensifying in an increasingly multipolar world. We are witnessing increased market volatility and a more pressing scramble for transition materials — driven not just by market dynamics but also by existential necessities and increasing conflict.

This confluence of factors is likely to induce a sustained high-inflationary environment within the economic system and might drive a persistently high interest rate environment that systemically restrains investment, especially in the places most vulnerable to planetary risks. It will throttle our capacity to mitigate the runaway risks associated with climate breakdown (except for entities wielding substantial economic power).

Moreover, such macroeconomic factors are going to systemically accelerate the missing-trillions problem, throttling the flow of capital into the real-world economy. This is because as volatility increases, future uncertainty increases, and the cost of capital becomes increasingly prohibitive.

Moreover, this shift is poised to aggravate systemic inequalities, disproportionately affecting the poorest, most vulnerable, and marginalized by rapidly increasing the cost of living. As systemic inequality widens, it will threaten to destabilize nation-states. This paradigm shift presents a formidable challenge, making global economies more susceptible to fragility. Addressing these emerging vulnerabilities will require systemic innovation at the monetary level; this need cannot be discounted and will increasingly become a systemic throttle if left unmet.

4. Operating in a Security World

The transition from a world predominantly driven by free trade principles to one increasingly centered around security agreements has been accelerating over the past six years (e.g. the UK/US Atlantic agreement or UK/Japan agreement). The shift towards security-based alliances is expected to continue, with an evolving and broadening definition of security. Recent trends have started to integrate critical technologies into security frameworks, and we can foresee this expansion to further encompass areas such as nutrition, critical minerals (already a traditional focus in security) and essential health commodities and needs.

The expansion of the security domain will also necessitate an even greater systems-based approach to security.

In a more systems-oriented worldview, the traditional concept of nation-state boundaries will undergo a fundamental change. Instead of being defined by geographical lines, states will be characterized by their critical metabolic flows that do not respect national boundaries. It is foreseen that these flows, encompassing energy, materials, and other vital resources, will increasingly become the definitional landscape of statehood. This perspective emphasizes the importance of resource dynamics and interdependencies in defining national identity and capabilities.

Furthermore, it is crucial to acknowledge that the increasing securitization of the world also paves the way for the weaponization of various elements. This might include areas such as food supplies, weather systems, geo-infrastructures, material goods, energy grids, and information systems. As security concerns broaden, so too does the scope of potential weaponization. This development raises fundamental questions about our response strategies. Relying solely on market transaction-based frameworks is increasingly insufficient for understanding the value and risks associated with such diverse sectors. A more comprehensive approach is needed to address the complexities that they present.

As we begin to recognize and operationalize security within these new frameworks of entanglement, we will see planetary-scale reconfigurations. This evolution signifies a deepening understanding of our interconnectedness and the need for comprehensive strategies in addressing security challenges.

The extension of the security framework to incorporate elements like energy, critical materials, food systems, and health infrastructures is poised to significantly reshape the planetary landscape. The move from a commodities-centric free trade world to one focused on security goods and security alliances will be pivotal in determining the pathways for future transitions. This growing emphasis on security is likely to transform international relations and economic agendas, marking a profound shift in the way that global interdependencies and cooperation are perceived and navigated. This evolution marks a new era in global geopolitics, where security considerations permeate multiple aspects of international engagement and decision-making.

5. The New Environmental Right

There is an emergence of what can be described as a surge in environmental nationalism (e.g. the rise of the AfD in Germany and its recent positions on environmental risks). This brand of nationalism is dedicated to the protection of environmental systems, advocating for methods such as population reduction (principally by reducing migration) as a means to conserve environmental resources. This is poised to be a defining feature of the new political right wing. Environmental nationalism is set to introduce a systemic form of austerity that transcends traditional public goods. Its primary focus will be on enacting austerity measures on essential environmental resources, and it might be used as a lever to propel a more nationalistic agenda.

Whilst such ideology is ostensibly aimed at preserving certain environmental aspects for particular socio-economic groups, it may also serve as a driving force for increasing systemic inequality. Its selective approach to conserving and allocating environmental resources is likely to generate imbalances, privileging certain groups while disadvantaging others and/or exacerbating existing disparities. A notable characteristic of such environmental nationalism will be its propensity to externalize environmental degradation. It prioritizes the maintenance and conservation of high environmental standards within its own borders, often by shifting environmental burdens to other regions or nations. This practice, akin to a form of passive economic warfare, is expected to intensify. Nations adopting this ideology may prioritize their internal environmental quality, frequently to the detriment of the broader global environment. This leads to aggravated issues like pollution and resource depletion elsewhere. Such externalization tactics, rapidly escalating under the banner of environmental nationalism, are likely to contribute to global environmental imbalances and tensions, complicating the quest for unified, worldwide environmental objectives.

6. Labour Crisis

Furthermore, a labor crisis is becoming evident in advanced economies grappling with a diminishing labor supply, particularly noticeable in care and transition sectors. For example, numerous areas are readying for substantial urban and community retrofits, but face a shortfall in the workforce required to carry out such projects. Moreover, demographic shifts in these economies are outpacing the development of sufficient care infrastructure and labor resources.

To overcome such labor challenges, groundbreaking strategies are required. They entail a greater reliance on automation; significant investments in demand-shifting infrastructures and human capital development; and a structural reevaluation and reimagination of migration policies and international relations, such as multi-tier citizenships (digital nomad and essential-services visas) or multi-tier participation rights and obligations. Such shifts are vital to reconfigure how we approach the new mass demands of work, from care provision to managing large-scale transitions such as retrofits, in a 21st-century context (especially considering the constraints on labor).

These labor issues and constraints are poised to drive profound change in the innovation landscape and transition strategies. The accessibility and distribution of labor and human resources will emerge as a key systemic limitation, with its impact varying significantly across the globe. This factor will be instrumental in determining the nature and efficacy of strategies for transition.

7. Information Flooding

The seventh foundational shift reshaping our organizational structures pertains to the nature of information economies. Human systems and institutional economies are inundated with an overwhelming flow of information, fundamentally altering our information processing methods. Instead of synthesizing and comprehensively understanding this influx, we are increasingly reliant on pattern-based computations. This approach involves identifying adjacent patterns to make sense of emerging data trends — a significant departure from traditional methods of knowledge processing.

However, knowledge institutions are designed to regulate the computation of knowledge rather than pattern recognition. They therefore struggle to keep pace with the rapid influx of information, whether in computing it or in sharing patterns, and are increasingly unable to adapt quickly enough to new demands or develop alternative frameworks for understanding. This gap in societal processing capability opens the door to the weaponization of pattern spaces, where myths and false narratives can proliferate, filling the voids left by conventional institutional computation. We are witnessing an era where the ‘hallucination of patterns’ becomes a systematic method to manage the deluge of information.

A critical challenge will be to construct a new institutional economy that is equipped to handle the scale of information overload. It necessitates a deep understanding of how societies process information and a structural reevaluation of current systems. Addressing this will be key to enhancing our collective capacity and capabilities for managing and facilitating the transition in an era of information abundance.

It is essential to address societal capability to process vast quantities of information in order to construct robust decision-making structures fit for an era marked by heightened complexity and built on mass distribution of agency. The challenge will lie not only in developing information processing and decision-making systems for society, but also in building such systems as a network capability of society as a whole, rather than for a representative few.

Moreover, the building of such systems is not just an operational necessity; it is a fundamental issue that sits at the heart of our societal evolution to transcend the transition challenge. These systems must be capable of managing the deluge of information and translating it into coherent, mass multi-agent decision-making processes that are both inclusive and effective in scope and context.

Navigating the Shift: A New Era of Planetarity

The effort of this post is driven by the need to perceive, acknowledge and anticipate these structural changes with regard to our investments in innovation and transition frameworks, and by the recognition that doing so is essential to prevent a planetary landscape that is increasingly in crisis and at risk of geopolitical tension and conflict.

A dual purpose is served in understanding these contextual shifts. Firstly, the structural nature of the challenges that we face is acknowledged. More importantly, it involves devising a new class of relevant innovative strategies and domains for adapting to and mitigating the impact of these shifts as society transitions.

Further, as we examine such transitions, we must recognize this moment presents a move from an era defined by industrial internationalism, with its colonisation and extractionism, to a new phase of multipolarity and systemic interconnectedness. This transition, from 19th century globalization to a nascent phase of planetarism, demands care if we are to preserve essential systemic capabilities that we have built as a planetary civilization (such as those exemplified by the microchip, antibiotics, satellites etc.), whilst we systemically and justly rewire our energy, materials, nutrient and cognition systems.

As we look towards the future, it is becoming increasingly evident that our pathways will not follow a single, universal transition. Instead, they will be characterized by a highly multipolar, multi-perspective, and hyperpluralistic approach. The critical question then becomes: how can we preserve this diversity of perspectives and multipolarity while fostering a new theory of diplomacy? This theory must be rooted in coordination, empathy, and respect amidst such diversity.

Central to this endeavor may be our ability to construct a new framework for diplomacy that can facilitate negotiation and understanding between various worldviews, playing a pivotal role in crafting a planetary future capable of embracing complex diversities (hard and soft power, new relational theories, and a framework that acknowledges interconnected, shared destinies on a global scale, spanning human, more-than-human, and machine capability).

In addition, innovative institutions and the development of new institutional capacities will be required. The concept of diplomacy needs to be reimagined, including through the use of computational machinery and dynamic protocols that can make diplomacy not only more efficient but also more legitimate, by effectively integrating non-nation-state actors, different regional blocs and perspectives, wider groups such as indigenous nations, and even diplomacy that takes into account future generations and more-than-human species. At last, the infrastructure exists to support the most diverse of pluralities. This will lay the groundwork for a new planetary future, one that harmoniously integrates an array of perspectives and approaches.

Furthermore, it is also becoming increasingly apparent that as we shift away from our current material economy — a system that has generated abundance for some segments of society and scarcity for others — we are entering an era marked by greater scarcity. This transition could potentially lead us towards a novel concept of abundance. However, this journey is not without its challenges. As we begin to encounter the limitations inherent in our economies, particularly the rise of net-zero-sum scenarios, the risk of conflict and war further escalates.

The crucial question is: how do we navigate this transition? How do we move from a reality dominated by the dynamics of abundance through a valley of scarcity to a new paradigm of abundance? The new abundance could be conceptualized not just in material terms but also in terms of meaning. As we constrain our material economy and reshape the social signaling functions that it serves, we need to approach this transformation with careful consideration. There is a potential, perhaps in the next 40 years, to unlock a new era of energy abundance. This could revolutionize our material economy, creating both material and cognitive abundance in ways previously unimagined.

While this future pathway holds promise, it is important to acknowledge that we are entering a world of diminishing abundance, particularly as conceived in 19th-century terms. The implications of operating in such a world are profound and warrant significant attention and strategic planning.

Indy Johar with support from Dm colleagues — all errors and omissions are mine…

7 Structural Shifts: was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


liminal (was OWI)

Weekly Industry News – Week of Jan 22

Liminal members enjoy the exclusive benefit of receiving daily morning briefs directly in their inboxes, ensuring they stay ahead of the curve with the latest industry developments for a significant competitive advantage. Looking for product or company-specific news? Log in or sign-up to Link for more detailed news and developments. Week of January 22, 2024 […] The post Weekly Industry News – We

Liminal members enjoy the exclusive benefit of receiving daily morning briefs directly in their inboxes, ensuring they stay ahead of the curve with the latest industry developments for a significant competitive advantage.

Looking for product or company-specific news? Log in or sign-up to Link for more detailed news and developments.

Week of January 22, 2024

Here are the main industry highlights of this week.

➡ Innovation and New Technology Developments

Cambodia to Launch Comprehensive Digital Identity System for All Residents in July

The law covers registration of births, deaths, marriages, and divorces for citizens and non-citizens, including stateless residents. The government plans to implement the law in July and is conducting training for civil registration and vital statistics (CRVS) workers.

Read the full article on www.biometricupdate.com

Czech Republic’s New eDoklady App Hits 70,000 Downloads on Launch Day, Enhancing Digital ID Access

The app allows users to generate digital versions of their national ID card and other identity documents, which will be accepted in government offices and some municipalities. Despite initial network and server issues due to high traffic, the launch has garnered substantial interest, and the government plans to extend the use of digital IDs to private entities next year.

Read the full article on biometricupdate.com

Apple Proposes Opening Tap-and-Go Payment System to Rivals in EU Antitrust Case Settlement

The EU accused Apple in 2022 of abusing its dominant position by limiting access to its mobile payment technology. The proposed changes would allow third-party mobile wallet providers to access the contactless payment function on Apple’s iOS operating system.

Read the full article on fortune.com

➡ Investments and Partnerships

Silverfort Secures $116 Million in Series D Funding to Transform IAM Landscapes

Israeli digital identity and access management provider Silverfort closed a $116 million series D funding round led by U.S.-based Brighton Park Capital. The round brought Silverfort’s total investment funding to $222 million.

Read the full article on silverfort.com

Dapple Security Raises $2.3M for Biometric-Based Passwordless Login Solution

The company utilizes biometrics, such as fingerprints or voice, to create a secure “lock and key” for users, allowing them to replace traditional passwords with a single passkey stored in the cloud. The startup, focusing on cybersecurity for small and mid-sized businesses, raised funds from angel investors, Techstars, Access Venture Partners, and First In. Dapple plans to enter beta testing for its passkey solution in the summer, aiming for a commercial launch in the fall. 

Read the full article on biometricupdate.com

UK and Japan Sign Cybersecurity Cooperation Memorandum to Fortify Global Digital Resilience

The agreement follows a three-day event hosted by the National Cyber Advisory Board, focusing on global collaboration in cybersecurity. The partnership aims to enhance the relationship between the two nations, building on the Hiroshima Accord commitments to a Global Strategic Partnership.

Read the full article on thepaypers.com

Root Protocol Raises $10 Million in Seed Funding for Web3 Identity Service, Valued at $100 Million

The funding has valued Root at $100 million and included participation from investors such as Signum Capital, Ankr Network, CMS Holdings, and angel investors Tekin Salimi and Meltem Demirors.

Read the full article on coindesk.com

➡ Legal and Regulatory

Texas Man Files $10 Million Suit Against Macy’s and EssilorLuxottica Over Facial Recognition Error

Murphy was falsely accused of armed robbery due to a facial recognition match from low-quality surveillance footage. He was held in jail for nearly two weeks before prosecutors verified he was not present in the state during the robbery.

Read the full article on edition.cnn.com

Google Cloud Abolishes Data Egress Fees, Easing Provider Switching Amid Regulatory Watch

The move aims to simplify cloud pricing and address concerns about anticompetitive practices by eliminating a potential barrier for companies to switch providers.

Read the full article on wsj.com

Shein’s IPO Plans Face Uncertainty Amid China’s Data Probe

As Shein prepares for an IPO in the United States, the Cyberspace Administration of China (CAC) is investigating the fast fashion giant over how it has handled data about its Chinese partners. While such probes are common, China’s Cybersecurity Law is vague, and its severe non-compliance penalties have been catastrophic for businesses.

Read the full article on verdict.co.uk

Bipartisan Bill Aims to Criminalize Nonconsensual Sharing of AI-Generated Intimate Images

Rep. Joseph Morelle (D-N.Y.) has reintroduced the “Preventing Deepfakes of Intimate Images Act,” which punishes the nonconsensual distribution of digitally manipulated personal images. The bipartisan legislation comes in response to an event at Westfield High School in New Jersey in which students shared AI-generated nude photographs of female peers without their knowledge.

Read the full article on wsj.com

Microsoft Targeted by Russian ‘Midnight Blizzard’ Cyberattack, Prompting Security Overhaul

On January 12th, Microsoft announced a cyberattack by a Russian state-sponsored actor against its corporate networks. The attackers compromised a legacy non-production test tenant account via a “password spray attack” and gained access to senior leadership’s emails.

Read the full article on thestack.technology

The post Weekly Industry News – Week of Jan 22 appeared first on Liminal.co.

Friday, 26. January 2024

Farmer Connect

Navigating the Landscape of EU Deforestation Regulation: Challenges and Opportunities

Navigating the Landscape of EU Deforestation Regulation: Challenges and Opportunities In the ever-changing realm of sustainable practices, with a focus on environmental and human rights considerations, the impending European Union regulation addressing deforestation, set to take effect on January 1st, 2025, signifies a crucial turning point. This regulation spans a range of commoditi
Navigating the Landscape of EU Deforestation Regulation: Challenges and Opportunities

In the ever-changing realm of sustainable practices, with a focus on environmental and human rights considerations, the impending European Union regulation addressing deforestation, set to take effect on January 1st, 2025, signifies a crucial turning point. This regulation spans a range of commodities, including coffee, cocoa, palm oil, soy, timber, and rubber, signalling a significant shift in how businesses approach sustainability, particularly in the context of deforestation.


liminal (was OWI)

Building Digital Trust: Consumer-Centric Decentralized Identity Verification

In this State of Identity podcast episode, host Cameron D’Ambrosi chats with Dentity Founder and CEO Jeff Schwartz about his journey from Walt Disney Company to founding Dentity. Focused on decentralized and consumer-centric identity verification, Jeff shares insights from his automotive sector experience and the challenges of online identity verification. The discussion examines the importance […

In this State of Identity podcast episode, host Cameron D’Ambrosi chats with Dentity Founder and CEO Jeff Schwartz about his journey from Walt Disney Company to founding Dentity. Focused on decentralized and consumer-centric identity verification, Jeff shares insights from his automotive sector experience and the challenges of online identity verification. The discussion examines the importance of consumer-driven identity verification, tackling the ‘cold start’ problem in user adoption, and navigating the competitive landscape. Discover what standards, incentives, and collaborative efforts are necessary to revolutionize the identity verification tech space.

The post Building Digital Trust: Consumer-Centric Decentralized Identity Verification appeared first on Liminal.co.


KuppingerCole

Zero Trust Network Access

by Alejandro Leal Zero Trust Network Access (ZTNA) is becoming increasingly essential as organizations adapt to remote work, cloud adoption, and the growing sophistication of cyber threats. Unlike traditional perimeter-based security models, ZTNA treats every user, application, or resource as untrusted and enforces strict security, access control, and comprehensive auditing to ensure visibility an

by Alejandro Leal

Zero Trust Network Access (ZTNA) is becoming increasingly essential as organizations adapt to remote work, cloud adoption, and the growing sophistication of cyber threats. Unlike traditional perimeter-based security models, ZTNA treats every user, application, or resource as untrusted and enforces strict security, access control, and comprehensive auditing to ensure visibility and accountability of all user activities. In this Leadership Compass, we provide an overview of the existing solutions implementing a holistic approach to Zero Trust methodology, enabling secure yet convenient access to business applications and resources for users, regardless of their location. A comprehensive examination of the market segment, vendor service functionality, relative market share, and innovative approaches to providing ZTNA solutions are all contained in this report.

OWI - State of Identity

Building Digital Trust: Consumer-Centric Decentralized Identity Verification

In this State of Identity podcast episode, host Cameron D'Ambrosi chats with Dentity Founder and CEO Jeff Schwartz about his journey from Walt Disney Company to founding Dentity. Focused on decentralized and consumer-centric identity verification, Jeff shares insights from his automotive sector experience and the challenges of online identity verification. The discussion examines the importance of

In this State of Identity podcast episode, host Cameron D'Ambrosi chats with Dentity Founder and CEO Jeff Schwartz about his journey from Walt Disney Company to founding Dentity. Focused on decentralized and consumer-centric identity verification, Jeff shares insights from his automotive sector experience and the challenges of online identity verification. The discussion examines the importance of consumer-driven identity verification, tackling the 'cold start' problem in user adoption, and navigating the competitive landscape. Discover what standards, incentives, and collaborative efforts are necessary to revolutionize the identity tech space.


PingTalk

What is Device Trust? How it Works in 2024

With a rising risk of cybersecurity attacks and data breaches in the corporate world, it's important for companies to incorporate a device trust process to keep networks and customers' personally identifiable information (PII) safe. Implementing industry best practices for device verification is key for online protection. Understand how device trust works, what challenges to consider, and how to inc

With a rising risk of cybersecurity attacks and data breaches in the corporate world, it's important for companies to incorporate a device trust process to keep networks and customers' personally identifiable information (PII) safe. Implementing industry best practices for device verification is key for online protection. Understand how device trust works, what challenges to consider, and how to incorporate the best software to maximize security.

Thursday, 25. January 2024

KuppingerCole

Beyond Secrets Management: Transforming Security in the Digital Age

Join security and identity experts from KuppingerCole Analysts and Entrust to reveal the intrinsic and symbiotic relationship between key management and secrets, offering visibility, compliance assurance, and effective risk management. Learn the importance of safe, repeatable processes around keys and secrets to fortify security in a dynamic cyber landscape.  Martin Kuppinger, Pr

Join security and identity experts from KuppingerCole Analysts and Entrust to reveal the intrinsic and symbiotic relationship between key management and secrets, offering visibility, compliance assurance, and effective risk management. Learn the importance of safe, repeatable processes around keys and secrets to fortify security in a dynamic cyber landscape. 

Martin Kuppinger, Principal Analyst at KuppingerCole Analysts, will explain how stolen credentials became a leading cause of data breaches and how, to tackle this risk for modern cloud-native or hybrid applications, secrets management must be reinvented. He will outline the requirements for managing the entire lifecycle of digital credentials and addressing the needs of various stakeholders. 

Michael Loger, Director of Product Management at Entrust, will provide a new definition of key and secrets management, and an overview of Entrust’s KeyControl solution, highlighting its secrets management capabilities, the KeyControl ecosystem, its unique architecture, and its compliance management dashboard.




Anonym

Is the Cloud Safe Enough to Store Files?

Cloud storage is amazing — and risky The 2000s may be over, but cloud storage is still amazing. Services such as Dropbox, Apple’s iCloud, Google’s Drive, and Microsoft’s OneDrive all help users share files with friends, recover when a hard drive crashes, and move files between their devices. Still, hearing about data breaches[1, 2, 3] […] The post Is the Cloud Safe Enough to Store Files? appeare
Cloud storage is amazing — and risky

The 2000s may be over, but cloud storage is still amazing. Services such as Dropbox, Apple’s iCloud, Google’s Drive, and Microsoft’s OneDrive all help users share files with friends, recover when a hard drive crashes, and move files between their devices. Still, hearing about data breaches[1, 2, 3] leaves people wondering whether their data is safe in the cloud.

Security on today’s cloud lacks end-to-end encryption

Today’s cloud storage services use fairly similar client–server architectures that start with a locally installed application that monitors a specific file folder on a user’s computer. When the app detects changes in the folder, it relays them to the user’s account on the cloud. The cloud service handles copying them to a user’s other devices.
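As a rough illustration of that client-side architecture, the sketch below watches a folder and relays changed files to an upload routine. It assumes the third-party watchdog package, and upload_to_cloud is a hypothetical placeholder for a provider-specific sync call, which the post does not specify.

import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCHED_FOLDER = Path.home() / "MySyncedFolder"  # hypothetical folder name

def upload_to_cloud(path: str) -> None:
    # Placeholder for the provider-specific sync call.
    print(f"relaying change to cloud account: {path}")

class SyncHandler(FileSystemEventHandler):
    # Mirrors the "locally installed application" described above.

    def on_created(self, event):
        if not event.is_directory:
            upload_to_cloud(event.src_path)

    def on_modified(self, event):
        if not event.is_directory:
            upload_to_cloud(event.src_path)

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(SyncHandler(), str(WATCHED_FOLDER), recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)  # keep the watcher alive
    except KeyboardInterrupt:
        observer.stop()
    observer.join()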

One of the problems with cloud storage is that files are not end-to-end encrypted (E2EE): they are not encrypted before leaving a user’s device and kept encrypted until they return. Rather, most providers use the transport encryption + encryption at rest paradigm. In this model, transport encryption protects files sent to the server (e.g., HTTPS), but the server decrypts them upon arrival. Next, the server applies encryption at rest so that only encrypted files are stored. While providers tout the strength of their encryption algorithms (e.g., AES-256), what they don’t highlight is that the server decrypts user files before re-encrypting them, and that they hold the decryption keys!

While this model is efficient, it is vulnerable to attack or service provider bugs and unfortunately is the mainstay of cloud storage. This leaves users wondering: Are cloud providers accessing my data? and Can hackers steal my digital files?  

We can make the cloud safer with Decentralized Identity

Answering those questions is difficult, and the answers vary over time. So, how do individual users protect their files in the cloud? One idea is to help users layer end-to-end encryption (E2EE) on top of any features the cloud storage services provide. This is fairly easy using the cryptographic features of decentralized identity (DI).
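Before getting to DIDComm specifically, a minimal sketch of the general idea, encrypting on the client so that only ciphertext ever reaches the synced folder, might look like the following. It assumes the third-party cryptography package; the simplistic key handling is illustrative only and would in practice be replaced by keys from the user’s DI wallet or another key store.

import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(plaintext: bytes, key: bytes) -> bytes:
    # Encrypt locally so the cloud provider only ever stores ciphertext.
    nonce = os.urandom(12)                    # 96-bit nonce, unique per file
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                 # prepend nonce for decryption

def decrypt_file(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)  # in practice: from a DI wallet/key store
    secret = b"tax-return-2023.pdf contents"
    blob = encrypt_file(secret, key)
    # Only 'blob' is written into the synced folder; the provider never sees plaintext.
    assert decrypt_file(blob, key) == secret

Because the provider only ever receives the nonce-plus-ciphertext blob, neither a server-side bug nor a provider insider can read the file contents.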

The DIDComm messaging specification was created to provide a platform-independent yet interoperable encrypted messaging capability that enables users of a wide range of DI platforms to exchange end-to-end encrypted messages.

This E2EE messaging capability can be used to secure files stored on virtually any cloud storage platform without divulging any plaintext file content to the cloud service. Further, by storing secure files in an encrypted messaging format, those files can later power a myriad of secure file services for security- and privacy-conscious users.

Learn more about this approach in our white paper, The Cloud: Is it Safe Enough to Store Files? or explore a software tutorial at SudoPlatform Lab: Protecting Cloud Storage. Or contact us to discuss how you can use Sudo Platform to apply DI capabilities to your products and services.

The post Is the Cloud Safe Enough to Store Files? appeared first on Anonyome Labs.


YeshID

Release Notes for January 25, 2024

Howdy YeshID Community! New year, New updates to YeshID! We’re coming in hot for the first release of 2024 with a lot of amazing updates that are, as always, designed... The post Release Notes for January 25, 2024 appeared first on YeshID.

Howdy YeshID Community!

New year, New updates to YeshID! We’re coming in hot for the first release of 2024 with a lot of amazing updates that are, as always, designed to make managing your digital workspace easier, more intuitive and more efficient.

Here’s what we just dropped:

🗒️ Access Grid Enhancements:
Sorting and Visibility Improvements: Integrated apps are now prioritized in the access grid, improving navigation and usability. Additionally, new hover and resize behaviors for the access grid’s shadow enhance user experience.
Column Reordering: This feature enables users to rearrange columns in the access grid, allowing for a more customized and efficient workspace.

⚙️ Application Management and Integration:
Application Deletion and Addition: The logic for deleting applications has been improved and moved for better accessibility. There’s also a new flow for adding applications requiring OAuth.
Google Workspace Integration: Fixes and improvements in the integration with Google Workspace ensure more reliable and accurate synchronization.
User and Admin Syncing: Enhancements in syncing admin roles and user identities, including syncing with Google, streamline user management.
Manual Application Addition: Resolved issues related to adding non-integrated applications manually.

✨ User Interface and Usability Improvements:
UI Enhancements: Various tweaks, like minimum bar width adjustments and added spacing in forms, improve the overall user interface.
State Visibility: Improvements in showing the correct state in different modes, like preselected app mode, enhance user clarity.

🏗️ Organizational Structure Visualization:
Org Tree Visualization: Fixes and enhancements in the organizational tree visualization, especially for empty organizations, improve clarity and user interaction.

🔐 Identity and Access Management:
Account Mapping and Reporting: Better handling of identity mappings and integrated application reports, ensuring accurate tracking and reporting.
Group Membership Bugs: Fixes in group membership management enhance reliability in user access control.
New Status Indicators: Introduction of an invited status tooltip for better user status tracking.

📧 Email and Notification System:
Email Template Updates: Changes in the email templates for different scenarios, like user onboarding and access requests, for clearer communication.
Trigger Emails for User Changes: Automatic email notifications for user additions, deletions, or suspensions outside of YeshID.

🤝 Security and Compliance:
Secret Property Handling: Enhanced security measures to prevent the return of secret properties in integrations.
File Type Restrictions: Restricted certain uploads to PNG files only to bolster security.

📋 Task Management and Navigation:
Task View Splitting: The tasks view has been split by completion status for better task management.
Navigation Enhancements for Admins: Improved navigation options for admins, especially those who are new or in the process of being onboarded.

🕺 Miscellaneous:
Year Update in Templates: Updated the year to 2024 in email templates.
General Bug Fixes and Improvements: Various other fixes and improvements, including time display issues and testing adjustments.

See all release notes.

The post Release Notes for January 25, 2024 appeared first on YeshID.


Ocean Protocol

DF73 Completes and DF74 Launches

Stakers can claim DF73 rewards. DF74 runs Jan 25 — Feb 1, 2024 1. Overview Ocean Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by locking OCEAN, curating data, and making predictions (in Predictoor). Here are DF docs. Data Farming Round 73 (DF73) has completed. 150K OCEAN + 20K ROSE was budgeted for rewards. Rewards counting started 12:01am Jan 18,
Stakers can claim DF73 rewards. DF74 runs Jan 25 — Feb 1, 2024.

1. Overview

Ocean Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by locking OCEAN, curating data, and making predictions (in Predictoor). Here are DF docs.

Data Farming Round 73 (DF73) has completed. 150K OCEAN + 20K ROSE was budgeted for rewards. Rewards counting started 12:01am Jan 18, 2024 and ended 12:01am Jan 25. You can claim rewards at the DF dapp Claim Portal.

DF74 is live today, Jan 25. It concludes on Feb 1. 150K OCEAN and 20k ROSE are budgeted in total for rewards.

This post is organized as follows:

Section 2: DF structure
Section 3: How to earn rewards, and claim them
Section 4: Specific parameters for DF74

2. DF structure

Passive DF. As a veOCEAN holder, you get passive rewards by default.

Active DF has two substreams.
– Volume DF. Actively curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), to earn more.
– Predictoor DF. Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

There are three ways to earn and claim rewards: passive DF (like before), Active DF : Volume DF (like before), and Predictoor DF (new).

Passive DF. To earn: lock OCEAN for veOCEAN, via the DF webapp’s veOCEAN page. To claim: go to the DF Webapp’s Rewards page; within the “Passive Rewards” panel, click the “claim” button. The Ocean docs have more details.

Active DF
– Volume DF substream. To earn: allocate veOCEAN towards data assets, via the DF webapp’s Volume DF page. To claim: go to the DF Webapp’s Rewards page; within the “Active Rewards” panel, click the “claim” button (it claims across all Active DF substreams at once). The Ocean docs have more details.
– Predictoor DF substream. To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors. To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs. To claim ROSE rewards: see instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF74

This round is part of DF Main, phase 1.

Budget. This round has 150,000 OCEAN + 20,000 ROSE rewards total. That OCEAN and ROSE is allocated as follows:

Passive DF: 50% of rewards = 75,000 OCEAN
Active DF: 50% of rewards
– Predictoor DF. 50% = 37,500 OCEAN + 20k ROSE
– Volume DF. 50% = 37,500 OCEAN

Networks. Ocean currently supports five production networks: Ethereum Mainnet, Polygon, BSC, EWC, and Moonriver. DF applies to data on all of them.

Volume DF rewards are calculated as follows:

First, distribute OCEAN across each asset based on rank: highest-DCV asset gets most OCEAN, etc. Then, for each asset and each veOCEAN holder:
– If the holder is a publisher, 2x the effective stake
– Baseline rewards = (% stake in asset) * (OCEAN for asset)
– Bound rewards to the asset by 125% APY
– Bound rewards by asset’s DCV * 0.1%. This prevents wash consume.
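Read literally, those bounding rules could be sketched as follows. This is a rough, unofficial interpretation with hypothetical numbers; the actual Data Farming implementation lives in Ocean’s repositories and may differ in detail (for instance, in how the 125% APY bound is pro-rated per weekly round).

def volume_df_rewards(ocean_for_asset, dcv, stakes, publisher):
    # Rough interpretation of the Volume DF bounding rules described above.
    # stakes: dict mapping holder -> veOCEAN allocated to this asset
    # publisher: holder whose effective stake counts double
    WEEKS_PER_YEAR = 52.14
    effective = {h: s * (2 if h == publisher else 1) for h, s in stakes.items()}
    total = sum(effective.values())
    rewards = {}
    for holder, stake in effective.items():
        baseline = (stake / total) * ocean_for_asset        # % stake * OCEAN for asset
        apy_bound = stakes[holder] * 1.25 / WEEKS_PER_YEAR  # 125% APY, one week's share
        dcv_bound = dcv * 0.001                             # 0.1% of DCV (anti wash-consume)
        rewards[holder] = min(baseline, apy_bound, dcv_bound)
    return rewards

# Hypothetical example: an asset allotted 1,000 OCEAN with a weekly DCV of
# 500,000, where bob is both a staker and the publisher.
print(volume_df_rewards(1000.0, 500_000.0, {"alice": 8_000.0, "bob": 2_000.0}, "bob"))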

Predictoor DF rewards are calculated as follows:

First, the DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to distribute these rewards evenly. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards. You can read the details of how Predictoor DF reward amounts work here. Read how to optimize Predictoor rewards here.

Expect further evolution in Active DF: tuning substreams and budget adjustments among substreams. What remains constant is passive DF, and the total OCEAN rewards emission schedule.

Updates are always announced at the beginning of a round, if not sooner.

Appendix: Further Reading

The Data Farming Series post collects key articles and related resources about DF.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress directly on GitHub.

DF73 Completes and DF74 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


2024 Ocean Protocol Data Challenge Championship is Live

Are you the top Data Scientist in the land? Demonstrate your case for 1st place in this year’s data challenge championship season. Introduction This blog introduces the kickoff of the 2024 Ocean Protocol Data Challenge Championship. The first Data Challenge of the year is live on Desights now and ends on Tuesday, Jan 30, 2024. ’24 is the 3rd year of Ocean Protocol-sponsored data science competit
Are you the top Data Scientist in the land? Demonstrate your case for 1st place in this year’s data challenge championship season.

Introduction

This blog introduces the kickoff of the 2024 Ocean Protocol Data Challenge Championship. The first Data Challenge of the year is live on Desights now and ends on Tuesday, Jan 30, 2024. ’24 is the 3rd year of Ocean Protocol-sponsored data science competitions, and this year welcomes season 2 of the championship and leaderboard points. Additional details about the 2023 season can be found in this blog post. Some minor details have been added to or removed from this year’s championship, as presented below:

What’s New This Season?

2023 welcomed over 200 unique data scientists competing in data challenges. To accommodate an increasing number of recurring participants, we have raised the prize pool for each data challenge from $5,000 USD to $10,000 USD. As a result, the 200 OCEAN participation bonus for submitting reports & proposals has concluded.

Starting with the current data challenge (Road to Safety: Traffic Accident Analysis), the $10,000 prize pool will be distributed to the top 10 scored submissions per data challenge. Additionally, leaderboard points for the championship season will be awarded for every challenge, scaled to the top 10 on a given data challenge. Participants outside of the top 10 will not receive points towards the championship season leaderboard. Furthermore, new data challenges will begin roughly on 2 Thursdays of each month, and each will be open for participation for 20 days. The 2022 & 2023 data challenges tested durations between 7 and 30 days; it has been determined that initiatives and hypothesis testing that require longer than 20 days will be tagged and executed as something other than a data challenge (data science competition).

Beyond the program structure, new features and functionalities of Desights.ai continue to roll out regularly. Desights is the application that the Ocean Data Science team uses to conduct data challenges. The platform continues to mature as the web3 platform to crowdsource solutions to AI & ML challenges, business intelligence, applied data science, and predictive analytics.

How It Works

There are 3 points to pay attention to while submitting:

1. You will need a Desights Profile to submit to a data challenge. If you don’t have one, you can follow this simple tutorial to create one: https://docs.desights.ai/guides/creating-desights-profile
2. Before submitting, make sure you have the valid asset URL for your submission. Here’s how to do that: https://docs.desights.ai/guides/checking-if-your-asset-url-is-correct
3. Make sure you submit correctly! Submission is directly via the Desights Platform. Check out our guide on how to submit your solutions: https://docs.desights.ai/guides/submission-to-the-challenge

Important: Make sure you submit your solutions before the deadline. The submission form will not be available once the challenge expires.

Check out the Championship leaderboard, and active + previous data challenges here https://oceanprotocol.com/earn/data-challenges.

2024 Leaderboard & Awards

As outlined above, the reward structure for each data challenge and the end-of-championship season awards have been modified.

The two main pillars of change to bear in mind are: 1) larger recurring cash incentives for the best-quality reports and outcomes, and 2) a more gamified points structure for the year.

An updated structure for the current data challenge and all challenges hosted in the 2024 season is articulated in the image below:

Calendar of Events

Data Challenges sponsored by the Ocean Protocol Data Science team will begin on two Thursdays of every month and end on two Tuesdays of every month.

Each challenge will be open for 20 days to prepare a report, results, or other deliverables defined by the given challenge’s submission criteria.

The 2024 Championship Leaderboard is live and will conclude in December 2024.

Updates to changes in the leaderboard will be published through Ocean Protocol media channels after each data challenge ends and new points are accrued.

Join The Community

The Ocean Protocol Data Science team and other core teams are available in the Ocean Protocol Community Discord & Desights Community Discord channels. Live updates to the Ocean Data Challenge leaderboard and all related initiatives and updates are available via Twitter, the Ocean website, Discord, and the blog.

For questions, comments, and community data science dialogue, reach out in our Discord under the #data-science-hub channel: https://discord.gg/yFRPH9PCN4

Stay tuned for published research, challenge reviews, and all Ocean Protocol updates on the blog page under blog.oceanprotocol.com.

To see past, current, and future data challenges sponsored by Ocean, please visit https://oceanprotocol.com/earn/data-challenges.

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord. Or, track Ocean progress now on GitHub.

2024 Ocean Protocol Data Challenge Championship is Live was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Mar 14, 2024: Decentralized Identity – a Key to Reusing Identity for Improved Security and User Experiences

Decentralized identity has earned a place in identity management for its capacity to increase privacy and security, while improving the user experience. It is fundamental to creating a reusable verified identity which enables numerous use cases such as reusable KYC, proof of employment, remote onboarding and passwordless authentication.

Veramo

Announcing the Veramo Agent Explorer Plugin system


Since its inception, Veramo has been designed as a modular framework for decentralized identity and verifiable data, utilizing a “plugin” architecture. If you need to support a new DID method, a new credential type, or an alternative storage layer, you can simply add a plugin for that functionality into your Veramo agent and continue to use the rest of the agent’s functionality without having to do a bunch of extra integration work.
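
To make the plugin pattern concrete, here is a minimal sketch of a custom agent plugin, assuming only the @veramo/core package; the GreeterPlugin and its greet method are hypothetical illustrations rather than anything shipped with Veramo:

```typescript
import { createAgent, IAgentPlugin, IPluginMethodMap, TAgent } from '@veramo/core'

// The method map this plugin contributes to the agent's API surface.
interface IGreeter extends IPluginMethodMap {
  greet(args: { name: string }): Promise<string>
}

// A plugin is just an object exposing a `methods` map; Veramo merges
// these methods into the composed agent.
class GreeterPlugin implements IAgentPlugin {
  readonly methods: IGreeter = {
    greet: async ({ name }) => `Hello, ${name}`,
  }
}

// Compose an agent from plugins. DID managers, credential issuers,
// storage layers, etc. would be added to the same `plugins` array.
const agent: TAgent<IGreeter> = createAgent<IGreeter>({
  plugins: [new GreeterPlugin()],
})

agent.greet({ name: 'Veramo' }).then(console.log) // "Hello, Veramo"
```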

Now we’ve brought this “plugin paradigm” to the Veramo Agent Explorer.

For those unfamiliar, the Agent Explorer is a React application that allows you to connect to (local, remote, or in-browser) Veramo agents and interact with them through an intuitive user interface. The Agent Explorer allows you to create and manage your DIDs, create and verify credentials, and even communicate with other DIDs through the DIDComm interface.

Now, you can bring your own user experience into the Agent Explorer by creating plugins that can be loaded into this front-end dynamically. Plugins can be as simple or complex as you want, depending on the experience you want to provide. Plugins can provide custom rendering for certain credential types, additional DIDComm message handlers, or even full “windows” into your application or protocol without having to re-implement all the DID, credential, and storage management that the Agent Explorer provides.

Plugins page in “Settings” of Agent Explorer app

Much of the existing Agent Explorer functionality is already implemented through built-in plugins, which you’re free to turn on or off at your discretion.

To develop your plugins, please refer to the Agent Explorer Plugin documentation, available here: https://github.com/veramolabs/agent-explorer/blob/main/packages/plugin/README.md. This page also links to several existing plugins that you might use for inspiration or as a guide for developing your own Agent Explorer plugins.

To celebrate this new functionality, we’re hosting a small hackathon focused on Agent Explorer plugins. For more information, please check this announcement: https://medium.com/veramo/announcing-the-agent-explorer-plugin-hackathon-1c603e9543b3

Announcing the Veramo Agent Explorer Plugin system was originally published in Veramo on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

5 things you’ll learn from reading The Crypto Launderers

Though cryptoassets have existed for 15 years, there are still many misconceptions about the ways the technology intersects with criminal activity, and the implications for society.   



Veramo

Announcing the Agent Explorer Plugin Hackathon


We’re happy to announce that we’re hosting a hackathon for the Veramo builder community to explore solutions that utilize the Agent Explorer Plugin functionality. The hackathon will run from January 25th until February 22nd.

PRIZES

* The top 3 entries, as judged by the VeramoLabs team, will receive 500 DAI (~$500) each.

RULES

* Hackathon entries must be open source and permissively licensed (Apache 2.0 or MIT).
* Entries must be accessible simply by adding a plugin to the Agent Explorer. Your entry can utilize external services (e.g. a cloud agent you run that the plugin communicates with), but users must be able to interact with it from the Agent Explorer plugin alone, with no additional configuration/setup.
* Participants must submit a 1–5 minute video explaining/demonstrating their plugin.

GET STARTED

To get started, review the hackathon details and register here: https://forms.gle/Vz5N4UkN67jXRchd7

If you are new to decentralized identity, review our resources section, where you’ll find suggested developer tools and tutorials. Head to Discord: https://discord.gg/2EbbRMTtRf, introduce yourself, and let us know if you have any questions. If you are looking for teammates, you can also find them there.

Join our opening session happening on the 1st of February at 9 a.m. EST / 3 p.m. GMT+1. Click on this link to set a reminder.

Get hacking! Submissions will open on February 12th, 2024.

WHAT TO SUBMIT

* Include a video (no more than 5 minutes) demonstrating your submission. Videos must be uploaded to YouTube, Loom, or Vimeo and made public.
* Provide a URL to a public code repository.
* Complete the submission form (to be shared) before the deadline.

RESOURCES

1. https://github.com/veramolabs/agent-explorer/blob/main/packages/plugin/README.md

2. https://youtu.be/MyKyVubqeS0

Announcing the Agent Explorer Plugin Hackathon was originally published in Veramo on Medium, where people are continuing the conversation by highlighting and responding to this story.


YeshID

Easier & more secure employee offboarding for small businesses with YeshID


When people talk about onboarding and offboarding, onboarding gets all the attention. We talk about how to make a seamless onboarding experience and get your new employee to work fast. How to provide them with the birthright apps they need to do their job. But we don’t talk about what happens when it’s time to part ways with an employee and what needs to be done then. 

Ensuring proper offboarding is a huge concern for small businesses. When you part ways with an employee, there’s a lot you can forget that can leave you vulnerable: 

* If the recovery email and phone number aren’t reset, the user might be able to regain access to their account.
* If you forget to forward the user’s email, you might miss out on important communication from your customers.
* If you don’t promptly deprovision the user’s accounts from third-party applications, you might be paying more in license costs than you need.

Luckily, it’s not all up to you to remember every step. Here’s how you can handle employee offboarding in YeshID.

Easier employee offboarding in YeshID

To offboard someone in YeshID, select that person from your Organization and, in the sidecar that opens, click ‘Offboard.’ From here, you can choose:

* What action you wish to perform (suspending vs. deleting)
* Who, if anyone, to assign access to the employee’s Google Docs/Drive, Calendar, etc. data
* Who, if anyone, should receive new email sent to the old address
* When you would like the process initiated

As long as you are managing application access in YeshID, we will also create subtasks for applications which need to be deprovisioned. This is a big differentiator from how you can manage this process in the Google Workspace console. Offboarding directly in the console only turns off “sign in with Google” accounts for the user. But what about accounts that use a work email and password? YeshID helps you track all application access to make sure the user is offboarded from any company applications they had access to.

Now let’s dive into each option. 

Offboarding with the intention to suspend

This is a good option if you want to remove access for a user but don’t want to completely delete them. For example, maybe the user was privy to critical business data that you don’t want to risk losing, or maybe it’s a seasonal contractor that you plan to reinstate later in the year. 

Here’s what happens when you suspend a user in YeshID:

* The user’s data will be kept, but they won’t receive emails, calendar invitations, or files as long as they’re suspended (note: this does require you to maintain a Google Workspace license for the user).
* Once you specify that the user is suspended, you will be prompted to create a task list to offboard the user from the applications they have access to.
* Emails are sent to the application administrators, prompting them to remove access for the user at the specified time.
* You can “unsuspend” a user at any time, as long as they’re not deleted.

Offboarding with the intention to delete

This is a good option if you know that you want to remove the user entirely. When you delete a user, YeshID will:

* Reset the Google Workspace password, which will revoke any application-specific passwords tied to the user’s account
* Log the user out of every Google session across all of their devices by invalidating their session cookies
* Delete the user’s account recovery email and phone numbers
* Revoke all OAuth grants associated with the user’s account
* Prompt you to create a task list to offboard the user from the applications they have access to
* Send emails to application administrators, prompting them to remove access for the user at the specified time
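
As a rough illustration (ours, not YeshID’s actual implementation), the automated steps above map onto the Google Workspace Admin SDK Directory API roughly as follows, assuming the googleapis Node client and a service account with domain-wide delegation; the suspend path described earlier would instead be a single users.update call setting suspended: true:

```typescript
import { google } from 'googleapis'
import * as crypto from 'node:crypto'

// Hypothetical sketch of the automated delete-path steps. Assumes a
// service account with domain-wide delegation and these admin scopes.
async function offboardUser(userKey: string): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    scopes: [
      'https://www.googleapis.com/auth/admin.directory.user',
      'https://www.googleapis.com/auth/admin.directory.user.security',
    ],
  })
  const directory = google.admin({ version: 'directory_v1', auth })

  // Reset the password (this also invalidates app-specific passwords)
  // and clear recovery email/phone so the user cannot regain access.
  await directory.users.update({
    userKey,
    requestBody: {
      password: crypto.randomBytes(32).toString('hex'),
      recoveryEmail: '',
      recoveryPhone: '',
    },
  })

  // Log the user out of every Google session on all devices.
  await directory.users.signOut({ userKey })

  // Revoke all OAuth grants associated with the account.
  const { data } = await directory.tokens.list({ userKey })
  for (const token of data.items ?? []) {
    await directory.tokens.delete({ userKey, clientId: token.clientId! })
  }

  // (Suspend path: users.update with { suspended: true } instead.)
}
```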

When you delete a user, YeshID will also create a list of tasks that an admin must manually complete. While the above steps will happen automatically, these steps require confirmation from the admin to continue: 

* Confirm that they have transferred all email from the account they wish to save, by using the Google Data Migration Service.
* Initiate any data transfers that were configured in the YeshID offboarding dialog (Google Docs/Drive, Calendar, etc.).
* Delete the account in Google Workspace.
* Set the old email address as an alias on another account so you can continue receiving new email for the old account.

It’s worth noting that the last step here, creating an alias, can’t be done until you delete the user. Then you have to remember to come back to create the alias. It’s an easy step to forget — thankfully you have that task reminder in YeshID. 😉

Easier offboarding with technology

The task of offboarding a user contains many distinct steps, each of which can lead to issues if forgotten. Why take the risk? YeshID uses ready-made procedures that ensure you follow offboarding best practices and keep your environment safe and running smoothly. See how easy it can be.

The post Easier & more secure employee offboarding for small businesses with YeshID appeared first on YeshID.

Wednesday, 24. January 2024

Holochain

Navigating Tension and Discomfort in Global Collaboration

#HolochainChats with Catherine Stihler

A context note from Mary Camacho:

As the executive director of an organization exploring new models of human synergy and coordination, I often think about how we can enable more inclusive collaboration globally.

So it was thrilling to have the chance to speak with Catherine Stihler, whose decades of experience in geopolitical relations has given her deep wisdom.

Our wide-ranging conversation covered many aspects of policy, technology, and governance. Fundamentally, it centered on a key question — how do we bridge the many divides that hinder human cooperation and collective problem-solving? 

There were no simple answers, but identifying the barriers provided inspiration to keep striving for coherence across cultures. By exploring the intersections of technology, regulation, and how groups communicate, we identified commonalities underlying collective problem-solving. What follows is a summary of key observations from the conversation.

Cultural Perspectives Complicate Collaboration

Successful collaboration requires navigating varied mindsets and worldviews — a complex challenge. As Stihler noted, “How do you culturally navigate spaces and places?”

Her multinational career has shown that coherence demands understanding perspectives. Within the EU parliament alone, discussion happens in over 20 languages, demonstrating the diversity of communication styles. Even basic terms can mean different things across cultures.

Friction naturally occurs when bringing together people with different assumptions and motivations. As Stihler experienced, this tension is integral, if sometimes uncomfortable.

During negotiations, she focused on the ultimate policy aims that transcended cultural barriers. She focused on finding common purpose-centered compromises. Still, bridging divides can be challenging, given subjective identities and priorities. We must accept messiness in sharing space.

Keeping communication open despite discomfort aids progress. And the groups who govern play a key role in establishing incentives towards mutual understanding.

Inclusive Governance Requires Uncomfortable Coherence

Governance and policy processes necessitate navigating conflicting interests to reach collaborative decisions. As noted, this process contains inherent “tension” and “discomfort.” When representing constituencies with varying needs and values, perfect solutions fail.

Progress means reconciling disagreements. 

Stihler recounted challenging EU regulations where health priorities competed with economic ones. Compromising entailed painful sacrifices on multiple sides. However, with shared underlying aims, adversaries transformed into collaborators united by a common purpose.

While universal access to collaboration and information is ideal, Stihler noted that some limitations can spur healthy competition, stating:

“Not all barriers are bad. Some can encourage innovations.”

For example, within a single economic market, preventing total domination by one company allows space for newcomers and alternate models like co-ops. Rules ensure stability while still providing openings for diverse contributors.

Overcoming Persistent Digital Divides

Despite the connectivity promised by digital tools, barriers still limit access and divide users. As Stihler explained, restrictions like “geo-blocking” prevent sharing media across country borders, hindering collaboration. While the EU’s “Digital Single Market” intended seamlessness, gaps remain, frustrating users across member states and other countries.

For non-EU countries, these false boundaries are especially visible. From a UK or US perspective, the borders seem technical and arbitrary, given the ability to instantly share data globally. However, preventing access contradicts the spirit of open digital space. We must keep pioneering tools giving users agency and accessibility.

While some limitations aim to encourage market competition, others serve little purpose beyond denying availability. More and more people cross borders because families, friends, and now businesses are moving, and the digital tools they use evolve alongside this change. This is a sweet spot for Holochain, because its architecture connects groups through the consent of users and the apps built with it, rather than through centralized server hosting.

Bridging Divides Through Understanding

Technology alone cannot solve barriers to cooperation stemming from varied cultural perspectives and incentives. However, frameworks that help build bridges and foster mutual understanding can assist with reconciliation. 

As the EU demonstrates, keeping communication open across cultural divides provides critical conduits for compromise, even if tension persists. Institutions play a vital role in sustaining conduits for exchange.

While universal inclusion remains an ideal, moderate limitations sustaining diversity also prove essential for progress by preventing stagnant monopolies and stimulating competition. With open, equitable rules of engagement, such friction engenders innovation rather than isolation.

As remote and hybrid work evolve, integrating the magic of embodied teamwork with virtual capabilities remains imperative to human progress. For it is amidst the messiness of the human experience that the future unfolds.

If you enjoyed this article, feel free to share it or drop a comment and share your thoughts. We want to hear from you!


1Kosmos BlockID

What is MFA Fatigue and How Can Your Business Combat it?


Multi-factor authentication (MFA) has emerged as a pivotal tool in cybersecurity, a security key to bolstering the fortifications guarding sensitive information and systems. Essential to comprehending the broader discussion on MFA security is an understanding of the phenomenon termed “MFA Fatigue.” This concept encapsulates the exhaustion and inconvenience experienced by users due to repetitive and cumbersome multi-factor authentication and processes. Addressing this issue is instrumental in cultivating a cybersecurity environment that is both secure and user-friendly.

What is MFA Fatigue?

MFA Fatigue is a multifaceted challenge with various contributing factors. The core of the issue lies in the repetitive and often cumbersome login processes that users must navigate to authenticate their identities across devices, usernames and passwords, and accounts. This can lead to exhaustion and frustration, diminishing the overall user experience and potentially leading to lax security practices.
An in-depth examination of MFA Fatigue requires exploring its manifestations and impacts on user behavior. Recognizing the signs and symptoms is essential to preemptively addressing potential challenges and mitigating risks. It is crucial to analyze how MFA Fatigue influences user interactions with security protocols and its subsequent impact on overall cybersecurity hygiene.

How does an MFA Fatigue Attack Start?

MFA Fatigue attacks begin by exploiting users’ weariness from engaging with multiple, repetitive authentication processes. Attackers anticipate that tired and frustrated users are more likely to make mistakes or bypass security protocols, making it easier to execute attacks like phishing or account hijacking.
The attacker’s goal is to take advantage of these moments of vulnerability, where legitimate users might overlook suspicious activities or ignore security alerts because they desire a more straightforward authentication process.
In these attacks, adversaries mimic legitimate authentication requests or login attempts and create a sense of urgency that forces users to act quickly and without much thought. Users already fatigued by numerous authentication steps are more prone to fall for these tactics, giving attackers the access or information they seek and compromising the security defenses set by the MFA processes.

What is an example of an MFA fatigue attack?

One typical example of an MFA Fatigue attack is a phishing scheme where the attacker impersonates a trusted service with which the user frequently interacts, such as an email provider or a corporate system.
The user receives a message urging them to log in and confirm their identity by clicking a link. Since users often encounter multiple authentication requests, they might proceed without thoroughly evaluating the request’s legitimacy, potentially exposing sensitive information.
Another example could be an attacker exploiting users’ familiarity with authentication processes by creating a spoofed login page. Users, tired of repeatedly entering credentials and going through MFA procedures, may not meticulously check the URL or the page’s security, entering their information into a fraudulent site, which then captures their credentials and potentially bypasses MFA protections.

Is MFA fatigue social engineering?

MFA fatigue indirectly fosters a conducive environment for social engineering attacks. Social engineering involves manipulating individuals into divulging confidential information or performing actions that compromise security.
MFA fatigue contributes to this by making users more susceptible to attacks due to their weariness and frustration from constant authentication requests. Fatigued users might be less vigilant and more willing to comply with unusual requests, thinking it is just another part of the authentication process.
However, it’s important to clarify that MFA fatigue is not itself a social engineering attack. It is a state of user exhaustion and frustration caused by repetitive MFA processes, which attackers exploit using social engineering techniques, like phishing or pretexting, to deceive users into lowering their defenses and revealing sensitive information or access.

Evolving Landscape of Cybersecurity and MFA

Tracing the trajectory of MFA technologies provides valuable insights into their current state and future directions. Initially, MFA emerged as a groundbreaking approach to secure authentication, offering robust defenses against unauthorized access. However, as cyber threats have evolved, so have the demands on MFA technologies to provide enhanced security without compromising usability.
The present landscape presents a crucible of challenges and opportunities. Technological advances and a heightened threat environment necessitate continuous evolution and adaptation of MFA strategies. Organizations must be agile and responsive to maintain the integrity of sensitive data and their security postures while mitigating MFA Fatigue.

Technological Innovations Targeting MFA Fatigue

Emerging technologies present promising avenues for alleviating MFA Fatigue. Innovations continually reshape the landscape, aiming to simplify authentication processes without compromising security. Evaluating the effectiveness of these technologies and their reception by users is integral to understanding their role in combating MFA Fatigue.
Many technologies, such as biometrics and adaptive authentication, have been heralded as transformative in enhancing user experience. By staying abreast of these technological trends and their implications, organizations can make informed decisions that bolster their cybersecurity while easing user fatigue.
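As a concrete illustration of the adaptive-authentication idea, the sketch below suppresses further push prompts once a user has denied several recent requests and falls back to a stronger, phishing-resistant factor; the thresholds, in-memory store, and function names are our own assumptions, not any particular vendor’s implementation:

```typescript
// Sketch: adaptive throttling of MFA push prompts to blunt push-bombing.
interface PushAttempt {
  timestamp: number
  approved: boolean
}

const WINDOW_MS = 15 * 60 * 1000 // only consider the last 15 minutes
const MAX_DENIALS = 3            // after this many denials, stop pushing

const history = new Map<string, PushAttempt[]>()

// Record the outcome of each push prompt (approved or denied).
function recordAttempt(userId: string, approved: boolean): void {
  const attempts = history.get(userId) ?? []
  attempts.push({ timestamp: Date.now(), approved })
  history.set(userId, attempts)
}

// Choose the next challenge: keep using push while the user is responsive,
// but switch to a stronger factor (e.g. number matching or a FIDO2 key)
// once recent denials suggest a fatigue attack is underway.
function nextChallenge(userId: string): 'push' | 'stronger-factor' {
  const cutoff = Date.now() - WINDOW_MS
  const recentDenials = (history.get(userId) ?? []).filter(
    (a) => a.timestamp >= cutoff && !a.approved,
  ).length
  return recentDenials >= MAX_DENIALS ? 'stronger-factor' : 'push'
}
```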

Legal and Compliance Considerations in MFA

In the labyrinth of multi-factor authentication (MFA), legal and compliance considerations serve as pivotal guideposts. These elements delineate organizations’ boundaries and obligations when implementing and managing MFA processes. The legal landscape for multi-factor authentication encompasses various facets, including data protection regulations, user rights, and industry-specific compliance mandates.
Understanding and adhering to legal and compliance considerations are non-negotiable facets of responsible MFA deployment. Organizations must stay abreast of legislative developments, regulatory requirements, and industry best practices. This adherence is instrumental in safeguarding organizational integrity, user trust, and the overall robustness of cybersecurity defenses.

User Experience (UX) Design Principles for MFA

Incorporating User Experience (UX) design principles into MFA strategies heralds a reimagined approach focused on user-centricity. This involves designing MFA processes that prioritize ease of use, intuitive interaction, and overall user satisfaction. Effective UX design in MFA seeks to reduce the complexities and frictions that contribute to user fatigue and dissatisfaction.
By leveraging UX design principles, MFA processes can be transformed into seamless user journeys that harmonize security and usability. Such a design ethos fosters positive user interaction with security protocols, encouraging compliance and enhancing the overall effectiveness of authentication processes.

Organizational Strategies for Implementing MFA

Deployment strategies within organizations play a critical role in the reception and effectiveness of MFA processes. Thoughtful implementation involves careful planning, stakeholder engagement, and continuous improvement mechanisms. Crafting strategies that consider organizational dynamics, technical infrastructures, and user needs is paramount.
Adoption and usability are key considerations in the strategic deployment of an MFA system. Organizations must foster environments supporting user education and adaptation, providing necessary resources, training, and support. Such comprehensive strategies pave the way for the successful integration of MFA as a robust and user-friendly component of organizational cybersecurity.

Continuous Improvement: Analytics and Feedback

Embracing a philosophy of continuous improvement propels MFA strategies toward evolving excellence. Utilizing analytics offers a lens into user interactions, behaviors, and challenges within the MFA processes. With user feedback, analytics forge a pathway to insightful enhancements and refinements.
Feedback loops, encompassing user insights and analytical data, become the bedrock for informed decision-making. They facilitate the identification of areas for improvement, user challenges, and opportunities for optimization. By nurturing a culture of continuous improvement, organizations can maintain MFA processes that are both contemporary and user-centric.

How 1Kosmos BlockID Helps Combat MFA Fatigue

1Kosmos BlockID simplifies the user experience by revolutionizing identity verification processes, posing a potent solution to MFA fatigue. Its intuitive design facilitates self-service identity verification with over 99% accuracy, streamlining user onboarding and ensuring secure access. Using various identification methods such as LiveID, FaceID, and government-issued IDs, BlockID allows for flexible and instant identity assertion, minimizing the hassle associated with multiple authentication factors and reducing the dependency on passwords and one-time codes.
Incorporating BlockID in the user authentication process significantly reduces traditional MFA methods’ complexity and time-consuming nature. It differentiates between genuine users and imposters, thus safeguarding against identity fraud while promoting a seamless, efficient, and user-friendly digital interaction. BlockID’s transformative approach aims to provide users with a balance of speed, convenience, and utmost security, effectively combatting MFA fatigue.
Apart from reducing MFA fatigue, BlockID also strengthens security infrastructures by:

* Biometric-based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.
* Identity Proofing: BlockID provides tamper-evident and trustworthy digital verification of identity – anywhere, anytime and on any device with over 99% accuracy.
* Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.
* Distributed Ledger: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, encrypts digital identities, and makes them accessible only to the user. The distributed properties ensure there are no databases to breach or honeypots for hackers to target.
* Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.
* Industry Certifications: Certified to and exceeding the requirements of NIST 800-63-3, FIDO2, UK DIATF and iBeta PAD-2 specifications.

To learn more about the 1Kosmos BlockID solution, visit the platform capabilities and security feature comparison pages of our website.

The post What is MFA Fatigue and How Can Your Business Combat it? appeared first on 1Kosmos.


Indicio

Newsletter Vol 70


Now on LinkedIn! Subscribe here

Verifiable Credentials: A look ahead at 2024 — Spotlight: Travel and Hospitality
 

SITA’s Senior Lead Solution Architect Michael Zureik joins Indicio CEO Heather C. Dahl and VP of Communications and Governance Trevor Butterworth for a look at what 2024 will bring for verifiable credentials!

Read more

How Verifiable Credentials Can Handle the Threat of Deepfakes in KYC

It’s only a matter of time before generative AI renders current methods of identity verification for financial transactions useless. In this article we take a look at how verifiable credentials can offer a secure way to be confident in online interactions in the age of artificial intelligence.

Read more

Trends in Decentralized Identification to Watch for in 2024

2024 will be a huge year for verifiable data. As the industry continues to develop, our team calls some interesting trends to your attention and puts together some predictions for what we think will happen this year.

Read more

Indicio’s 2023 Roundup

As we begin 2024, Indicio takes a look back at some of the most noteworthy articles and milestones of the past year.

Read more

Identity Insights — PhocusWire’s Movers and Shakers of 2023
 

PhocusWire’s Mitra Sorrells joins Indicio’s Trevor Butterworth for a conversation about Phocuswire’s recent article on the movers and shakers of the travel industry in 2023.

Watch the video

Want to see more weekly videos? Subscribe to the Indicio YouTube channel!

Upcoming Events

 

Here are a few events in the decentralized identity space to look out for.

* Identity Implementors Working Group 2/8
* DIF DIDComm Working Group 1/29
* Aries Bifold User Group 1/30
* TOIP Working Group 1/30
* Hyperledger Aries Working Group 1/31
* Cardea Community Meeting 2/1

The post Newsletter Vol 70 appeared first on Indicio.


IDnow

Mobility as a Service and digital identity verification—a partnership.


Over the years, the ways in which we get from A to B have changed based on how the mobility industry has developed as well as major world events impacting travel choices and decisions. However, finding the fastest and easiest way to get to our final destinations still remains the major priority. But not only do we want fast and easy travel but also convenience paired with cost savings, and we want to be able to know all our options.

Shared mobility options such as car sharing and micromobility will continue to grow over the next 10 years due to increasing environmental regulations. There are plenty of choices for consumers, but how can they find the best one for their immediate needs?

Enter Mobility as a Service (MaaS)—a one-stop traveling shop, which takes every type of transportation into account and provides its users the best optimized route to their destination. A concept focusing on a “bundled service model,” consumers are in a sense buying into mobility, instead of investing in transport equipment such as personal vehicles.

With the global Mobility as a Service market projected to grow from $236.42 billion in 2022 to $774.93 billion by 2029, companies need to bear in mind where MaaS is headed and how they can become part of this growing movement.

What is Mobility as a Service (MaaS)?

Mobility as a Service (MaaS) is an emerging trend that is transforming the way we use and access transportation. The overarching idea of MaaS is that it is an integration of various forms of transport and transport-related services into a single, comprehensive and on-demand mobility service. Basically, everything at the touch of a button. This in effect allows users to access all types of transportation (walking, car sharing, public transportation, micromobility, etc.) in one application with a single payment channel. Say goodbye to the multiple ticket purchasing stress.

Instead, services are becoming consolidated by having one application for all their travel needs.

Why is MaaS important?

The importance of MaaS lies in the concept itself. By reducing the inconveniences associated with accessing individual transportation services, MaaS provides a single travel service from planning to booking to paying, creating a more fluid and integrated experience of transportation. This results in less congestion, fewer emissions and lower operating costs, all while providing a more accessible and inclusive journey for individuals with limited mobility. Plus, MaaS allows for the integration of smart city data to create efficient and cost-effective traffic management systems. In the long run, MaaS holds great promise for improving accessibility, affordability, sustainability and efficiency in our transportation networks.

Additionally, every person has their own unique needs based on their job, interests, financial profile, physical capabilities, behaviors, relationships, personal preferences and more. Putting these all together creates a specific identity that can be tailored to when it comes to transportation and exactly what MaaS will provide since alternative transport options will be available in a one-stop shop.

In short, MaaS could be the final puzzle piece needed to make urban life smoother, faster and cheaper for everyone. 

What are Mobility as a Service examples?

It can sometimes be confusing to work out which apps and operators fall under the MaaS category. Some providers, such as ride-hailing apps (Uber) and peer-to-peer rental services (GoGet & FlexiCar), are currently good examples of MaaS that focus on one or two areas. However, with an expected CAGR of around 7.43% over the next five years and Asia-Pacific the fastest-growing market, major players in the MaaS industry include Enterprise, Hertz and Moovit.

However, the future of MaaS looks toward apps that can provide all services in one place, such as Whim, based in Helsinki but also available in the UK, Belgium, Austria, Japan, etc. Whim combines more than 2,500 taxis, car rentals and public transport options, helping users plan and book better journeys through smartphone technology. This can also be seen from the business side as well. UK-based company Mobilleo is a Mobility as a Service app specifically for businesses. Whether a self-employed business or a fleet manager, Mobilleo combines thousands of travel providers into one app to help businesses find, book and pay for their journey in one solution.

Digital identity: the building block of MaaS.

But what does this mean exactly for users and operators? Well, for one it means the importance of data. In order to provide what consumers are looking for, data must be shared. However, security concerns arise for both the consumer and operator when it comes to data collection, and these could keep MaaS from growing. Consumers fear how attractive this service would be for fraudsters, while also doubting the legitimacy of ID verification platforms. On the other hand, operators worry about the challenge of proper driver’s license verification and preventing minors and fraudsters from using their services.

In a way, the root of the problem is the lack of security on both ends and the possibility of the mishandling of data, which would put the fate of MaaS in jeopardy. A strong framework for data sharing and identity verification therefore needs to be put in place in order for MaaS to be successful. But there’s really no need to worry, because there is a solution.

The missing link: security through identification.

The key to this new service is what connects the various modes of transportation and their service operators in order to support MaaS and provide a smooth multi-leg and multi-modal journey for customers. But what exactly is the key? Simple. Digital identity verification.

With all transportation options listed in one area, having digital identity verification is crucial so providers know who is using their services. Plus, providers want to offer the best service to their customers (in hopes that you will use them again) and can do that by way of having the digital identities of its users. But consumers also benefit by seeing what the mobility providers can offer them. Basically, it’s a two-way street where both parties can be satisfied.

But why exactly is digital identification necessary? Not only does it provide consumers and companies the benefits of this two-way street, but it offers trust, which is the foundation of any relationship.

Because MaaS is a platform displaying various modes of transport, there are some services that will need to know if a customer possesses a valid driver’s license or is the appropriate age to use their application. Additionally, any leasing contracts for car-sharing also can require a qualified electronic signature or proof of bank account. But no matter what they may require, a one-time, fast, seamless and automated identity verification is all that is needed.

Plus, as more and more services become digitalized, companies will offer the opportunity for users to store their digital identities in a digital wallet, so verification can become even more simplified and efficient while traveling. Since MaaS is housed all in one application, all needed documents, tickets, etc. will be in one place while on-the-go. Travel will be like never before.

Convenience is the goal.

Though still a new concept, MaaS will become more popular in the future due to its main function. It provides convenience for all its users. Mobility doesn’t need to be a hassle or stressful and people don’t want it to be. Instead, it should be easy and smooth. MaaS provides exactly that with the assistance from digital identity verification.

Not only is MaaS convenient, but so is proving your identity. IDnow offers automated solutions which require only a quick photo snap of your ID or driver’s license. Even a qualified electronic signature can be added when necessary. With a process that only requires a few minutes, users can do so from on-the-go, wherever they may be. IDnow is the solution for knowing customers’ true identities so they can carry on with their journey.

Read on: Fueling the demand of convenient mobility

A partnership leading us into tomorrow.

In today’s growing mobility sector, having a digital identity is essential for both consumers and businesses. Both want to authenticate the other and feel secure in their relationship. Thus, these digital identities are at the core of MaaS since it is a one-stop shop. It truly is a digital identity solution partnership that will lead us into the world of tomorrow’s transportation.

Interested in finding out more about the future of mobility and how to become part of this transition into the future? Check out our guide on Mobility: The New Driver Experience.

By

Kristen Walter
Junior Content Marketing Manager
Connect with Kristen on LinkedIn

Mobility Guide

Identity verification for the new driver experience. Get an overview of where the mobility industry is headed and the ID verification products that will assist in this new transition. Get your free copy

Trinsic Podcast: Future of ID

Jacques von Benecke: Launching National-Scale Self Sovereign Identity in Bhutan


In this episode, we talk with Jacques von Benecke, CTO of DHI, the commercial and investment arm of the government of Bhutan. If you haven’t heard, Bhutan has launched one of the most complete SSI ecosystems in the world! We spent most of the time diving in to how it’s going since launching just a few months ago.

We get into metrics, like how many users have onboarded, and the growth rate over time. We cover use cases, business models, governance, and more! We even cover how they productize wallets for accessibility, and we were amazed to hear they have iOS, Android, and web… but they have 11 other wallet configurations they’ve had to build to cover the whole population!

Most of all, it was really interesting to see what can happen when you have a top-down mandate to develop better digital identity and a strong technology approach to execute.

To learn more about Bhutan’s National Digital Identity initiative you can visit Bhutan NDI or listen to Jacques' interview on the (un)Trustables podcast.

Subscribe to our weekly newsletter for more announcements related to the future of identity at trinsic.id/podcast

Reach out to Riley (@rileyphughes) and Trinsic (@trinsic_id) on Twitter. We’d love to hear from you.


IDnow

Time to grow up? Crypto and fintechs rack up more AML fines than traditional financial services.

In 2023, new entrants to the financial system reached a monumental milestone, for all the wrong reasons.

Last year, for the first time ever, crypto firms and fintechs received more fines than traditional financial services. 

According to data analysed by the Financial Times, crypto and digital payment companies paid an eye-watering $5.8 billion in fines due to a multitude of reasons, including non-compliance in anti-money laundering checks and failing to uphold sanctions and other financial crime issues.

Findings and failings.

In total, crypto firms recorded 11 separate fines in 2023 compared to an annual average of just two over the last five years. Meanwhile, payments firms recorded 27 fines, which was a huge increase on their annual average of five per year from 2018 to 2022. 

Most fines levied against payments groups went to fintechs that were less than 20 years old.

This is a surprising shift in the global payments space, and it just goes to show that such punishments can be levied against even the most digitally savvy sectors.

Rayissa Armata, Director, Global Regulatory & Government Affairs at IDnow

“Gone are the days where bad behavior will be tolerated by newcomers to the financial industry. With major players like Blackrock recently launching its bitcoin ETF, which boasts huge compliance operations as well as general risk aversion, the industry is likely, and will need to, grow up quickly in 2024 and beyond,” added Rayissa.

The main contributor to the astronomical $5.8 billion in fines was the $4.3 billion penalty levied against crypto exchange Binance, which was set “purposefully high.”

The mammoth fine was attributed to:

Violations of the Bank Secrecy Act (BSA)
Failure to register as a money transmitting business
Violations of the International Emergency Economic Powers Act (IEEPA)

“Binance became the world’s largest cryptocurrency exchange in part because of the crimes it committed – now it is paying one of the largest corporate penalties in US history. The message here should be clear: using new technology to break the law does not make you a disruptor, it makes you a criminal,” said Attorney General Merrick B. Garland. 

Secretary of the Treasury Janet L. Yellen was even clearer in her damnation of the crypto platform, stating how Binance’s “wilful” failures to adhere to AML and sanctions compliance allowed money to flow to terrorists, cybercriminals, and child abusers through its platform. 

Of course, this was not the only wake-up call that should have set off alarm bells in the crypto community in 2023.  

The much-publicized decision to convict the founder of FTX of money laundering and fraud sent further shockwaves through the industry. Despite such turbulence, there were also significant steps taken to stabilize crypto in 2023, such as the Market in Crypto Assets regulatory framework (MiCA), the Transfer of Funds regulation (TFR), and the UK’s proposed regulatory regime, which neared ever closer to completion. As the regulations are finally likely to be enforced in 2024, this could not happen at a more crucial time.

Discover which trends are likely to affect the crypto (and gambling and financial services) industry in 2024.

2024 may very well prove to be a watershed year for the crypto industry. It’s likely that the upcoming crypto regulations will have a positive impact and act as a stabilizing force, while the fines should serve as a deterrent and remind crypto firms that AML laws and regulations apply to them too.

Rayissa Armata, Director, Global Regulatory & Government Affairs at IDnow 

“However, there’s also a very real chance that firms, especially new entrants to the crypto and fintech market that are in a rush to establish themselves and onboard as many customers as possible, may bypass certain steps in customer controls. This is not advisable. KYC processes, and rigorous identity checks are in place for a reason – to protect the business and the customer.”

Light at the end of the tunnel?

Besides the aforementioned regulatory frameworks due to be enforced later this year, 2024 has already started with some reassuring developments regarding crypto regulation. Chief among these was guidance issued by the European Banking Authority (EBA), which extended its guidelines on money laundering and terrorist financing to crypto asset service providers.  

The European Council and European Parliament also recently announced that they had reached a provisional agreement on the decision to create the brand-new Anti-Money Laundering Authority (AMLA) – a European authority with the expressed purpose of countering money laundering and the financing of terrorism. Technical negotiations are expected to continue for months, with a tentative aim of formally adopting the final agreement in 2024.

Proposals put forward include: 

Requirements for financial institutions to conduct enhanced due diligence on wealthy individuals, or on transactions involving high amounts.
Enhanced due diligence for cross-border crypto transactions. 

While obliged entities, including financial institutions, banks, real estate agencies, asset management services, casinos, and merchants, traditionally enforce AML requirements like KYC, the new rules now cover most of the crypto ecosystem. All crypto-asset service providers will be required to conduct due diligence on customers, including verifying personal KYC data, with such data required to accompany transfers under the Travel Rule when transactions are €1,000 or more.
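
As a toy illustration of that threshold logic (our own sketch; field names and data shapes are assumptions, not a regulatory schema), a CASP-side check might gate transfers like this:

```typescript
// Sketch: require Travel Rule KYC data on transfers of €1,000 or more.
interface PartyKyc {
  fullName: string
  walletAddress: string
  // ...further verified attributes (date of birth, address, etc.)
}

interface Transfer {
  amountEur: number
  originator?: PartyKyc
  beneficiary?: PartyKyc
}

const TRAVEL_RULE_THRESHOLD_EUR = 1000

function travelRuleDataRequired(t: Transfer): boolean {
  return t.amountEur >= TRAVEL_RULE_THRESHOLD_EUR
}

function validateTransfer(t: Transfer): void {
  if (travelRuleDataRequired(t) && !(t.originator && t.beneficiary)) {
    throw new Error(
      `Travel Rule: originator and beneficiary KYC data must accompany transfers of €${TRAVEL_RULE_THRESHOLD_EUR} or more`,
    )
  }
}

// validateTransfer({ amountEur: 1500 }) // throws: KYC data missing
// validateTransfer({ amountEur: 500 })  // passes: below the threshold
```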

Stronger together.

In February 2024, we joined a consortium of five partners, including the IOTA Foundation, walt.id, SPYCE.5, and Bloom Labs, with the aim of making Crypto Asset Service Providers (CASPs) and self-hosted wallets compliant with the European Anti-Money-Laundering Regulation and the Transfer of Funds Regulation (TFR). [Read more about TFR in our blog, ‘EU extends TFR to crypto, requiring KYC for all crypto transactions.’]

A major challenge for CASPs in adhering to the new rules lies in GDPR compliance, as personally identifiable information (PII) should not be stored on blockchains or Distributed Ledger Technologies (DLT). However, to comply with the new regulations, CASPs need to know with whom they are doing business and continuously verify this information.

To address this challenge, the consortium has proposed a system where a trusted party tokenizes an identification process, allowing CASPs to have confidence in this process, without revealing any PII.

Discover more about the consortium by reading the press release and watching the explainer video below.

Putting the KYC into crypto and fintech.

While KYC processes are an integral part in ensuring crypto exchanges can protect themselves and their customers from fraud and money laundering, they are also an important requirement to comply with AML, MiCA and other crypto-related regulation. Having these controls in place will protect investors from financial losses and add stability to a notoriously volatile market. 

“The bottom line is that KYC and Know Your Business is at the nexus of this world and people are catching up to this reality – you have to know who is out there – as people go on or off chain or in and out of the traditional banking world,” warned Rayissa. 

IDnow’s highly configurable identity verification solutions work across multiple regulations, industries and use cases, including crypto and fintech. Whether automated or expert-assisted, our online identity verification methods have been optimized to meet the strictest security standards and regulatory requirements without compromising on customer conversion or consumer experience.

By

Jody Houton
Senior Content Manager at IDnow
Connect with Jody on LinkedIn



Tokeny Solutions

Year of Tokeny: 2023’s Milestones & 2024’s Tokenization Predictions

January 2024

I hope you kicked off the new year with great energy and enthusiasm.

At Tokeny, our people are what make us outstanding. As a remote-first team, we prioritize coming together each year for retreats to forge physical connections, align our visions, and simply share great moments.

Last week, our journey led us to the picturesque Puigcerdà in Spain. Amidst breathtaking mountain hikes, indulgent spa experiences, and delightful culinary adventures, we also found time for some highly efficient meetings. It was truly amazing, and we had the pleasure of meeting one of our first members, Nida, who finally joined us in person after six years of dedicated work at Tokeny from Turkey.

As we reflect on the achievements of 2023, we are thrilled to share the highlights that have propelled us to the forefront of tokenization.

Tokeny’s Triumphs:

2023 marked Tokeny’s 6th year, a milestone that brought numerous celebrations, including:

* 45 new financial institutions as customers and partners.
* Apex Group Investment: We secured new investment from Apex Group, which underscores institutional support for tokenization, enabling us to scale further.
* Security Focus: We continued to demonstrate our commitment to the highest security standards, earning a perfect 10/10 audit score from Hacken and obtaining SOC2 Type 1 certification.
* Revenue Boom: We increased our annual revenue by a stellar 100%.
* Readership: 176k readers of our newsletter.
* Inbound Interest: 5,679 companies contacted us to collaborate or learn from us.
* Team Growth: Our team expanded by 25%, and two of our team members expanded their families by welcoming a newborn.

Eventful Year:

We gained global presence by attending 52 events worldwide and speaking at 38. Additionally, we participated in 16 webinars and were featured in 2 podcasts.

We selected some of the most interesting digital content for you:

* DFNS webinar
* Artory webinar
* Lux for Finance refresher

Growing Network of Issuers, Asset Managers and Servicers:

* LinkedIn Reach: We saw a 48% growth in our follower base, reaching over 8k followers on LinkedIn.
* Newsletter Expansion: We welcomed 2.4k new subscribers; our newsletter community now stands at 9.9k, including influential institutional subscribers from top asset managers like JP Morgan, State Street, BlackRock, Fidelity Investments, and Goldman Sachs.

ERC-3643 Standardization Contribution:

* Association Launch: We successfully launched the ERC3643 Association with 45 prominent, actively participating members, including top-tier financial institutions, global law firms, and blockchain leaders like SS&C, Invesco, CMS, Apex Group, and Polygon.
* EIP Recognition: Having achieved final status as an Ethereum Improvement Proposal (EIP), ERC-3643 was officially recognized as the de facto token standard for compliant tokenization.

Predictions for 2024

The upcoming year promises to be full of tokenization breakthroughs. Our predictions are:

* 5 of the top 10 asset managers will tokenize assets: Tokenized products will be introduced by half of the top 10 asset managers.
* Standardization with ERC-3643: The entire industry will start to build valuable use cases and form a dynamic ecosystem around the validated ERC-3643 token standard instead of reinventing the wheel.
* Regulated entities will start tokenizing cash: A few large financial institutions will issue stablecoins, e-money or deposit tokens.
* US regulation will open up: The US market is set to open up, compelling others to hasten their initiatives.
* Institutions will go public blockchain: Public chains open the door to interoperability with DeFi, driving demand from investors by adding utility to their assets and enabling innovation.

As we kick off 2024, we are prepared for the challenges and opportunities in the tokenization landscape. Your support has been pivotal, and together, we are set for an even more dynamic year ahead.

Thank you for being a crucial part of Tokeny’s journey!

Tokeny Spotlight

REFERENCE

We’re referenced in Asset Tokenization report by Polygon Labs.

Read More

EVENT

CCO, Daniel Coheur, joined the Cyvers Compliance Roundtable.

Read More

PRODUCT NEWSLETTER

ERC-3643 attained ‘Final’ status in its Ethereum Improvement Proposal (EIP).

Read More

FEATURE

We’re featured in the Enterprise Ethereum Alliance report.

Read More

TOKENIZED ART

Read more about unlocking accessible tokenized art funds with verifiable data.

Read More

EVENT

CCO, Daniel Coheur, and CEO, Luc Falempin, took part in Apex Invest.

Read More

Tokeny Events

NFT Paris

February 23rd-24th, 2024 | Paris

Register Now

ALFI Asset Management Conference

March 19th-20th, 2024 | Luxembourg

Register Now

Digital Assets Week Hong Kong 

March 7th, 2024 | Hong Kong

Register Now

ERC3643 Association Recap