Last Update 6:02 PM October 20, 2020 (UTC)

Identosphere - Company Blog Feeds

Brought to you by Identity Woman and Infominer.
Please do support our collaboration on Patreon!!!

Tuesday, 20. October 2020

Global ID

The GiD Report#131 — #IIW31, Filecoin’s launch, what the original iPhone teaches us about innovation

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

GlobaliD’s Dev Bharel and Alexis Falquier are taking part in the 31st Internet Identity Workshop (the birthplace of self-sovereign identity) from today until Thursday. Stay tuned for a full recap!

#IIW31 with nearly 250 participants from around the world
Filecoin launch
WIRED on online parks
What the original iPhone teaches us about networks and innovation
Satya Nadella on Microsoft’s SSI initiative
Stuff happens

1. Filecoin’s hotly anticipated mainnet launched last week on Oct. 15.

Filecoin and IPFS creator Juan Benet. Photo: Protocol Labs

If you’ll recall, Filecoin was one of the hottest ICOs of 2017, raising $205 million in its token sale. The vision was a decentralized storage alternative to centralized cloud solutions like Amazon Web Services, built around the InterPlanetary File System (IPFS) protocol.

And though the launch has seen some early hiccups around price volatility and mining economics, as you might expect with a much-hyped release, Filecoin is now live!

If you’re curious about how IPFS works, Hacker Noon has a useful beginner’s guide.

First off, the way we manage our data hasn’t really changed much since the birth of the Web:

HTTP is great for loading websites but it wasn’t designed for the transfer of large amounts of data (like audio and video files). These constraints possibly enabled the emergence and mainstream success of alternative filesharing systems like Napster (music) and BitTorrent (movies and pretty much anything).
Fast forward to 2018, where on-demand HD video streaming and big data are becoming ubiquitous; we are continuing the upward march of producing/consuming more and more data, along with developing more and more powerful computers to process them. Major advancements in cloud computing have helped sustain this transition; however, the fundamental infrastructure for distributing all this data has remained largely the same.

IPFS looks to change that:

IPFS attempts to address the deficiencies of the client-server model and HTTP web through a novel p2p file sharing system. This system is a synthesis of several new and existing innovations. IPFS is an open-source project created by Protocol Labs, an R&D lab for network protocols and former Y Combinator startup. Protocol Labs also develops complementary systems like IPLD and Filecoin, which will be explained below. Hundreds of developers around the world contributed to the development of IPFS, so its orchestration has been a massive undertaking.

The end result?

IPFS provides high-throughput, low-latency data distribution. It is also decentralized and secure. This opens up several interesting and exciting use cases. It can be used to deliver content to websites, globally store files with automatic versioning and backups, and facilitate secure filesharing and encrypted communication.
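The property doing the heavy lifting here is content addressing: a file is identified by a fingerprint of its own bytes rather than by the location of a server. A minimal Python sketch of the idea (real IPFS wraps the digest in a multihash and encodes it as a CID; this simplification is ours, not the protocol’s):

```python
import hashlib

def content_address(data: bytes) -> str:
    # Real IPFS wraps the digest in a multihash and encodes it as a CID;
    # this simplified fingerprint only illustrates the core idea.
    return "sha256-" + hashlib.sha256(data).hexdigest()

doc_v1 = b"hello, distributed web"
doc_v2 = b"hello, distributed web!"

print(content_address(doc_v1))  # same bytes give the same address, anywhere
print(content_address(doc_v2))  # any edit yields a brand-new address
```

Because the address is derived from the content itself, identical files deduplicate automatically and any tampering is immediately detectable.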

Where Filecoin comes into play:

Filecoin is a separate protocol designed to add economic incentives to file storage on IPFS, and foster a distributed storage market that rivals enterprise cloud storage (like Amazon S3, etc). Instead of centralized infrastructure with fixed pricing, IPFS + Filecoin offers storage on a global network of local providers who have the freedom to set prices based on supply and demand. Instead of a Proof-of-Work consensus algorithm like Bitcoin, Filecoin uses Proof-of-Storage to ensure security and reliability. So anyone can join the network, offer unused hard drive space on their computing device, and get rewarded in Filecoin tokens for data storage and retrieval services.
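Filecoin’s production protocol relies on Proof-of-Replication and Proof-of-Spacetime built on zero-knowledge proofs, which is far more involved than we can show here. As a rough intuition only, a storage audit boils down to a challenge-response loop like the toy below, where a verifier demands a fresh answer about a randomly chosen piece of the file (in the real network the verifier holds compact commitments, not the file itself):

```python
import hashlib
import os
import random

CHUNK = 1024  # bytes per challengeable chunk

def respond(file_bytes: bytes, index: int, nonce: bytes) -> str:
    # Prover: hash the challenged chunk together with a fresh nonce,
    # so answers can't be precomputed and the file then discarded.
    chunk = file_bytes[index * CHUNK:(index + 1) * CHUNK]
    return hashlib.sha256(nonce + chunk).hexdigest()

data = os.urandom(10 * CHUNK)           # the file the prover claims to store
index = random.randrange(10)            # verifier picks a random chunk...
nonce = os.urandom(16)                  # ...and a one-time nonce

expected = respond(data, index, nonce)  # verifier's own check value
answer = respond(data, index, nonce)    # honest prover's reply
assert answer == expected               # storage demonstrated for this round
```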

All of which is really neat and a reminder that there is a ton of really cool innovation continuing to happen in the space. What also sets Filecoin apart is how successful they’ve been at getting the word out and drumming up interest in the project. At the time of this writing, their newly launched token has a market cap north of $600 million.

And as Greg Kidd has noted, you could see a future where GlobaliD employs Filecoin for our self-sovereign storage solution.

Relevant:

Filecoin Confirms Long-Awaited Mainnet Launch for Next Month — CoinDesk
A Beginner’s Guide to IPFS | Hacker Noon

2. So Filecoin is a really cool idea with a ton of potential. And at a time when more and more of our lives happen online, we need more cool ideas. WIRED’s idea? We need more and better online parks.

It’s how we could fix the “broken internet,” WIRED suggests (via Mitja):

We need public spaces, built in the spirit of Walt Whitman, that allow us to gather, communicate, and share in something bigger than ourselves.

And private parks won’t do the trick:

Much of our communal life now unfolds in digital spaces that feel public but are not. When technologists refer to platforms like Facebook and Twitter as “walled gardens” — environments where the corporate owner has total control — they’re literally referring to those same private pleasure gardens that Whitman was reacting to. And while Facebook and Twitter may be open to all, as in those gardens, their owners determine the rules.
Venture-backed platforms make poor quasi-public spaces for three reasons.
First, as the legendary venture capitalist Paul Graham put it, “startups = growth.” The focus on growth — of users, of time spent, and then of revenue — is the defining trait that has made Facebook a $750 billion company. And the key to rapid growth is optimization to create a “frictionless” experience: The more relevant the content you see, the likelier you are to click, return to Facebook, and bring your friends.
But friction is essential to public space. Public spaces are so generative precisely because we run into people we’d normally avoid, encounter events we’d never expect, and have to negotiate with other groups that have their own needs. The social connections that run-ins create, social scientists tell us, are critical in binding communities together across lines of difference. Building a healthy community requires the careful generation of this thick web of social ties. Rapid growth can quickly overwhelm and destroy it — as anyone who has lived in a gentrifying neighborhood knows.

Also, the challenges (the first of which is funding):

Second, there’s a talent and research problem. People outside of tech generally underestimate how hard it is to build something seamless, intuitive, and irresistible that allows millions of people to interact. We need to rally a diverse, representative generation of builders to this cause. And given that digital products live and die by metrics, we need to identify signals that correspond to flourishing public digital spaces.
Finally, there’s a problem of public imagination. Fixing our ability to connect and build healthy communities at scale is arguably an Apollo mission for this generation — a decisive challenge that will determine whether our society progresses or falls back into conspiracy-driven tribalism. We need to summon the creative will worthy of a problem of this urgency and consequence.

Sounds like a potential use case for SSI and GlobaliD Groups.

To Mend a Broken Internet, Create Online Parks

3. Apple unveiled the latest iPhone last week and it’s a reminder that for innovation to flourish, we need to push back against the networks and platforms that rule over us.

In the case of the iPhone, it was the cell phone companies — from the NYTimes tech newsletter:

In the pre-iPhone age, we had years of clunky mobile devices, and phone providers like AT&T deserved a lot of the blame.
Phone companies dictated almost everything about flip phones and early smartphones, including their features, look and speed. People had to put up with crummy software from the phone company to surf the web or download songs and ringtones. (Remember ringtones?!) It stank.
One of the secrets to the iPhone’s success is Apple simply said no to all of that. Apple’s chief executive at the time, Steve Jobs, gave wireless phone companies an ultimatum: Stay out of every decision about the iPhone or lose a shot at selling a potential blockbuster.
Apple got its way, the iPhone was eventually a success and phone companies got rich from it alongside Apple.

How times have changed as we now find Apple on the other side of the equation.

Relevant:

Via /antoine — Apple at a Crossroads: An Interview With M.G. Siegler

4. The other week, we talked about how Microsoft was adhering to newly laid out principles regarding their app store — clearly a shot at what’s going on with Epic v. Apple and Google. Microsoft has also been one of the biggest companies at the forefront of the self-sovereign identity (SSI) movement.

Here’s Satya Nadella talking SSI:

We’re going further, working to create an open, decentralized identity system that is independent of any central authority or tech company even. Military veterans are already piloting these capabilities to jumpstart their careers.
Veterans now store their verified service records and transcript in a digital wallet on their phone, which they can share directly with a university or an employer. Universities can validate it in seconds and never have to store the sensitive data. Veterans can add verified credentials to their LinkedIn profile to help them stand out and open up new career opportunities.

Relevant:

Nationalism & Its Threat to Digital Identity: Part 4 — One World Identity
Deloitte: Digital identity: proving it’s you
Self-sovereign identity: The true password killer
EFF: Digital Identification Must Be Designed for Privacy and Equity
Self-Sovereign Identity in 2030 Explained — CoinDesk

5. Stuff happens:

CFTC Chairman Heath Tarbert Talks Ethereum, DeFi and the Next BitMEX — CoinDesk
Neobanks, Digital Assets, and Finance in 2021
Zapps — Apps on Zoom | Product Hunt
WeChat Judge Unlikely to Let Ban Go Forward Amid U.S. Appeal
Via /jvs — McKinsey 2020 Global Payments Report
Via /m — Coronavirus has turned the humble QR code into an everyday essential
When Your Last $166 Vanishes: ‘Fast Fraud’ Surges on Payment Apps
Clear Conquered U.S. Airports. Now It Wants to Own Your Entire Digital Identity.
International Statement: End-To-End Encryption and Public Safety
The Man Who Speaks Softly — and Commands a Big Cyber Army
David Nage Fortnite tweet: “Fortnite has over 350M players worldwide; +70% of them spend $85 on V-Bucks which allow them to get Outfits, Gliders, Pickaxes and Emotes. That’s about $20B on digital assets they don’t own. This is rife with opportunities for disruption.”
Clarence Thomas wants to reel in Section 230
IMF, World Bank Plan Central Bank Digital Currency Rules — CoinDesk
BONUS TWEET: “The #NobelPrize committee couldn’t reach Paul Milgrom to share the news that he won, so his fellow winner and neighbor Robert Wilson knocked on his door in the middle of the night.”

The GiD Report#131 — #IIW31, Filecoin’s launch, what the original iPhone teaches us about… was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Smarter with Gartner - IT

CIO Agenda 2021: Prepare for Increased Digital Innovation


Despite the rocky economic climate in 2020, IT leaders expect a 2.0% increase, on average, in the IT budget for 2021, according to the Gartner CIO Agenda. And they’re not deluding themselves. Boards of directors plan to increase spending on IT and technology by an average of 6.9% across the enterprise. 

At top organizations, CIOs face fewer constraints, with 28% saying that they have secured additional funding

The investment makes sense, as organizational leaders intend to spend more resources more rapidly on digital acceleration to better adapt to the changing economic conditions caused by the pandemic. Despite the uncertainty created by COVID-19, top performers seized an opportunity to lean into the shift resulting from the pandemic and increase funding for digital innovation. They invested more quickly, and those investments mattered. Organizations that increased funding of digital innovation are 2.7 times more likely to be a top performer than a trailing one.

However, looking into the future, the gap starts to close for 2021 planning, with 69% of top performers planning to increase digital innovation funding compared to close to 68% of typical performers. 

Even more telling is what differentiates the process of gaining additional IT funds in top versus trailing organizations. At top organizations, CIOs face fewer constraints, with 28% saying that they have secured additional funding specifically to support more experimentation/risk-taking in IT. That number is closer to 19% for trailing organizations. 

What CIOs can do

Ensure that IT’s initiatives align with digital business acceleration and draft business cases around digital channels. Look for back-office projects for finance, marketing or HR that can be put on the back burner to reallocate funding to digital innovation. 


The post CIO Agenda 2021: Prepare for Increased Digital Innovation appeared first on Smarter With Gartner.


Nyheder fra WAYF

LAWschool: new service in WAYF


LAWschool has today completed its connection to WAYF and can therefore now give students at subscribing institutions simple access to a learning platform for law. The platform comprises more than 80 learning videos with supplementary material and also includes social features for its users.


Evernym

Evernym Joins with Other Solution Providers to Achieve Interoperability Milestone


Evernym has a vision of ubiquitous verifiable credentials that facilitate every trusted interaction in daily life. We have always recognized that adding a layer of trust to the Internet requires an ecosystem with many stakeholders, including multiple solution vendors. Such a layer demands open, interoperable standards that enable credential issuers, holders, and verifiers to choose […]

The post Evernym Joins with Other Solution Providers to Achieve Interoperability Milestone appeared first on Evernym.


Smarter with Gartner - IT

7 Digital Disruptions You Might Not See Coming In the Next 5 Years


Usually, if you have a health concern, you make an appointment — possibly via telehealth — with your doctor, who will conduct an examination or run tests to try and pinpoint the problem. But as technology evolves, it has become possible for a computer to listen to your voice and tell you if you have signs of early-onset dementia. In fact, machines soon will know more about your physical well-being than you do. 

Nobody can see the future exactly as it will happen, so we have to be prepared

This raises a host of questions: Should a computer tell you about a potential medical issue? Can you sue the people who own the software if they don’t tell you and it results in something harmful? Is telling you that you have a serious medical condition the right thing to do? 

This is called technological biohacking, and it’s one of seven digital disruptions that you might not see coming, and like the others on the list, it could change the world. 


“Nobody can see the future exactly as it will happen, so we have to be prepared. We have to proof against what might happen,” said Daryl Plummer, Distinguished VP Analyst, during his presentation at virtual Gartner Symposium IT/Xpo®. “We’re in for a decade of radical technology disruption. Now’s the time to get started.” 

It’s important to know the different types of change:

Features: Continuous improvement of constant changes and add-ons. 

Fads: Short-term, very exciting, high-impact things that fade quickly. 

Disruption: Something that comes a little after a fad, but stays and changes business models. This is a fundamental change.

Organizations should focus on identifying the disruptions that will have long-term effects on the world. This, historically, has included things like streaming services to televisions or ride shares. 

No. 1: Nontraditional compute technologies 

Nontraditional compute technologies are ones you can’t afford to ignore. As Moore’s Law approaches its potential breaking point, new technology will need to take over. And Moore’s Law is only one technological axiom at risk. Already, deep neural networks, DNA storage, quantum computing and other technologies are changing the way organizations think about compute styles and models, and how those styles will expand the limits of computational power.

Read more: Gartner Top Strategic Technology Trends for 2021

Computing technologies are evolving from traditional to digital, and then to neuromorphic, which allows them to be more human-like. Chips are getting smaller and more dense, new materials and markets are evolving, and the cost of computing is changing. These technologies are also creating new selling opportunities and products.  

No. 2: DNA data storage

DNA data storage tackles the challenge of massive data storage and longevity. This technology enables the storage of unprecedented amounts of data for thousands of years in a small space and in a less corruptible form. It does this by encoding binary data in the base pairs of synthetic DNA. Storage of petabytes, exabytes or even yottabytes of data in mere grams of synthetic DNA is possible.

It’s estimated that 5 exabytes could store every spoken word ever uttered by humans. One single gram of DNA could store all the knowledge generated by humans in one year. That data can then be stored wherever it’s needed. For example, it could be stored in your car engine, for use with repairs. DNA data will radically change how humans handle, store and retrieve data. 
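The textbook illustration of the encoding maps two bits onto each of the four nucleotides. Real schemes add error-correcting codes and avoid sequences that are hard to synthesize or read back, but the core mapping is as small as this sketch:

```python
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map every two bits of input onto one nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping: four bases reconstruct one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")          # 2 bytes -> 8 bases: "CAGACGGC"
assert decode(strand) == b"Hi"  # round-trips losslessly
```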

No. 3: Distributed cloud

Distributed cloud is where cloud services are distributed to different physical locations, but the operation, governance and evolution remain the responsibility of the public cloud provider. This means the services are where the customer needs them, but are still the responsibility of the public cloud. In addition, reduced cost of data egress is supported for high-data-traffic applications like machine learning. Data sovereignty can be assured because the cloud services are kept in a specific geographic location but can still be in the public cloud.

No. 4: Digital twin of the earth 

Traditionally, digital twins exist in a very specific, discrete context. But, “if we can see it, we can duplicate it,” said Plummer. 

A digital twin of the earth would afford a holistic view of how our climate is changing around the globe, how pollution is traveling from place to place, and even how ships are tracked from port to port. This changes how humans think about mapping, tracking, physical operations and emergency services.

For example, a digital twin of the earth would be able to alert authorities to the smallest brush fire before it burns out of control. The earth becomes a monitoring ground for low-earth-orbit satellites to help create simulations of everything from beach soil erosion to animal migrations to human behavioral tracking. 

No. 5: Augmented humans

Right now, computers are able to translate thoughts into text and onto a screen. Next up? Pictures pulled from the mind; the same technology would enable people to put their memories or images back into their brain. Technology will also allow humans to augment themselves — from exoskeletons for superhuman speed to implants for superhuman hearing. This will change what it means to be human, but will also raise some serious ethical questions. However, augmentation also has the potential to help people live better lives.

No. 6: Technological biohacking

Everything is recorded. Computers are constantly listening, analyzing and storing data, but there are questions about the use of that information. This is especially important as computers begin to collect data that you’re unconsciously sharing, from physical health to verbal triggers, as seen in the intro example.

If a camera in a public park can identify something about your health that you didn’t even know, the concern is no longer maintaining privacy as much as it is understanding how your private information is used. However, when used in aggregate, biohacking can result in cheaper drug production and better medical options.

No. 7: Emotional experiences 

Can you be fired if you’re not happy enough? Computers with cameras can now tell if you’re upset, sad, angry or distressed, which means theoretically organizations could evaluate your feelings or expressions and use them to make decisions. Inexpensive sensors are now able to track physical biometrics, leading to mood-oriented computing and hyperpersonalized experiences. In fact, Gartner predicts by 2024, AI identification of emotions will influence more than half of the online ads you see. 

The post 7 Digital Disruptions You Might Not See Coming In the Next 5 Years appeared first on Smarter With Gartner.


Ontology

Ontology Boosts Security And Compliance Through Integration With Chainalysis


Integration with Chainalysis will create a safer experience by increasing trust and security across the Ontology ecosystem

Ontology, the high performance, open-source blockchain specializing in digital identity and data, has announced today an integration with Chainalysis, the blockchain analysis company.

Ontology will utilize Chainalysis’s compliance and investigative solutions, KYT (Know Your Transaction) and Reactor, to enhance trust, provide a safer experience to traders, and combat illicit activity through transaction monitoring and tracing.

Andy Ji, Co-Founder of Ontology said, “Chainalysis’s commitment to building trust in blockchains is strongly aligned with our own values at Ontology where we are dedicated to providing our community and our ecosystem with the highest level of security and trust. Joining forces with Chainalysis allows us to further enhance our internal compliance measures through their high-level services such as Chainalysis KYT and Chainalysis Reactor. This integration also permits us to increase the level of transaction monitoring within the Ontology ecosystem and further mitigate against financial risks such as fraud and money laundering.”

Chainalysis provides blockchain data and analysis to government agencies, exchanges, and financial institutions across 50 countries. Its software is used to detect and investigate illicit activity ranging from fraud and hacks to money laundering and ransomware attacks.

Jason Bonds, Chief Revenue Officer of Chainalysis, said, “By partnering with Chainalysis and putting anti-money laundering transaction monitoring in place, Ontology is demonstrating its commitment to compliance best practices. This not only aligns with global regulatory guidance but also with their mission to build trust and transparency through their unique blockchain.”

The news comes following Chainalysis’s recent announcement of plans to increase its presence in the APAC region and open new offices in Singapore and Tokyo.

Find Ontology elsewhere

Ontology website / Ontology GitHub / ONTO website / OWallet (GitHub)

Telegram (English) / Discord

Twitter / Reddit / Facebook / LinkedIn

Ontology Boosts Security And Compliance Through Integration With Chainalysis was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Trinsic (was streetcred)

Trinsic Cements its Commitment to Interoperability Ahead of Internet Identity Workshop XXXI


Self-sovereign identity (SSI) has largely developed and matured thanks to the rich conversations at the semi-annual Internet Identity Workshop (IIW)—one of the most important user-centric digital identity events on earth. Not only does IIW act as a place to iron out specs and bring communities together, but it acts as a forcing function for implementors to demonstrate the solutions they’ve built over the last six months. IIW’s unique focus on community drives vendors to emphasize interoperability in their demonstrations.

 

With IIW #31 starting this morning, it’s the perfect time to reflect on how IIW continues to be a part of the interoperability story of the SSI community.

Trinsic's interoperability story

Interoperability has always been of paramount importance to Trinsic. That story begins with an IIW #28 demo with the Government of British Columbia, where we demonstrated the first mobile app and enterprise SSI platform to comply with the Hyperledger Aries RFCs, achieving interoperability with the open source aries-cloudagent-python. This was the first time two separate codebases demonstrated true interoperability.

 

Since then, the Trinsic Wallet has been a standard tool for Aries developers to test the agents they’re building. In the Aries community, Trinsic became a leader of interoperability efforts, helping others debug when their implementations didn’t work with our wallet. In our view, spending time helping others in the community achieve interoperability was time well-spent.

 

But this ad-hoc interoperability testing done via direct messages in RocketChat wasn’t scalable. As the number of Aries-based implementations grew, so too did the burden of maintaining and promoting interoperability. We are happy to say that only three IIWs after the community’s first display of interoperability, a growing group of SSI vendors has convened to collaborate and hold each other to a high standard of interoperability through the formalization of vendor-to-vendor solution testing.

Formalizing interoperability between vendors

We are proud to work with the tremendous teams in our community to formalize vendor-to-vendor solution testing according to test suites and interoperability profiles in the Technical Stack Working Group at the Trust over IP (ToIP) Foundation. This group has been meeting weekly for several months, and we look forward to this group continuing to improve the interoperability requirements.

 

Meeting weekly for the last several months, we’ve formally tested the Trinsic platform with the products developed by the following projects. That means any developer that uses the Trinsic platform—including Trinsic’s APIs, Trinsic Studio, and the Trinsic Wallet—will inherit interoperability with solutions built using the following products:

 

Lissi
esatus AG
IBM
Evernym
idRamp
Aries Cloud Agent Python
Aries Framework .NET

 

If you plan on attending IIW #31 this week, be sure to attend the session on ToIP interoperability profiles to learn more. As always, we encourage other vendors to join the cause with us so that three IIWs from now, we will see dozens of interoperable SSI implementations that comply with the ToIP interoperability profiles.

Future interoperability efforts

Our current and future interoperability efforts expand beyond the amazing work happening at the ToIP Foundation. We’re actively engaged in developing next-generation technologies including new zero-knowledge proof capabilities at the Decentralized Identity Foundation (DIF), support for additional DID methods, and W3C compliant JSON-LD credentials. We will be leading a session this week at IIW #31 on some of our next-generation interoperability efforts, so stay tuned!
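For context, a verifiable credential under the W3C data model is at heart a JSON-LD document signed by its issuer. A minimal illustrative shape follows; the DIDs and proof fields are placeholders for this example, not output from any Trinsic product:

```python
# Illustrative W3C verifiable credential; identifiers are placeholders.
credential = {
    "@context": [
        "https://www.w3.org/2018/credentials/v1",
        "https://www.w3.org/2018/credentials/examples/v1",
    ],
    "type": ["VerifiableCredential", "UniversityDegreeCredential"],
    "issuer": "did:example:university",
    "issuanceDate": "2020-10-20T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:alice",
        "degree": {"type": "BachelorDegree", "name": "Bachelor of Laws"},
    },
    # A real credential carries a cryptographic proof from the issuer,
    # which verifiers check against the issuer's public DID document.
    "proof": {"type": "Ed25519Signature2018"},
}
```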

Our interoperable product stack

Each of Trinsic’s products is powered by the same interoperable core. With the Trinsic Studio, Trinsic Wallet, or our developer platform of 3 APIs, robust documentation, and SDKs in popular languages, your team can rest assured your implementation is powered by a company that prioritizes interoperability.

 

Trinsic Studio: An easy-to-use web interface for managing credential exchange with no code. Also serves as the mechanism to acquire API keys and manage billing for paid plans. Try it for yourself completely free, and issue a credential in less than 5 minutes!

Provider API: Our newest API enables developers to programmatically provision issuer and verifier cloud agents. Learn more about the provider API in the recent launch announcement.

Credentials API: Our core API enables developers to have a turnkey way to issue, verify, and manage verifiable credentials on any Hyperledger Indy network. Check out our documentation or one of our reference applications to get started.

Wallet API: An API for creating and managing cloud wallets on behalf of credential holders. It’s the backend of our Mobile SDK, which you can read more about in our recent post about building your own SSI wallets. Get started with the API by checking out the documentation.

Focusing on interoperability has been at Trinsic’s core since we began, because we want to ensure that verifiable credentials are as accessible and useful as possible to people everywhere. Making sure that SSI vendors’ solutions are interoperable with one another is one of the major steps to achieving that vision.

The post Trinsic Cements its Commitment to Interoperability Ahead of Internet Identity Workshop XXXI appeared first on Trinsic.


Self Key

Certifiers Platform is Live!


SelfKey Weekly Newsletter

Date – 16th October, 2020

We have big news for you: the Certifiers platform is live on the Desktop Wallet. Furthermore, Mainland China users can now access the SelfKey Marketplace.

The post Certifiers Platform is Live! appeared first on SelfKey.


PingTalk

Ping Identity Named a Strong Performer in The Forrester Wave™: Customer Identity and Access Management, Q4 2020


As security and business leaders are under pressure to accelerate digital transformation, it’s no surprise that the customer identity and access management (CIAM) category is reported to be growing annually at 20-25%.

Forrester Research conducted an evaluation of the 13 most significant vendors in CIAM, in The Forrester Wave™: Customer Identity and Access Management, Q4 2020. We’re pleased to announce that Ping was named a “Strong Performer” in this evaluation, noted for our Ping Intelligent Identity™ platform and dedicated cloud-based, API-oriented CIAM platform, PingOne for Customers.
 

Monday, 19. October 2020

Smarter with Gartner - IT

Gartner Top 10 Trends in Data and Analytics for 2020


In response to the COVID-19 emergency, over 500 clinical trials of potential COVID-19 treatments and interventions began worldwide. The trials use a living database that compiles and curates data from trial registries and other sources. This helps medical and public health experts predict disease spread, find new treatments and plan for clinical management of the pandemic.

Data and analytics combined with artificial intelligence (AI) technologies will be paramount in the effort to predict, prepare and respond in a proactive and accelerated manner to a global crisis and its aftermath.

“In the face of unprecedented market shifts, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to accelerate innovation and forge new paths  to a post-COVID-19 world,” said Rita Sallam, Distinguished VP Analyst, during her presentation at virtual Gartner Symposium IT/Xpo® 2020.


Here are the top 10 technology trends that data and analytics leaders should focus on as they look to make essential investments to prepare for a reset.

Trend 1: Smarter, faster, more responsible AI

By the end of 2024, 75% of enterprises will shift from piloting to operationalizing AI, driving a 5X increase in streaming data and analytics infrastructures.

Within the current pandemic context, AI techniques such as machine learning (ML), optimization and natural language processing (NLP) are providing vital insights and predictions about the spread of the virus and the effectiveness and impact of countermeasures. AI and machine learning are critical in realigning supply and the supply chain to new demand patterns.

Pre-COVID models based on historical data may no longer be valid

AI techniques such as reinforcement learning and distributed learning are creating more adaptable and flexible systems to handle complex business situations; for example, agent-based systems can model and simulate complex systems, particularly now when pre-COVID models based on historical data may no longer be valid.

Significant investments made in new chip architectures such as neuromorphic hardware that can be deployed on edge devices are accelerating AI and ML computations and workloads and reducing reliance on centralized systems that require high bandwidths. Eventually, this could lead to more scalable AI solutions that have higher business impact.

Responsible AI that enables model transparency is essential to protect against poor decisions. It results in better human-machine collaboration and trust for greater adoption and alignment of decisions throughout the organization.

Trend 2: Decline of the dashboard

Dynamic data stories with more automated and consumerized experiences will replace visual, point-and-click authoring and exploration. As a result, the amount of time users spend using predefined dashboards will decline. The shift to in-context data stories means that the most relevant insights will stream to each user based on their context, role or use. These dynamic insights leverage technologies such as augmented analytics, NLP, streaming anomaly detection and collaboration.

Data and analytics leaders need to regularly evaluate their existing analytics and business intelligence (BI) tools and innovative startups offering new augmented and NLP-driven user experiences beyond the predefined dashboard.

Trend 3: Decision intelligence

By 2023, more than 33% of large organizations will have analysts practicing decision intelligence, including decision modeling.

Decision intelligence brings together a number of disciplines, including decision management and decision support. It encompasses applications in the field of complex adaptive systems that bring together multiple traditional and advanced disciplines.

It provides a framework to help data and analytics leaders design, compose, model, align, execute, monitor and tune decision models and processes in the context of business outcomes and behavior.

Explore using decision management and modeling technology when decisions need multiple logical and mathematical techniques, must be automated or semi-automated, or must be documented and audited.

Trend 4: X analytics

Gartner coined the term “X analytics” to be an umbrella term, where X is the data variable for a range of different structured and unstructured content such as text analytics, video analytics, audio analytics, etc.

Data and analytics leaders use X analytics to solve society’s toughest challenges, including climate change, disease prevention and wildlife protection.

During the pandemic, AI has been critical in combing through thousands of research papers, news sources, social media posts and clinical trials data to help medical and public health experts predict disease spread, capacity-plan, find new treatments and identify vulnerable populations. X analytics combined with AI and other techniques such as graph analytics (another top trend) will play a key role in identifying, predicting and planning for natural disasters and other business crises and opportunities in the future.

Data and analytics leaders should explore X analytics capabilities available from their existing vendors, such as cloud vendors for image, video and voice analytics, but recognize that innovation will likely come from small disruptive startups and cloud providers.

Trend 5: Augmented data management

Augmented data management uses ML and AI techniques to optimize and improve operations.  It also converts metadata from being used in auditing, lineage and reporting to powering dynamic systems.

Augmented data management products can examine large samples of operational data, including actual queries, performance data and schemas. Using the existing usage and workload data, an augmented engine can tune operations and optimize configuration, security and performance.

Data and analytics leaders should look for augmented data management enabling active metadata to simplify and consolidate their architectures, and also increase automation in their redundant data management tasks.

Trend 6: Cloud is a given

By 2022, public cloud services will be essential for 90% of data and analytics innovation.

As data and analytics moves to the cloud, data and analytics leaders still struggle to align the right services to the right use cases, which leads to unnecessarily increased governance and integration overhead.

The question for data and analytics is moving from how much a given service costs to how it can meet the workload’s performance requirements beyond the list price.

Data and analytics leaders need to prioritize workloads that can exploit cloud capabilities and focus on cost optimization and other benefits such as change and innovation acceleration when moving to cloud.

Trend 7: Data and analytics worlds collide

Data and analytics capabilities have traditionally been considered distinct capabilities  and managed accordingly. Vendors offering end-to-end workflows enabled by augmented analytics blur the distinction between once separate markets.

The collision of data and analytics will increase interaction and collaboration between historically separate data and analytics roles. This impacts not only the technologies and capabilities provided, but also the people and processes that support and use them. The spectrum of roles will extend from traditional data and analytics roles in IT to information explorer, consumer and citizen developer as an example.

To turn the collision into a constructive convergence, incorporate both data and analytics tools and capabilities into the analytics stack. Beyond tools, focus on people and processes to foster communication and collaboration. Leverage data and analytics ecosystems enabled by an augmented approach that have the potential to deliver coherent stacks.

Trend 8: Data marketplaces and exchanges

By 2022, 35% of large organizations will be either sellers or buyers of data via formal online data marketplaces, up from 25% in 2020.

Data marketplaces and exchanges provide single platforms to consolidate third-party data offerings. These marketplaces and exchanges provide centralized availability and access (to X analytics and other unique data sets, for example) that create economies of scale to reduce costs for third-party data.

To monetize data assets through data marketplaces, data and analytics leaders should establish a fair and transparent methodology by defining a data governance principle that ecosystems partners can rely on.

Trend 9: Blockchain in data and analytics

Blockchain technologies address two challenges in data and analytics. First, blockchain provides the full lineage of assets and transactions. Second, blockchain provides transparency for complex networks of participants.

Outside of limited bitcoin and smart contract use cases, ledger database management systems (DBMSs) will provide a more attractive option for single-enterprise auditing of data sources. By 2021, Gartner estimates that most permissioned blockchain uses will be replaced by ledger DBMS products.

Data and analytics leaders should position blockchain technologies as supplementary to their existing data management infrastructure by highlighting the capabilities mismatch between data management infrastructure and blockchain technologies.

Trend 10: Relationships form the foundation of data and analytics value

By 2023, graph technologies will facilitate rapid contextualization for decision making in 30% of organizations worldwide. Graph analytics is a set of analytic techniques that allows for the exploration of relationships between entities of interest such as organizations, people and transactions.

It helps data and analytics leaders find unknown relationships in data and review data not easily analyzed with traditional analytics.

For example, as the world scrambles to respond to current and future pandemics, graph technologies can relate entities across everything from geospatial data on people’s phones to facial-recognition systems that can analyze photos to determine who might have come into contact with individuals who later tested positive for the coronavirus.

Consider investigating how graph algorithms and technologies can improve your AI and ML initiatives

When combined with ML algorithms, these technologies can be used to comb through thousands of data sources and documents that could help medical and public health experts rapidly discover new possible treatments or factors that contribute to more negative outcomes for some patients.

Data and analytics leaders need to evaluate opportunities to incorporate graph analytics into their analytics portfolios and applications to uncover hidden patterns and relationships. In addition, consider investigating how graph algorithms and technologies can improve your AI and ML initiatives.
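As a concrete taste of the technique, the open-source networkx library makes the contact-tracing style of question described above a few lines of work. The contact data here is invented for illustration:

```python
import networkx as nx  # pip install networkx

# Hypothetical contact graph: an edge means two people crossed paths.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("bob", "carol"), ("carol", "dave"),
    ("alice", "carol"), ("dave", "erin"),
])

positive = "carol"
print(sorted(G.neighbors(positive)))         # direct contacts of the case
print(nx.shortest_path(G, "alice", "erin"))  # chain linking two people
print(nx.degree_centrality(G))               # who touches the most people
```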

This article has been updated from the June 9, 2020 original to reflect new events, conditions and research.

The post Gartner Top 10 Trends in Data and Analytics for 2020 appeared first on Smarter With Gartner.


Gartner Top Strategic Technology Trends for 2021


When employees at an industrial site returned to the workplace after it was closed during the COVID-19 pandemic, they noticed a few differences. Sensors or RFID tags were used to determine whether employees were washing their hands regularly. Computer vision determined if employees were complying with mask protocol, and speakers were used to warn people of protocol violations. What’s more, this behavioral data was collected and analyzed by the organizations to influence how people behaved at work.

The collection and use of such data to drive behaviors is called the Internet of Behavior (IoB). As organizations improve not only the amount of data they capture, but also how they combine data from different sources and use that data, the IoB will continue to affect how organizations interact with people.


The IoB is one of Gartner’s nine strategic technology trends that will enable the plasticity or flexibility that resilient businesses require in the significant upheaval driven by COVID-19 and the current economic state of the world.

The IoB is about using data to change behaviors

“The unprecedented socioeconomic challenges of 2020 demand the organizational plasticity to transform and compose the future,” said Brian Burke, Research Vice President, during virtual 2020 Gartner Symposium IT/Xpo®.

This year’s trends fall under three themes: People centricity, location independence and resilient delivery.

People centricity: Although the pandemic changed how many people work and interact with organizations, people are still at the center of all business. And they need digitalized processes to function in today’s environment.

Location independence: COVID-19 has shifted where employees, customers, suppliers and organizational ecosystems physically exist. Location independence requires a technology shift to support this new version of business.

Resilient delivery: Whether a pandemic or a recession, volatility exists in the world. Organizations that are prepared to pivot and adapt will weather all types of disruptions.

As always, these nine strategic technology trends do not operate independently of each other, but rather build on and reinforce each other. Combinatorial innovation is an overarching theme for these trends. Together they enable organizational plasticity that will help guide organizations in the next five to 10 years.

Trend 1: Internet of Behaviors

As demonstrated by the COVID-19 protocol monitoring example, the IoB is about using data to change behaviors. With an increase in technologies that gather the “digital dust” of daily life — data that spans the digital and physical worlds — that information can be used to influence behaviors through feedback loops.

For example, for commercial vehicles, telematics can monitor driving behaviors, from sudden braking to aggressive turns. Companies can then use that data to improve driver performance, routing and safety.
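As a toy illustration of that telematics pipeline, harsh-braking detection can be as simple as thresholding deceleration samples; the readings and the cutoff below are invented for the example:

```python
# Invented accelerometer trace (m/s^2 along the direction of travel);
# negative values are deceleration. The -3.9 cutoff is illustrative only.
SAMPLES = [0.1, -0.2, -0.5, -4.2, -4.8, -0.3]
HARSH_BRAKING = -3.9

events = [i for i, accel in enumerate(SAMPLES) if accel <= HARSH_BRAKING]
print(f"harsh braking detected at samples {events}")  # -> [3, 4]
```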

IoB does have ethical and societal implications depending on the goals and outcomes of individual uses

The IoB can gather, combine and process data from many sources, including commercial customer data; citizen data processed by public-sector and government agencies; social media; public domain deployments of facial recognition; and location tracking. The increasing sophistication of the technology that processes this data has enabled this trend to grow.

IoB does have ethical and societal implications depending on the goals and outcomes of individual uses. The same wearables that health insurance companies use to track physical activities to reduce premiums could also be used to monitor grocery purchases; too many unhealthy items could increase premiums. Privacy laws, which vary from region to region, will greatly impact the adoption and scale of the IoB.

Trend 2: Total experience

Total experience combines multiexperience, customer experience, employee experience and user experience to transform the business outcome. The goal is to improve the overall experience where all of these pieces intersect, from technology to employees to customers and users.

This trend enables organizations to capitalize on COVID-19 disruptors

Tightly linking all of these experiences — as opposed to individually improving each one in a silo — differentiates a business from competitors in a way that is difficult to replicate, creating sustainable competitive advantage. This trend enables organizations to capitalize on COVID-19 disruptors including remote work, mobile, virtual and distributed customers.

For example, one telecommunications company transformed its entire customer experience in an effort to improve safety and satisfaction. First, it deployed an appointment system via an existing app. When customers arrived for their appointment and came within 75 feet of the store, they received two things: 1) A notification to guide them through the check-in process and 2) an alert letting them know how long it would be before they could safely enter the store and maintain social distance.

The company also adjusted its service to include more digital kiosks and enabled employees to use their own tablets to co-browse customers’ devices without having to physically touch the hardware. The result was a safer, more seamless and integrated overall experience for customers and employees.

Trend 3: Privacy-enhancing computation

Privacy-enhancing computation features three technologies that protect data while it’s being used. The first provides a trusted environment in which sensitive data can be processed or analyzed. The second performs processing and analytics in a decentralized manner. The third encrypts data and algorithms before processing or analytics.

This trend enables organizations to collaborate on research securely across regions and with competitors without sacrificing confidentiality. This approach is designed specifically for the increasing need to share data while maintaining privacy or security.
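One classic building block behind the decentralized flavor of these techniques is additive secret sharing: each party’s input is split into random-looking shares, the shares are processed separately, and only the aggregate is ever reconstructed. A toy sketch, not any particular vendor’s implementation:

```python
import random

P = 2**61 - 1  # public prime modulus; all arithmetic is mod P

def share(secret: int, n: int = 3) -> list[int]:
    """Split a value into n random-looking shares that sum to it mod P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

# Two parties' private inputs, each split across three compute nodes.
a_shares = share(80_000)
b_shares = share(95_000)

# Each node adds only the shares it holds and learns nothing on its own.
summed = [(a + b) % P for a, b in zip(a_shares, b_shares)]

# Recombining reveals only the aggregate, never the individual inputs.
print(sum(summed) % P)  # 175000
```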

Trend 4: Distributed cloud

Distributed cloud is where cloud services are distributed to different physical locations, but the operation, governance and evolution remain the responsibility of the public cloud provider.

Distributed cloud is the future of cloud

Enabling organizations to have these services physically closer helps with low-latency scenarios, reduces data costs and helps accommodate laws that dictate data must remain in a specific geographical area. However, it also means that organizations still benefit from public cloud and aren’t managing their own private cloud, which can be costly and complex. Distributed cloud is the future of cloud.

Read more: The CIO’s Guide to Distributed Cloud

Trend 5: Anywhere operations

An anywhere operations model will be vital for businesses to emerge successfully from COVID-19. At its core, this operating model allows for business to be accessed, delivered and enabled anywhere — where customers, employers and business partners operate in physically remote environments.

The model for anywhere operations is “digital first, remote first”; for example, banks that are mobile-only, but handle everything from transferring funds to opening accounts, with no physical interaction. Digital should be the default at all times. That’s not to say physical space doesn’t have its place, but it should be digitally enhanced; for example, contactless check-out at a physical store. Regardless of whether the experience is physical or digital, capabilities should be seamlessly delivered.

Trend 6: Cybersecurity mesh

Cybersecurity mesh is a distributed architectural approach to scalable, flexible and reliable cybersecurity control. Many assets now exist outside of the traditional security perimeter. Cybersecurity mesh essentially allows for the security perimeter to be defined around the identity of a person or thing. It enables a more modular, responsive security approach by centralizing policy orchestration and distributing policy enforcement. As perimeter protection becomes less meaningful, the security approach of a “walled city” must evolve to current needs.

Read more: Gartner Top 9 Security and Risk Trends for 2020

Trend 7: Intelligent composable business

An intelligent composable business is one that can adapt and fundamentally rearrange itself based on a current situation. As organizations accelerate digital business strategy to drive faster digital transformation, they need to be agile and make quick business decisions informed by currently available data.

To successfully do this, organizations must enable better access to information, augment that information with better insight and have the ability to respond quickly to the implications of that insight. This will also include increasing autonomy and democratization across the organization, enabling parts of the businesses to quickly react instead of being bogged down by inefficient processes.

Trend 8: AI engineering

A robust AI engineering strategy will facilitate the performance, scalability, interpretability and reliability of AI models while delivering the full value of AI investments. AI projects often face issues with maintainability, scalability and governance, which makes them a challenge for most organizations.

Read more: 2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020

AI engineering offers a pathway, making AI a part of the mainstream DevOps process rather than a set of specialized and isolated projects. It brings together various disciplines to tame the AI hype while providing a clearer path to value when operationalizing the combination of multiple AI techniques. Due to the governance aspect of AI engineering, responsible AI is emerging to deal with trust, transparency, ethics, fairness, interpretability and compliance issues. It is the operationalization of AI accountability.

Trend 9: Hyperautomation

Hyperautomation is the idea that anything that can be automated in an organization should be automated. It is driven by legacy business processes that are not streamlined, which create immensely expensive and extensive issues for organizations.

Many organizations are supported by a “patchwork” of technologies that are not lean, optimized, connected, clean or explicit. At the same time, the acceleration of digital business requires efficiency, speed and democratization. Organizations that don’t focus on efficiency, efficacy and business agility will be left behind.


The post Gartner Top Strategic Technology Trends for 2021 appeared first on Smarter With Gartner.


Gartner Keynote: The Future of Business Is Composable


When COVID-19 hit and Australia went into lockdown, many people lost their jobs. The result was a surge in welfare applications to Services Australia, which provides health, child support and welfare services to 25 million citizens.

There were so many applications that the site crashed. 

But Services Australia rapidly pivoted to accommodate this surge in demand and made changes to how it traditionally operated. They shifted in-person appointments to phones or online, deployed voiceprint technology to 1.2 million users and saw a 600% increase in the use of digital assistants to orchestrate fast responses. 

Composable business means creating an organization made from interchangeable building blocks

The organization created resilience via the principles of composable business — a key feature of a successful business in 2020 and beyond — that enabled the agency to provide a safety net for citizens in need. 

“Composable business is a natural acceleration of the digital business that you live every day. It allows us to deliver the resilience and agility that these interesting times demand,” said Daryl Plummer, Distinguished VP Analyst, during the opening keynote at the virtual Gartner IT Symposium/Xpo®. “We’re talking about the intentional use of ‘composability’ in a business context — architecting your business for real-time adaptability and resilience in the face of uncertainty.”

Composable means modularity 

The pandemic highlighted vulnerabilities in businesses models that for years focused on efficiency. Organizations that were once efficient suddenly became fragile at a time when they needed to be flexible. Businesses that were smart pivoted to a more modular setup, creating a composable business. Organizations were prepared for one type of future, but now must plan for multiple futures.

Read more: 8 Macro Factors That Will Shape the 2020s

Composable business means creating an organization made from interchangeable building blocks. The modular setup enables a business to rearrange and reorient as needed depending on external (or internal) factors like a shift in customer values or sudden change in supply chain or materials. 

The 4 principles of composable business

The idea of composable business operates on four basic principles: 

More speed through discovery

Greater agility through modularity

Better leadership through orchestration 

Resilience through autonomy

This type of thinking enables a business to survive, and even flourish, in times of great disruption. From a technical perspective, this type of composability is not new to CIOs. It exists in familiar technology, from APIs to containers. But, it is a new, or perhaps ignored, idea for a CIO’s business counterparts and board of directors. Composable business requires a foundational change in business thinking, architecture and technology. 

The building blocks of composable business

The three building blocks of composable business are: 

Composable thinking keeps you from losing your creativity. Anything is composable. When you combine the principles of modularity, autonomy, orchestration and discovery with composable thinking, it should guide your approach to conceptualizing what to compose, and when.

Composable business architecture ensures that your organization is built to be flexible and resilient. It’s about structure and purpose. These are structural capabilities — giving you mechanisms to use in architecting your business.

Composable technologies are the tools for today and tomorrow. They are the pieces and parts, and what connects them all together. The four principles are product design goals driving the features of technology that support the notions of composability.

When combined with the principles, the building blocks of composable business enable organizations to pivot quickly. For example, a Chinese appliance manufacturer pivoted from making dishwashers and wine coolers to distributing critical medical equipment during the pandemic. The company flexed beyond its core competencies, listened to what customers needed at the time and used its platform to move from an idea to a product launch. 

The building blocks of composable business enable organizations to pivot quickly

The more these composable business ideas are integrated within your business model, the more flexibility and agility your organization will have. That means faster response time and more consistency in execution for this new type of business setup. 

Leverage existing technologies

Organizations that embraced — and continue to embrace — the building blocks and principles of composable business were able to successfully leverage existing digital investments and, in the best-case scenario, accelerate investment.

“Sixty-nine percent of corporate directors want to accelerate enterprise digital strategies and implementations to help deal with the ongoing disruption,” said Tina Nunno, Distinguished VP Analyst, Gartner. “For some enterprises that means that their digital strategies become real for the first time, and for others that means rapidly scaling digital investments.”

These strategies will help organizations handle the global industrywide volatility that will continue well into next year. 

“You, the CIOs, can contribute to the evolution of a more powerful and adaptable form of business, architected to deal with continuing business disruptions — composable business,” said Nunno.

Key opportunities for CIOs

Look for the “moments of composability” and seize the opportunity they present. These moments could be geopolitical, like the pandemic and global recession, but they could also be societal, such as a change in consumer attitude. These are moments in which the CIO must recognize the need for an immediate change in the organization or risk having the business falter or fail. 

“Throughout history, great leaders have faced turmoil and turned it into inspiration,” said Don Scheibenreif, Distinguished VP Analyst, Gartner. “Composing: being flexible, fluid, continuous, even improvisational — is how we will move forward.” 

The post Gartner Keynote: The Future of Business Is Composable appeared first on Smarter With Gartner.


Caribou Digital

Platform Livelihoods

A framework to help uncover trends

Around the world, people and small businesses are working and selling in new ways, using digital platforms. News reports, heated policy debates, and perhaps especially our daily interactions with gig workers, freelancers, and virtual e-commerce storefronts all underscore a growing awareness that there is something essential and different about finding one’s way and earning a living in the platform economy.

This year, Caribou Digital has been working with Qhala and with the support of the Mastercard Foundation on a project exploring the “Quality of Youth Digital Livelihoods” in Kenya. That research, with its particular focus on how young women in Kenya experience platform work and platform sales, is well underway.

As a way to connect to the ongoing research and policy discussions, and to help create our interview guide, we read everything we could — 75+ primary research studies of platform work and platform sales in the Global South. We’re excited to report that Caribou Digital and Qhala have just released v1.0 of that review as a PDF, and as a searchable, filterable online resource in the form of an evidence map.

Underpinning these new resources is an emerging overarching framework for understanding platform livelihoods, our term for the broad umbrella encompassing both platform work and platform sales. This post, also a page on the platform livelihoods website, is our most succinct (October 2020) version of the platform livelihoods framework.

More specifically, we define platform livelihoods as

active human efforts, sometimes combined with tools or assets, deployed to create value outside of the constructs of a stable employer-employee relationship, mediated by the infrastructure and accompanying logic of digital platforms.

From this definition, and drawing on that detailed literature review of 75 studies of platform livelihoods in the Global South, this framework addresses three broad questions for the digital development community:

What are the experiences of people with platform livelihoods?

We identify twelve elements — the kinds of experiences that individuals share and value when discussing their livelihoods with friends, family, and even the occasional researcher. They are a mix of economic, subjective, and broader human development experiences.

Platform Livelihood Experience Elements

What are similarities and differences between platform livelihood types?

In the review, we offer a landscape of nine illustrative types of platform livelihood. Note that these are roles that individuals or small enterprises can fill, rather than “business models” or the names of specific platforms.

These are not the only roles that platforms are transforming or enabling, but these nine represent enough of the diversity in platform livelihoods to make two key distinctions. These types mix local and global (digital only) markets. Some of these roles are for individuals seeking work and offering their labor. Some of these roles are for small enterprises and even small farms, looking for new sales channels and new ways to connect with markets.

The early research and policy literature has been concentrated in platform work, especially ride-hailing, freelancing, and microwork. We feel strongly that platform work needs continued research and policy attention, as millions turn to platform labor amidst changing economies and a global pandemic.

At the same time, and in the longer run, platform sales (whether via marketplaces, social commerce, or search and discovery) may end up altering the livelihoods of millions more.

This lens suggests that these two growing domains, platform work and platform sales, are supported by overlapping affordances of platformization — indeed, it is often the same platforms supporting job seekers and microentrepreneurs alike. The lens invites a synthesis of insights for policy and practice that draws on both domains, while retaining space for more comparative or specific inquiry into one livelihood type at a time.

Platform Livelihood Types

What kinds of technologies and business models support platform livelihoods?

Individuals and small enterprises use a mix of platform services and approaches to pursue their livelihoods.

The most easily recognizable way is via formal digital marketplaces — the formal “multi-sided markets” in labor, goods, and services through which individuals and small firms can find customers. Informal social commerce — selling goods, services, and even ‘influence’ via personal Facebook pages, Instagram, WhatsApp, etc. — is also growing in popularity and impact, even if the research literature here is still sparse. And there is often (still) search and discovery. Millions of small businesses pay for advertising on Facebook, Google, and other social media platforms. Many work to refine how they appear on maps or other digital databases and apps. Some have websites or social media pages or storefronts of their own. These activities, too, support platform livelihoods.

These activities and services blur and intersect as digital platforms continually slice and recombine, mediating the connections between sellers and their markets in ever-changing ways.

On the one hand, the digitalization and platformization of economies seems to reward sellers and workers who have developed capabilities in more than one of these approaches.

On the other hand, it is simultaneously a critical matter for inclusive digital development policy that these mediated connections do not divide, dissuade, or discriminate against small scale sellers and the self-employed.

In the longer run, we may update the online evidence map and accompanying discussion with a per-technology filter to better reflect differences and similarities (in practice and in policy) across these technological approaches.

Platform Livelihood Practices

How to use this framework

By providing a common language and a map of several kinds of platform livelihoods, this framework can help uncover many trends and cross-cutting issues. For example, in this first iteration of the literature review we used the framework to explore four durable themes: gender, rurality, youth, and COVID-19. We also outline four emergent dynamics worthy of scrutiny: fractional work, amplification, hidden hierarchies, and contestation.

We hope that you might use this framework in your own research, design, or policymaking activities. All the materials in this framework are licensed under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International. We ask only that you provide attribution if this turns out to be useful to your research.

Platform Livelihoods was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Lower Your Collateral Requirements & Loan Interests with OScore

Get your OScore today!

OScore, the first credit score for the decentralized world, built with Ontology’s Decentralized Identity, has officially launched. For simplicity, consider OScore an on-chain reputation system that supports cross-chain interaction with verifiable credentials on the Ontology blockchain, eliminating third-party verification programs from the process. Once a user authorizes their digital financial data, Ontology’s OScore system automatically generates a quantifiable credit score while ensuring the user’s privacy remains fully protected and under their own control.

To begin, your OScore will be available once you log in and connect your digital wallets and exchanges — this allows the system to give you a real-time assessment of your OScore. Within the minimal and clean dashboard, users can authenticate assets to maintain a consistent OScore in accordance with their historical holdings, transactions, and current collection of assets.
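Ontology has not published OScore’s exact formula, so purely as an illustration of the idea of scoring authorized on-chain data, here is a toy Kotlin sketch; the inputs, weights, and caps are all invented.

data class AssetSnapshot(val holdingsUsd: Double, val txCount: Int, val loanDefaults: Int)

// Toy score out of 100: rewards historical holdings and activity, penalizes defaults.
fun toyOScore(s: AssetSnapshot): Int {
    val holdingScore = minOf(s.holdingsUsd / 1_000.0, 50.0)  // capped at 50 points
    val activityScore = minOf(s.txCount * 0.5, 30.0)         // capped at 30 points
    val penalty = s.loanDefaults * 20.0                      // each default costs 20 points
    return (holdingScore + activityScore - penalty).coerceIn(0.0, 100.0).toInt()
}

fun main() {
    println(toyOScore(AssetSnapshot(holdingsUsd = 25_000.0, txCount = 120, loanDefaults = 0))) // 55
}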

Your Dashboard

In the world of Decentralized Finance, or DeFi, a credit-based scoring system that is both transparent and accurate has the potential to revolutionize not just individual users’ interactions amongst each other, but also the sector as a whole.

“…These perks may include things like being able to under-collateralize assets…”

A credit-based system in the DeFi space will bridge services with users by authenticating a certain level of trust, so everyone can enjoy dApps and DeFi projects with peace of mind. Users with a higher OScore will enjoy positive externalities as a result. These perks may include things like being able to under-collateralize assets, allowing for higher levels of liquidity and more flexible lending decisions for users who have consistently upheld their credit score.

“…users can earn WING tokens as rewards for punctually returning borrowed assets or loans…”

In addition, users can earn WING tokens as rewards for punctually returning borrowed assets or loans, as extra WING tokens will be distributed to reinforce positive behavior. This type of reinforcement can easily be replicated for other DeFi projects or tokens, as long as they have OScore integrated into their systems. To take things one step further, users can also withdraw any assets connected through their wallet directly from the dashboard, removing a tedious step from the user journey for those who wish to withdraw assets. All the features of OScore can be enjoyed by logging in via a web browser or through a user’s ONTO Wallet.

“…users can withdraw any assets which are connected through their wallet directly from the dashboard as well…”

A user’s OScore will also allow for the refinancing and restructuring of mortgages and credit lending where users who demonstrate proper behavior in their lending and repayment practices may also be rewarded with lower interest rates and more generous financing options. A second key component of OScore is the underlying Decentralized Identity solution which protects not just the user’s identity but also their data — granting them full control over who may or may not access it. Users can bind their OScore to their ONT ID just as they could with their real names. We have previously written about the drastically wide array of use cases for ONT ID, so we won’t go into too many details here.

When combined, an OScore becomes a powerful indicator of a user’s trust variable — which essentially is what a credit score is designed for. Just like traditional finance, DeFi can’t reach its full potential without credit. The good news is that with OScore, DeFi platforms like Wing.Finance can improve their underlying market mechanisms to provide users more usability, ultimately leading to a more mature and transparent market for the borrowing, lending, and insuring of digital assets.

The goal behind this launch is to redefine trust in the age of blockchain and to elevate hundreds of millions of users to new financial heights. Not only will OScore protect a user’s data privacy while providing full credit histories, but it will also be applicable across an array of real-life applications, both offline and on-chain.

With the launch of OScore, Ontology’s mission to change the way we manage and transfer data while increasing security and protection will be realized one step further. As stated by our founder, Li Jun — “Ontology is providing a solution that connects users’ assets to their identity, providing increased security and trust to all parties, connecting the missing pieces of decentralized finance’s ecosystem.” A credit score, tied in with ONT ID and the ONTO Data Wallet, exemplifies the commitment that we envision.

Find Ontology elsewhere

Ontology website / Ontology GitHub / ONTO website / OWallet (GitHub)

Telegram (English) / Discord

Twitter / Reddit / Facebook / LinkedIn

Lower Your Collateral Requirements & Loan Interests with OScore was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

ForgeTalks: How to Address Identity Governance Fatigue

Welcome back to another episode of ForgeTalks. This week I met with ForgeRock Senior Director for Product Marketing, Tim Bedard, to discuss how organizations can address their identity governance fatigue. Because of legacy identity governance and administration (IGA) limitations, IT and security teams are exhausted from manually reviewing and approving access requests. These organizations need an identity model that provides visibility into who has access to what and why, eliminating these manual processes. 

In this episode we'll dive into:

Why are organizations suffering from identity governance fatigue?
How can we address legacy IGA solution gaps?
What are the benefits of using ForgeRock Autonomous Identity?

I hope you enjoy this episode of ForgeTalks. Coming up next: tune in to part 1 of our season finale with ForgeRock CEO, Fran Rosch, and titan of the cybersecurity world, Dave DeWalt. And if you want to check out any of our previous episodes you can do so here.


Authenteq

General Data Protection Regulation – and all that’s behind it

When most people think of GDPR, they are reminded of the onslaught of emails they received after it passed. Any company or newsletter they had ever subscribed to–and yes, we really mean ever–was reaching out to them to reconfirm they wanted to stay on the mailing list and sharing their updated privacy policies. Companies they had perhaps long since forgotten about, companies they weren’t sure they had ever even subscribed to. But while most people today are familiar with the term, they don’t fully understand the legislation behind it. 

What is GDPR?

GDPR stands for General Data Protection Regulation. It is a European regulation that was approved by the EU Parliament in 2016 and came into effect in 2018. At its core, it has three main purposes. 

Harmonize – data privacy laws across Europe
Protect – and empower all EU citizens’ data privacy
Reshape – the way organizations across the region approach data privacy

The aim of the GDPR is to help EU citizens have full control over their personal data and to provide more transparency into the data collection and usage processes. 

Under the GDPR, businesses must safeguard their customers’ personal data and use the highest possible privacy settings by default. Data must not be processed or made publicly available without the individual’s explicit and informed consent.

“Data controller” and “data processor” are the most important terms used in GDPR regulations and it’s important to clearly distinguish one from the other.

A data controller: determines the purpose and means of processing personal data
A data processor: is responsible for processing data on behalf of the data controller

Why Does it Matter? 

According to the European Commission, the value of personalized data will be 1 trillion euros, almost 8 percent of the EU’s GDP, by the end of 2020. Personal data is incredibly valuable, with many of the tech giants likening it to the new oil. With so much personal data on the internet, and so many companies looking to capitalize on it, the GDPR is a landmark piece of legislation.

Personal data can include anything and everything from someone’s full name, birthday, address both physical and IP, geotags, religion, race, political views, passport number, health records, and social media profiles. 

While it is an EU regulation, it has a global reach. The GDPR applies to any business that handles personal data of EU citizens, even if that business is located outside of Europe. Within the GDPR framework, controllers of personal data must implement appropriate technical and organizational measures to comply with the seven data protection principles.

Oh, and before we get into the principles: the fines are significant as well. The GDPR has generated an estimated €114 million in fines in just two years. Under the GDPR, companies can be fined up to €20 million or four percent of their annual revenue, whichever is higher. For less serious violations, the fine is capped at €10 million or two percent of annual revenue, which is still no laughing matter. 
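As a quick sanity check of those caps, the “whichever is higher” rule is easy to express in a few lines of Kotlin; the function name is ours, not anything defined by the regulation.

// Returns the maximum possible GDPR fine in euros for a given annual revenue.
fun gdprFineCapEur(annualRevenueEur: Double, seriousViolation: Boolean): Double {
    val (flatCap, revenueShare) =
        if (seriousViolation) 20_000_000.0 to 0.04 else 10_000_000.0 to 0.02
    return maxOf(flatCap, annualRevenueEur * revenueShare)
}

fun main() {
    // A company with €2 billion in annual revenue faces up to €80 million for a serious violation.
    println(gdprFineCapEur(2_000_000_000.0, seriousViolation = true)) // 8.0E7
}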

The Seven Key Principles of GDPR

Lawfulness, Fairness, Transparency

Data must be obtained, stored, and processed on a lawful basis, and the individual must be fully informed regarding the purposes, the means, and the time period of data processing. Data controllers and data processors must stay true to their privacy promises.

Purpose Limitation 

The purpose of data collection must be specified, explicit, and legitimate. Data must only be used for the purpose for which the individual’s consent was received.

Data Minimization 

GDPR is designed to reduce data collection to the minimum required to fulfill the purpose. The data collected must be adequate, relevant, and limited to what is necessary in relation to the purposes for which it is processed.

Accuracy

Personal data must be accurate and up to date. Any old or outdated data should be erased.

Storage Limitation 

Personal data must not be kept for longer than required by the legitimate business purpose to which the individual consented.

Integrity and Confidentiality 

Personal data must be handled in a way that ensures appropriate security and is protected against unlawful processing, accidental loss, destruction, or damage.

Accountability 

All policies that govern the collection and processing of data must be documented in a thorough manner, and in a way that can demonstrate full GDPR compliance to authorities when called upon.
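To see how a few of these principles might translate into code, here is a small illustrative Kotlin sketch, entirely our own invention rather than anything prescribed by the regulation, that ties consent to a single purpose (purpose limitation), expires data (storage limitation), and documents every decision (accountability).

import java.time.Instant
import java.time.temporal.ChronoUnit

data class ConsentRecord(
    val subjectId: String,
    val purpose: String,     // purpose limitation: consent is tied to one purpose
    val grantedAt: Instant,
    val retentionDays: Long  // storage limitation: data must expire
) {
    fun isValidFor(requestedPurpose: String, now: Instant = Instant.now()): Boolean =
        requestedPurpose == purpose &&
            now.isBefore(grantedAt.plus(retentionDays, ChronoUnit.DAYS))
}

// Accountability: every processing decision is documented.
val auditLog = mutableListOf<String>()

fun process(record: ConsentRecord, requestedPurpose: String): Boolean {
    val allowed = record.isValidFor(requestedPurpose)
    auditLog += "subject=${record.subjectId} purpose=$requestedPurpose allowed=$allowed at=${Instant.now()}"
    return allowed
}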

______________________________________________________________________________

Authenteq’s Compliance 

Our compliance with the EU GDPR regulation was created in cooperation with, and is routinely reviewed by, expert legal professionals from the German branch of the global law firm DWF (www.dwf.law).

______________________________________________________________________________

How to Comply with the GDPR

GDPR is actually a good thing overall, even if it feels intimidating to tackle. More control over personal data increases end-users’ confidence in your product and in big data in general. The seven key principles can be translated into a few simple, actionable steps. You can also follow the GDPR checklist for data controllers.

Gather clear, explicit consent. Keep your terms and conditions and privacy policies readable. Make it easy for anyone to withdraw from your service at any time.
Communicate breaches. Even the world’s largest companies can be hacked, and while the initial reaction to a breach may be to deal with it behind closed doors, breaking this rule and failing to report a breach will result in a fine. According to the GDPR, you have 72 hours to report a breach so build that directly into your security protocols and measures. 
Provide access. If a customer or user requests access to their data, which they may do at any time, you must be able to provide them with a clear and detailed copy that outlines which data you have collected, where it has been stored, and how it has been used. And you must provide this for free. 
Forget them. Not only can personal data be requested at any time, but people can also request that it be deleted. While there are some specific grounds on which you could deny this request, generally you have one month to honor these requests. 

_____________________________________________________________________________

We will be putting out a comprehensive downloadable version of this guide with what these definitions mean to Authenteq. Follow us on LinkedIn to make sure you get first access! 

The post General Data Protection Regulation – and all that’s behind it appeared first on Identity Verification & KYC | Authenteq.


MyKey

MYKEY Weekly Report 21 (October 12th~October 18th)

Today is Monday, October 19, 2020. The following is the 21st issue of MYKEY Weekly Report. In the work of last week (October 12th to October 18th), there are mainly 5 updates:

1. Welcome to mine in MYKEY Defibox

From 18:00 on October 11th to 18:00 on October 25th (UTC+8), the weight of Defibox Swap was adjusted. The weight of ‘EOS+KEY’ mining has been increased to 1.0.

2. HashKey Hub&MYKEY launched the new period of BTC financial products

MYKEY and third-party partner HashKey Hub launched a new round of 30-day BTC financial products at 5% on October 13, 2020. Both parties will further deepen their cooperation and jointly explore the development of digital currency financial products.

3. MYKEY Lab added 1 billion KEY tokens to the “multi-chain exchange pool”

Due to demand for KEY tokens on multiple chains, MYKEY Lab locked 1 billion KEY tokens on EOS into the exchange pool on October 14. For details, read: https://bit.ly/3k9lFn6

4. The token AAVE was listed on MYKEY Swap this week

Go to [Home] — [Swap] in MYKEY to experience lightning-fast, smooth swaps!

5. Ricky was live on bihu.com as the host on October 15

Ricky, the co-founder of MYKEY, was live on bihu.com and talked about the CoFiX mining guide at 8:00 p.m. (UTC+8) on October 15.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall the MYKEY app; please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to back up the 12-word recovery phrase from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 21 (October 12th~October 18th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Otaka

Create a Secure Ktor Application with Kotlin

In this tutorial, you will build your very own Nano Blogging Service (nabl for short) using a modern JVM stack. This includes using the Kotlin programming language, the Ktor web framework, and securing it with Okta. Users can log in or sign up, post updates, and browse specific or global chronological feed without advertisements. The blogging service displays posts from the selected user or everyone in the chronological feed.

Kotlin is often considered a “better Java” and becomes an easy, efficient substitution because it has excellent Java interoperability. That allows you to employ the largest ecosystem of existing JVM frameworks and libraries, written and designed for Java, in your Kotlin application, and vice-versa. Kotlin works well with Spring Boot, Jersey, Dropwizard, and many others. “Kotlin-native” frameworks provide first-class language support, offer additional type-safety not available in the Java world, and often give competitive advantages.
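As a trivial illustration of that interoperability (this snippet is not part of the tutorial project), Kotlin can call the plain Java NIO API directly, with no wrappers or bindings:

import java.nio.file.Files

fun main() {
    // java.nio.file.Files is a plain Java API, called from Kotlin as-is.
    val path = Files.createTempFile("interop", ".txt")
    Files.write(path, listOf("Hello from Kotlin via a Java API"))
    println(Files.readAllLines(path).first())
}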

Ktor is one of the most prominent “Kotlin-native” web frameworks officially supported by JetBrains, creators of the Kotlin language, and IntelliJ IDEA. It’s an unopinionated highly-customizable modular framework that gives developers full control over implementation while providing sensible defaults.

Prerequisites

Computer with installed Java 8+, Git, bash-like command line
Familiarity with Java or Kotlin
Your favorite IDE, for instance, IntelliJ IDEA Community Edition
A Free Okta Developer account
15 mins of your time

Table of Contents

Build a Ktor Application with Kotlin
Ktor Project Structure
Start your Ktor Application
Secure Your Ktor Application with Okta
Configure Ktor’s OAuth 2.0 Module
Setup a Ktor Authentication Module
Sign in with the /login endpoint
Authorization endpoint /login/authorization-callback
Logout endpoint /logout
Start Your Kotlin + Ktor Application
Logout with Okta
Manage Users With Okta
Enable User Registration
Ktor Secure App Code Review
Ktor Data Layer
Ktor Main Application Configuration
Ktor Service Routes
Type-Safe Views with Kotlin
Learn More about Ktor and Kotlin

Build a Ktor Application with Kotlin

As with any web application framework, Ktor provides several libraries and imposes some conventions. Don’t worry—it doesn’t tell you how to write your code. The conventions are mostly for the HTTP layer and you’re free to write other lower layers the way you want. A few of the most notable things include:

The web application is a pipeline processing incoming requests through features and route handlers.
Request handling is non-blocking; it relies on Kotlin coroutines.
The configuration file format is HOCON.
The framework employs a DSL for top-level declarations, e.g., modules setup, routing, etc.
Pluggable features are configured using install(FeatureObject) { config }.
Most of the functions and properties you use are extension functions.

Ktor Project Structure

The application in this example depends on several libraries:

Kotlin programming language you use for this project
Ktor server with Ktor server CIO - server implementation and coroutine-based HTTP engine core
Ktor client with Ktor client CIO - client used to communicate to OAuth2 server
Ktor Auth module to handle authorization flow
kotlinx.html set of classes allowing to write type-safe HTML generators
Okta JWT Verifier library helps to parse and verify access and id tokens

You can bootstrap this tutorial by cloning our Git repository and starting with the initial branch:

git clone -b initial https://github.com/oktadeveloper/okta-kotlin-ktor-example.git
cd okta-kotlin-ktor-example

NOTE: If you want to see the completed app, with Okta already integrated, you can check out the main branch. See the project’s README for instructions on how to configure it to work with your Okta account.

Start your Ktor Application

Use the IntelliJ runner or type ./gradlew run on the command line to start your application, then point your web browser to http://localhost:8080.

All the messages displayed are from the in-memory database. Note that at this stage, the user can’t log in; hence they can’t post messages.

Secure Your Ktor Application with Okta

Real-world applications often require users to log in to perform some actions or access information. User management and security are much more complicated than they might seem and it can be tough to make them right. If you have done it previously, you know what I’m talking about.

User management shouldn’t take much of your time because that problem is solved already, right? In this tutorial, you’ll be using Okta’s OAuth 2.0 authorization service along with OpenID Connect (OIDC). Okta provides many features for both enterprise and personal project needs - MFA, SAML, groups, policies, social media logins, and many more. We offer solutions for companies of all sizes - from pet projects just for yourself to big enterprises such as FedEx, Box, HubSpot, Experian, and many others. Okta helps developers implement secure authentication, handles authorization, and can act as an identity provider with minimal effort and lines of code.

If you haven’t created an Okta account yet, sign up first. It’s free, no credit card required.

Log in to the Okta admin console. On the top menu, select Applications → Add Application:

Then, configure your Okta application. Don’t worry, if you want to change anything it’s always possible to return to this screen. At the very least, you need to set the following settings:

Name - give it a meaningful name, for instance, My Ktor nano Blogging Service
Base URIs - put http://localhost:8080/ there. Multiple URIs can be provided; you can add more URIs if needed.
Login redirect URIs - set it to http://localhost:8080/login/authorization-callback. Upon successful login, the user will be redirected to the URI provided, with tokens in the query.
Logout redirect URIs - the value http://localhost:8080 allows you to provide a redirect URL on successful logout.

Click Done to finish the initial setup.

Take note of the following three values. You’ll use them in your Ktor application:

Org URL: Hover over API on the top menu bar, and select Authorization Servers menu item, copy the value from Issuer URI

Client ID and Client Secret as below:

Configure Ktor’s OAuth 2.0 Module

Ktor has an implementation of OAuth Client—it just needs to be configured. It’s always good practice to never insert any keys, tokens, or credentials directly into the code. Even for a demo project. To inject Okta parameters from environment variables, append a new block in resources/application.conf:

...
okta {
    orgUrl = ${OKTA_ORGURL}
    clientId = ${OKTA_CLIENT_ID}
    clientSecret = ${OKTA_CLIENT_SECRET}
}

To start your application from IntelliJ IDEA or any other IDE, these environment variables must be provided. In the Run/Debug Configuration dialog, click on the Environment variables and specify them as I have below.

Then, create a src/auth-settings.kt file to contain all Okta-configuration related functions.

You could also create an okta.env file with the following code:

export OKTA_ORGURL=https://{yourOktaDomain}/oauth2/default
export OKTA_CLIENT_ID={yourClientId}
export OKTA_CLIENT_SECRET={yourClientSecret}

Next, run source okta.env before running your app.

If you’re on Windows, name the file okta.bat and use SET instead of export.

Add a generic configuration class for Okta services in src/auth-settings.kt.

data class OktaConfig(
    val orgUrl: String,
    val clientId: String,
    val clientSecret: String,
    val audience: String
) {
    val accessTokenUrl = "$orgUrl/v1/token"
    val authorizeUrl = "$orgUrl/v1/authorize"
    val logoutUrl = "$orgUrl/v1/logout"
}

Create a configuration reader in src/auth-settings.kt. This takes a Config object, reads from it, and creates an OktaConfig object.

fun oktaConfigReader(config: Config): OktaConfig = OktaConfig(
    orgUrl = config.getString("okta.orgUrl"),
    clientId = config.getString("okta.clientId"),
    clientSecret = config.getString("okta.clientSecret"),
    audience = config.tryGetString("okta.audience") ?: "api://default"
)

Finally, the Ktor Auth module is expecting configuration to be passed as OAuthServerSettings.OAuth2ServerSettings. For that, you need a mapping function in src/auth-settings.kt:

fun OktaConfig.asOAuth2Config(): OAuthServerSettings.OAuth2ServerSettings =
    OAuthServerSettings.OAuth2ServerSettings(
        name = "okta",
        authorizeUrl = authorizeUrl,
        accessTokenUrl = accessTokenUrl,
        clientId = clientId,
        clientSecret = clientSecret,
        defaultScopes = listOf("openid", "profile"),
        requestMethod = Post
    )

Setup a Ktor Authentication Module

All authentication configuration and handling happen inside the setupAuth() function of src/auth.kt file. Start filling it with configuration. Use oktaConfigReader() to read configuration from the application file. Then, install the Authentication feature and configure it to use OAuth, provide it a redirect callback, the Okta OAuth2 configuration, and a default HttpClient for the Ktor OAuth client features.

package com.okta.demo.ktor

import com.typesafe.config.ConfigFactory
import com.okta.jwt.JwtVerifiers
import io.ktor.application.*
import io.ktor.auth.*
import io.ktor.client.*

fun Application.setupAuth() {
    val oktaConfig = oktaConfigReader(ConfigFactory.load() ?: throw Exception("Could not load config"))
    install(Authentication) {
        oauth {
            urlProvider = { "http://localhost:8080/login/authorization-callback" }
            providerLookup = { oktaConfig.asOAuth2Config() }
            client = HttpClient()
        }
    }
}

To ensure that the tokens provided are valid, they need to be verified. This can be done using the Okta JWT Verifier library. Construct access token and ID token verifiers as follows:

val accessTokenVerifier = JwtVerifiers.accessTokenVerifierBuilder()
    .setAudience(oktaConfig.audience)
    .setIssuer(oktaConfig.orgUrl)
    .build()

val idVerifier = JwtVerifiers.idTokenVerifierBuilder()
    .setClientId(oktaConfig.clientId)
    .setIssuer(oktaConfig.orgUrl)
    .build()

Next, configure three login-specific endpoints. Ktor DSL assumes the following structure:

fun Application.setupAuth() {
    ...
    routing {
        authenticate {
            // Okta calls this endpoint providing accessToken along with requested idToken
            get("/login/authorization-callback") {
                // ⚫ handle authorization
            }
            // When a guest accesses /login, it automatically redirects to the Okta login page
            get("/login") {
                // ⚫ perform login
            }
        }
        // Perform logout by cleaning cookies
        get("/logout") {
            // ⚫ perform logout
        }
    }
}

Sign in with the /login endpoint

It’s the easiest one. Ktor will require user authentication for all endpoints located within the authenticate block. If a user is not authenticated, they will be redirected to the authorization URL. Its value is taken from the authorizeUrl property from OktaConfig.

Since the Ktor Auth module is handling this itself, the implementation is a single line. The condition checks if a visitor has a session and, if so, redirects them to the root of the website:

// When a guest accesses /login, it automatically redirects to the Okta login page
get("/login") {
    call.respondRedirect("/")
}

Authorization endpoint /login/authorization-callback

Upon successful authorization, the user is redirected to this URL. The Okta authorization service provides access and ID tokens as part of the login flow. If unsure, read our Illustrated Guide to OAuth and OIDC.

To extract information about the user (that is, to parse the JWT), you can use Okta’s JWT Verifier. In the code below, the user’s name is taken from the token’s claims and “slugified” to create a URL-safe alphanumeric username. Finally, a new session is created and the user is redirected to /.
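As a quick illustration of that slugification step (the sample name here is invented), the same regex the handler below uses strips everything but ASCII letters and digits:

fun main() {
    val fullName = "Better Co. — R&D Team"  // invented sample value
    val slug = fullName.replace("[^a-zA-Z0-9]".toRegex(), "")
    println(slug) // prints "BetterCoRDTeam"
}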

// Okta calls this endpoint providing accessToken along with requested idToken
get("/login/authorization-callback") {
    // Get a principal from the OAuth2 token
    val principal = call.authentication.principal<OAuthAccessTokenResponse.OAuth2>()
        ?: throw Exception("No principal was given")
    // Parse and verify the access token with OktaJwtVerifier
    val accessToken = accessTokenVerifier.decode(principal.accessToken)
    // Get idTokenString, parse and verify the id token
    val idTokenString = principal.extraParameters["id_token"]
        ?: throw Exception("id_token wasn't returned")
    val idToken = idVerifier.decode(idTokenString, null)
    // Try to get a handle from the id token, or fall back to the subject field in the access token
    val fullName = (idToken.claims["name"] ?: accessToken.claims["sub"] ?: "UNKNOWN_NAME").toString()
    println("User $fullName logged in successfully")
    // Create a session object with the "slugified" username
    val session = UserSession(
        username = fullName.replace("[^a-zA-Z0-9]".toRegex(), ""),
        idToken = idTokenString
    )
    call.sessions.set(session)
    call.respondRedirect("/")
}

Logout endpoint /logout

Users might have reasons to log out from the website—they might even simply erase cookies! Some people may consider that a little bit too technical. You can help them to do so by resetting the session on the server-side:

// Perform logout by cleaning session
get("/logout") {
    call.sessions.clear<UserSession>()
    call.respondRedirect("/")
}

Start Your Kotlin + Ktor Application

Run your application, open your browser to http://localhost:8080, and click Login from the top menu bar. You will see an Okta login screen. After you type your credentials you’ll be redirected back to the app but as a user this time. Try to send some messages!

🎉 Congratulations, you just added authorization to your service!

Logout with Okta

Did you try to t̶u̶r̶n̶ ̶i̶t̶ ̶o̶f̶f̶,̶ ̶t̶h̶e̶n̶ ̶o̶n̶ ̶a̶g̶a̶i̶n̶ logout and login again? You might observe some unexpected behavior. If you checked the “remember me” box on the Okta screen, you virtually can’t log out—or at least it looks like that.

From the user’s point of view, they expect to see a login screen inviting them to enter a login and password, not to be logged in automatically:

You might ask yourself: why is it done this way? Why doesn’t the Authorization server purge sessions?

What if you were using Facebook instead of Okta as an authorization and identity provider service, and logging out from some website also destroyed your session in Facebook? It doesn’t sound nice, does it?

If you intend to log users out from Okta as well, you’ll need to use something called RP-Initiated Logout. You can read more about it in this blog post. The basic idea is straightforward - after you remove the session inside your app, the user needs to visit a specially formed logoutUrl with the idToken provided as a GET parameter. Update your logout handler in src/auth.kt:

// Perform logout by cleaning cookies and start RP-initiated logout
get("/logout") {
    val idToken = call.session?.idToken
    call.sessions.clear<UserSession>()
    val redirectLogout = when (idToken) {
        null -> "/"
        else -> URLBuilder(oktaConfig.logoutUrl).run {
            parameters.append("post_logout_redirect_uri", "http://localhost:8080")
            parameters.append("id_token_hint", idToken)
            buildString()
        }
    }
    call.respondRedirect(redirectLogout)
}

Restart your application and try to logout. Now the application behaves as you’d expect:

Manage Users With Okta

The Nano Blogging Service is more fun when different people can log in! You can create additional users from the Okta Developer Console. From the top menu bar, click on Users, then Add Person. You’ll be presented with a dialog to add a new user:

Enable User Registration

Okta also provides a self-sign up service. You can enable it by heading to the Okta Developer Console, hovering over the Users top menu item, and selecting Registration from the sub-menu. Okta will show a single button you need to click to activate the feature:

If desired, tune the default options and save.

Then, when you try to sign in to your service, you’ll see a “Sign up” link:

Ktor Secure App Code Review

Now that you have everything working, let’s take a look at the Kotlin code that makes it all possible.

Ktor Data Layer

Look at the basic data models of your application in the src/entities.kt file:

package com.okta.demo.ktor

import java.time.LocalDateTime

data class BlogRecord(
    val userHandle: String,
    val text: String,
    val createdAt: LocalDateTime = LocalDateTime.now()
)

data class UserSession(
    val username: String,
    val idToken: String
)

The BlogRecord class contains information about the userHandle, posted text and createdAt timestamp. UserSession is an object which contains information about a currently signed in user; see the authentication section for more details.

The BlogRecordRepository class is responsible for data manipulation. For demo purposes, data is stored in memory and initialized with some dummy records at startup time.

Your data repository is in the src/BlogRecordRepository.kt file:

package com.okta.demo.ktor

class BlogRecordRepository {
    private val records = mutableListOf<BlogRecord>()

    val all: List<BlogRecord>
        get() = records

    fun insert(userHandle: String, text: String) {
        records += BlogRecord(userHandle, text)
    }

    fun byUser(userHandle: String) = records.filter { it.userHandle == userHandle }
}

val blogRecords = BlogRecordRepository().apply {
    insert("kack", "Hello world!")
    insert("kack", "Keep messages short and sweet! 💬")
    insert("ann", "OMG it's a future unikorn 🦄!")
    insert("rux", "Chronological feed! It's just like the good old days! ")
    insert("kotlin", "Wise language selection")
    insert("whitestone", "We'd like to invest 💰💰💰")
    insert("cat", "🐈🐱🙀😼😻🐾")
}

Ktor Main Application Configuration

Before you get into the route handling and login flow, the web service itself needs to be configured. As per convention, Ktor services are configured by creating an Application.module() extension function. Look at the configuration sections in src/application.kt:

package com.okta.demo.ktor

import io.ktor.application.*
import io.ktor.features.*
import io.ktor.request.*
import io.ktor.sessions.*
import io.ktor.util.*
import org.slf4j.event.Level
import kotlin.collections.set

fun main(args: Array<String>): Unit = io.ktor.server.cio.EngineMain.main(args)

@Suppress("unused") // Referenced in application.conf
@kotlin.jvm.JvmOverloads
fun Application.module(testing: Boolean = false) {
    // Sessions are stored in encrypted cookies
    install(Sessions) {
        cookie<UserSession>("MY_SESSION") {
            val secretEncryptKey = hex("00112233445566778899aabbccddeeff")
            val secretAuthKey = hex("02030405060708090a0b0c")
            cookie.extensions["SameSite"] = "lax"
            cookie.httpOnly = true
            transform(SessionTransportTransformerEncrypt(secretEncryptKey, secretAuthKey))
        }
    }

    // Respond for HEAD verb
    install(AutoHeadResponse)

    // Log each request
    install(CallLogging) {
        level = Level.INFO
        filter { call -> call.request.path().startsWith("/") }
    }

    // Configure ktor to use OAuth and register relevant routes
    setupAuth()

    // Register application routes
    setupRoutes()
}

// Shortcut for the current session
val ApplicationCall.session: UserSession?
    get() = sessions.get<UserSession>()

Your application module configures the session handler to keep data in encrypted cookies and enables logging, which is very useful for debugging. Two of the functions, setupAuth() and setupRoutes(), configure OAuth 2.0 and set up the web service routes.

Ktor Service Routes

This application registers two routes with Ktor DSL making it very expressive:

POST / takes a text parameter from the body and the current actor (user handle) from the session, and creates a new nano blog record. Both actor and text must be valid to create a new record; otherwise, an error is thrown. Upon a successful insertion, the user gets redirected to /.

GET /{username?} effectively handles all GET requests and attempts to extract the username URL parameter if present. Then, it renders the main template with either the global or the requested user’s feed using the feedPage() method.

See src/routes.kt:

package com.okta.demo.ktor

import io.ktor.application.*
import io.ktor.html.*
import io.ktor.request.*
import io.ktor.response.*
import io.ktor.routing.*

fun Application.setupRoutes() = routing {
    post("/") { root ->
        val actor = call.session?.username ?: throw Exception("User must be logged in first")
        val text = call.receiveParameters()["text"]?.takeIf(String::isNotBlank)
            ?: throw Exception("Invalid request - text must be provided")
        blogRecords.insert(actor, text)
        call.respondRedirect("/")
    }

    get("/{username?}") {
        val username = call.parameters["username"]
        call.respondHtmlTemplate(MainTemplate(call.session?.username)) {
            content {
                val canSendMessage = call.session != null
                if (username == null)
                    feedPage("🏠 Home feed", blogRecords.all, canSendMessage)
                else
                    feedPage("👤 ${username}'s blog", blogRecords.byUser(username), canSendMessage)
            }
        }
    }
}

The page-rendering function, feedPage(), takes three parameters: page title, list of the nano blog posts to render, and a boolean flag canSendMessage (if it’s true, the text submission form will be visible). The variable canSendMessage is set to true only when the current user has an active session, that is possible only after login.

Type-Safe Views with Kotlin

Kotlin syntax empowers developers to create type-safe DSL. This Nano Blogging Service is using the kotlinx.html library, which provides HTML-like syntax for HTML-rendering. All the views are in the src/views.kt file.

The primary and only template MainTemplate includes Bootstrap CSS library, renders the top navbar menu, and provides a basic layout for the frontend:

/**
 * Generic web page template, contains content placeholder where
 * content should be placed
 */
class MainTemplate(private val currentUsername: String? = null) : Template<HTML> {
    val content = Placeholder<HtmlBlockTag>()

    override fun HTML.apply() {
        head {
            title { +"Nano Blogging Service" }
            styleLink("https://stackpath.bootstrapcdn.com/bootstrap/4.5.2/css/bootstrap.min.css")
            meta(name = "viewport", content = "width=device-width, initial-scale=1, shrink-to-fit=no")
            meta(charset = "utf-8")
        }
        body("d-flex flex-column h-100") {
            header {
                div("navbar navbar-dark bg-dark shadow-sm") {
                    div("container") {
                        a(href = "/", classes = "font-weight-bold navbar-brand") {
                            +"📝 𝓝𝓐𝓝𝓞 𝓑𝓛𝓞𝓖𝓖𝓘𝓝𝓖 𝓢𝓔𝓡𝓥𝓘𝓒𝓔"
                        }
                        div("navbar-nav flex-row") {
                            if (currentUsername != null) {
                                a(href = "/${currentUsername}", classes = "nav-link mr-4") { +"Hello, $currentUsername" }
                                a(href = "/logout", classes = "nav-link") { +"Logout" }
                            } else {
                                div("navbar-text mr-4") { +"Hello, Guest" }
                                div("navbar-item") { a(href = "/login", classes = "nav-link") { +"Login" } }
                            }
                        }
                    }
                }
            }
            main("flex-shrink-0 mt-3") {
                div("container col-xs-12 col-lg-8") {
                    insert(content)
                }
            }
        }
    }
}

Confused about the plus (+) sign in front of strings inside HTML elements? Don’t worry, it’s just a shortcut for the text() function, which sets the current tag’s content.
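For example, assuming the kotlinx.html stream renderer is on the classpath, the two forms below produce identical markup:

import kotlinx.html.*
import kotlinx.html.stream.createHTML

fun main() {
    val markup = createHTML(prettyPrint = false).div {
        h5 { +"Hello" }      // unary plus operator
        h5 { text("Hello") } // the equivalent explicit call
    }
    println(markup) // <div><h5>Hello</h5><h5>Hello</h5></div>
}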

View blocks such as feedBlock(), sendMessageForm() and feedPage() are extension functions (I know, there’s a lot of them!) on FlowContent. That prevents global scope pollution with enormous HTML DSL elements and provides better encapsulation.

/**
 * Displays feed block only
 */
fun FlowContent.feedBlock(feedItems: List<BlogRecord>) {
    feedItems.forEach { record ->
        div("entity card m-4") {
            div("w-100 card-header") {
                h4("user font-weight-bold mb-0 pb-0 d-inline-block") {
                    a(href = "/${record.userHandle}") { +record.userHandle }
                }
                span("float-right text-secondary") {
                    +record.createdAt.format(timeFormatter)
                }
            }
            div("card-body") {
                h5 { +record.text }
            }
        }
    }
}

/**
 * Renders send message form
 */
fun FlowContent.sendMessageForm() {
    form("/", encType = applicationXWwwFormUrlEncoded, method = post) {
        div("mb-3") {
            div("input-group") {
                input(classes = "form-control", name = "text") {
                    placeholder = "Your nano message"
                    required = true
                    autoFocus = true
                }
                div("input-group-append") {
                    button(classes = "btn btn-success") { +"Send! 🚀" }
                }
            }
        }
    }
}

/**
 * Renders feed page with given title and records
 */
fun FlowContent.feedPage(title: String, records: List<BlogRecord>, canPostMessage: Boolean) {
    if (canPostMessage) sendMessageForm()
    hr { }
    h2("text-center") { +title }
    feedBlock(records.sortedByDescending(BlogRecord::createdAt))
}

Learn More about Ktor and Kotlin

Congratulations on finishing this tutorial! You built a Kotlin and Ktor-based Nano Blogging Service secured with OAuth 2.0.

The source code for this tutorial is available on GitHub in the oktadeveloper/okta-kotlin-ktor-example repository.

If you liked this post, you might like these others too:

What the Heck is OAuth?
A Quick Guide to OAuth 2.0 with Spring Security
Deploy a Secure Spring Boot App to Heroku

Make sure to follow us on Twitter and subscribe to our YouTube Channel so that you never miss any of our developer content!

Sunday, 18. October 2020

KuppingerCole

KuppingerCole Analyst Chat: The European Cybersecurity Month - Ensuring Security Beyond Awareness

This analyst chat episode is the 50th and therefore a bit different. This time Matthias talks to two experienced analysts, Martin Kuppinger and Alexei Balaganski, about the ECSM, the European Cyber Security Month, which provides information and raises awareness about cybersecurity in October 2020. The particular aim they pursue is to go beyond awareness and arrive at specific measures that can benefit individuals and organizations alike.




Meeco

October Digital Events

This month, Meeco invites you to join us live for a number of digital events, or circle back when you have time and catch-up on the podcasts and webinars featured here.

Digital Catchup

Let’s Talk About Digital Identity with Katryna Dow, founder & CEO of Meeco – Ubisecure Podcast Episode 30

Oscar Santolalla talks to Katryna Dow about her career (including inspiration from Minority Report), Meeco’s personal data & distributed ledger platform, the importance of data minimisation to inspire trust in organisations, and cultural differences in attitudes towards digital identity.
The greatest way to overcome this privacy paradox is transparency.

– Katryna Dow, Meeco
The Data Sovereignty Now Movement

Don’t worry if you missed the 15 October webinar about ‘Data Sovereignty Now: unleashing the benefits of data in line with European values’. You can watch the recorded version of the webinar via this link.

Drawing on the many years of experience within our network of like-minded organisations, we have developed our own, practical version of how European regulatory intervention can shape the future of the digital economy and reduce our dependence on the platforms run by the increasingly dominant international Big Tech giants.

We believe that it is possible to create more room for innovation, fair competition and privacy without compromising the availability and quality of interconnected services, while staying true to European values. Structured and mandated data sovereignty forms the basis for this approach.

We regard data sovereignty as functionally extending the rights that are already defined in the GDPR and as building upon European legislation such as eIDAS and PSD2.

Recorded Thursday 15 October 2020, featuring: Arief Hühn, researcher at FreedomLab, Eric Pol, founder of A New Governance and Katryna Dow, CEO & Founder of Meeco. 

The panel discussion featured the following experts: Eric Pol, Katryna Dow, Jaana Sinipuro (project director at Sitra), Gerard van der Hoeven (director of the iSHARE Foundation) and Douwe Lycklama (founder of INNOPAY).

Live Digital Events

KuppingerCole Customer Technology 2020
Wednesday 21 October 2020 – 12:40-13:00 CET
Human-centric Data & Identity Management – Implementing Real World Case Studies

Earlier in 2020, the European Union published “A European Strategy for Data” outlining its vision for a connected single digital market where the benefits of the digital economy could enhance the lives of its citizens, residents and trade partners.

However, we now find ourselves at a very real crossroads. A post-pandemic world will be a new type of normal. Amidst the tragic loss of life, there have been breakthroughs in science and new ways of working, along with the embrace of new digital tools.

We are at the beginning of a new design and architectural phase where, just because technology can, it doesn’t mean it should. Personal data linked to identity and fuelled by AI sits at the centre of these decisions.

Enabling citizens, students, patients, passengers and consumers to more equitably join the value chains fuelled by data will ultimately lead to greater trust and personalisation, resulting in a more prosperous society. However, this will require new commercial models, enforceable regulation and the digital tools to transform our connected society.

This session will focus on the implementation of real-world case studies, including standards, commercial models and technology choices.

Keynote: Katryna Dow, Meeco Founder & CEO

KuppingerCole Customer Technology 2020
Wednesday 21 October 2020 – 13:00-13:40 CET
The Human-Centric Internet: Trustworthy, Resilient, Sustainable, Inclusive, Where are we?

During the EURODIG Conference 2018, EC Commissioner Mariya Gabriel addressed the audience with a video message describing her (and the EU’s) vision of a Human-Centric Internet.

Most, if not all, of what Mariya described has been the mission of Meeco’s founder Katryna Dow for many years; she has worked tirelessly, earnestly and with a clear vision to make this a reality.

So, get your coffee now and take this unique opportunity to discuss with Katryna the current trends and the future of what we used to call “Life Management Platforms”.

Discussion:
Joerg Resch, Co-Founder and Management Board Member – KuppingerCole
Katryna Dow, Founder & CEO – Meeco
Jo Vercammen, CTO – Meeco

Data Sovereignty Now: unleashing the benefits of data in line with European values
Webinar: Thursday 22 October 2020 – 14:15-15:15 CET

On behalf of ourselves and our partners, we are pleased to invite you to our webinar debate on ‘Data Sovereignty Now: unleashing the benefits of data in line with European values’. The webinar is especially interesting for politicians and their staff in Europe. It is the aim of the European Commission and local governments to unleash the full benefits of data usage for everyone in Europe in line with European values. Data sovereignty holds the key to achieving that.

It is a worthy ambition, but one that also raises a number of questions, such as what exactly is data sovereignty, how does it support European values, and how can organisations in the public/private sector help to accomplish this ambition?

These questions will be addressed in the webinar on 22 October.

Featuring: Arief Hühn, researcher at FreedomLab, Eric Pol, founder of A New Governance and Katryna Dow, CEO & Founder of Meeco.

Followed by panel discussion featuring the following experts: Eric Pol, Katryna Dow, Jaana Sinipuro (project director at Sitra), Gerard van der Hoeven (director of the iSHARE Foundation) and Douwe Lycklama (founder of INNOPAY).
We hope you enjoy this week of events, or catch up anytime!
Thank you, Team Meeco :)

The post October Digital Events appeared first on The Meeco Blog.

Friday, 16. October 2020

Self Key

Governance Tokens and Democratization Through DeFi

Governance tokens can help users to have better control over their assets. Such a utility has greatly increased the popularity of governance tokens and DeFi over the last year or so.

The post Governance Tokens and Democratization Through DeFi appeared first on SelfKey.


COMUNY

comuny receives the “MyData Operator 2020” award

16.10.2020. The EU data strategy calls for responsible ecosystems for personal data – MyData operators are the answer. comuny is now officially one of these trailblazing systems.

A worldwide movement of leading companies and experts is lining up for the new normal in data sharing, and has awarded the product Trust Data Operator of the German security startup comuny GmbH the notable status of “MyData Operator 2020”.

The award was presented on 14.10.2020 to seven leading companies from seven countries, all of which are committed to a balanced approach to personal data. “We are overjoyed, and this award is further confirmation of how innovative and valuable the use of our product is,” says Beatrix Reiß, who led comuny to this success together with her business partner Dominik Deimel. “Dominik and I come from the eHealth market and know from many years of experience the hurdles that models for new services, care and collaboration face. We wanted to change that.”

comuny receives this award for demonstrating how verified data can flow between companies and customers, services and solution partners, with the involvement and to the benefit of everyone involved. As a B2B component, the comuny Trust Data Operator proves the enormous business potential that arises from an operator’s responsible handling of data.

“Many today simply don’t see the opportunities that a respectful digital relationship brings: better digital processes, cost savings, higher customer satisfaction and, not least, a positive digital image for the business. That’s why we support companies here and, almost as a side effect, solve their pressing problems in customer data management,” says a delighted Dominik Deimel. comuny is currently proving this with its first customers: via the comuny Operator, access to the electronic patient record and the issuing of the electronic health card under the PDSG are possible entirely digitally. For the electronic patient record, comuny is working with one of Germany’s largest health insurers to make using the ePA as convenient as possible for its members. comuny has also been selected for the current Batch #20 of InsurLab Germany – something of a knighthood from the German insurance industry.

The seven organisations awarded the status “MyData Operator 2020” represent a wide range of countries and service offerings. Some offer broad services directly to individuals, some have focused offerings in a single industry, while others provide technology and infrastructure for third parties to enable services that are more oriented towards customers’ needs. All of them may now use the “MyData Operator 2020” logo and state that they “have been awarded the MyData Operator 2020 status by MyData Global in recognition of their work towards human-centric infrastructure for data processing and use”.

What is also unique about MyData operators is their commitment to radical collaboration for secure data flows without lock-in effects. The award winners work with one another to create a successful, interoperable ecosystem for personal data. Joss Langford, co-lead of the MyData operators thematic group, congratulates the winners “for their integrity, commitment and openness in this unprecedented exchange of information, which goes far beyond the requirements of any legislation.”

“We believe that open ecosystems, which offer people real choices between good alternatives, are the new norm for personal data and the digital services built on it. The current situation is untenable from both a human and a market perspective,” explains Teemu Ropponen, General Manager of MyData Global.

Further information

These organisations have now been awarded the status “MyData Operator 2020”: https://mydata.org/operators/
Download the white paper “Understanding MyData Operators” or follow news from all MyData operators: https://mydata.org/operators/
Join MyData Global: mydata.org/join/
Join the MyDataGermany LinkedIn group: https://lnkd.in/dF_Jrxf
What is the comuny Trust Data Operator?

#MyData #MyDataOperator #PersonalData #DataEconomy #future-of-trust #ecosystems #trust-services #operator #collaboration #MyDataGlobal #MyDataGermany #humanTec

Materials

Press release

Download as PDF: 20201016_MyData Operator_awards_COMUNY_press release_dt.pdf

Download as docx: 20201016_MyDataOperator_awards_COMUNY_press release_dt.docx

Overview of award winners (PDF): List of all the awarded MyData Operators.pdf

Overview of award winners (docx): List of all the awarded MyData Operators.docx

Images & Logos

Logos of all operators worldwide awarded on 14.10.2020

Logos of all MyData operators recognised worldwide to date

Image from MyData featuring Joss Langford (co-lead of the MyData operators thematic group and lead editor of the white paper “Understanding MyData Operators”)

Logo of the MyData “Operators” thematic group

comuny GmbH logo


Smarter with Gartner - IT

Data Sharing Is a Business Necessity to Accelerate Digital Business

Ahead of the COVID-19 pandemic, 10 large pharmaceutical companies — including Johnson & Johnson, AstraZeneca and GSK — undertook collaborative efforts to train their drug discovery machine learning (ML) algorithms on each other’s data. The goal? To accelerate drug discovery and reduce its cost. They used digital trust technologies, including blockchain, to share data without compromising confidential or commercial secrets. 

By 2023, organizations that promote data sharing will outperform their peers on most business value metrics

This rare example shows that organizations can deliver more value when they collaborate in sharing data externally — even with competitors — yielding comparatively increased value through efficiency and cost savings for each organization. 

“There should be more collaborative data sharing unless there is a vetted reason not to, as not sharing data often frustrates business outcomes and can be detrimental,” says Lydia Clougherty Jones, Senior Director Analyst, Gartner.

Data and analytics leaders who promote both internal and external data sharing are more successful in demonstrating superior team and organizational performance. In fact, Gartner predicts that by 2023, organizations that promote data sharing will outperform their peers on most business value metrics.

Yet, at the same time Gartner predicts that through 2022, less than 5% of data-sharing programs will correctly identify trusted data and locate trusted data sources.

The traditional “don’t share data unless” mindset has outlived its original purpose

Many organizations inhibit access to data, preserving data silos and discouraging data sharing. This unnecessarily undermines efforts to maximize business and social value from data and analytics — at a time when COVID-19 is driving demand for data and analytics to unprecedented levels. The traditional “don’t share data unless” mindset has outlived its original purpose.

Read more: How to Achieve Smart Data Sharing

This default must be reversed to an approach of “must share data unless.” By recasting data sharing as a business necessity, data and analytics leaders will have access to the right data at the right time, enabling more robust data and analytics strategies that deliver business benefit and digital transformation. 

While it’s not easy to change the status quo, data and analytics leaders must ask themselves what two areas to prioritize now to foster a data sharing mindset. The answer: Establishing trust-based mechanisms and preparing a data-sharing environment. 

Establish trust-based mechanisms

If you do not introduce trust throughout your data-sharing process, you cannot achieve business value from the data you collect. Gartner predicts that through 2023, organizations that can instill digital trust will be able to participate in 50% more ecosystems, expanding revenue-generation opportunities. 

Develop trust-based mechanisms that establish high levels of trust in the data source and separately in the trustworthiness of the data. This allows you to align appropriate data use with your business goals, both within and outside your organization.

It’s important to trust the quality of the data you collect, use and share to match your business context and requirements. Separately, organizations must trust their data sources so that they can rely on (and pass on to others) appropriate and enforceable rights to use, reuse, share and reshare data.

Foster a data-sharing culture — not a data “ownership” culture — by identifying the emotional impacts and inherent biases that hamper data sharing

Adopt digital trust technologies such as blockchain smart contracts, which enable a trusted data collection method, while also enabling the efficient transfer and sharing of any asset of monetary or nonmonetary value.

Overall, use data-quality metrics and augmented data catalogs to compile your data and data source trustworthiness evaluations. By 2021, organizations that offer users access to a curated catalog of internally and externally prepared data will realize 100% more business value from analytics investments than those that do not.

Preparing a data-sharing environment 

To establish a data-sharing environment, work with your business leaders across business units to create a data-sharing mindset. Foster a data-sharing culture — not a data “ownership” culture — by identifying the emotional impacts and inherent biases that hamper data sharing.

Within your IT department, distinguish your data management strategy between data warehouses, data lakes and data hubs. Gartner predicts that through 2020, organizations that adopt data hub strategies will achieve outcomes dependent on shared and governed data with at least 60% lower cost. 

Create new and flexible data management practices that adapt to uncertain and changing environments. And, drive organizational enablement of data sharing, prioritizing use cases in which increased data sharing will yield maximum alignment with business outcomes — including increased costs savings, net new revenue or nonmonetary value creation, or improved risk mitigation decision making.

The post Data Sharing Is a Business Necessity to Accelerate Digital Business appeared first on Smarter With Gartner.

Thursday, 15. October 2020

Gluu | Blog

Gluu wins 2020 Couchbase Independent Partner of the Year Award

Couchbase, the creator of the enterprise-class, multi-cloud NoSQL database, today announced at the Connect Conference that Gluu was acknowledged with the Couchbase Independent Software Vendor Partner of the Year Award. Guest judges for the Couchbase Community Awards included Carl Olofson, Research Vice President at IDC, and Jon Reed, Co-Founder of Diginomica.

Open to Couchbase’s network of more than 300 partners, the award recognizes partners’ work in bringing differentiation and innovation to customers, and leading them to successful outcomes.

The Couchbase database is used for Gluu Server authentication and authorization requirements that require extreme concurrency and “active-active” multi-datacenter replication. “Couchbase and cloud native technology has enabled Gluu for the first time to achieve true horizontal auto-scaling,” said Michael Schwartz, CEO of Gluu. “It’s a great honor to win this award, because it recognizes the sustained efforts of our combined engineering teams to make this accomplishment a reality. Couchbase has enabled Gluu to make performance into a competitive advantage. We’re crushing the competition in both transactions per second and total cost of ownership. You can’t do this stuff with ancient LDAP servers, no matter how much lipstick you put on them.”

Learn more from Gluu and Couchbase’s previous joint press release:
Open Source Gluu Server and Couchbase Shatter Billion-Login-Per-Day Threshold

“We are excited to not only continue, but to strengthen our long-standing partnership with Gluu,” said Matt McDonough, Vice President of Business Development at Couchbase. “Through Couchbase and Gluu’s cloud native approach, we’ve been able to split services and workloads to support auto-scaling easily enough to surpass the 1B authentications in a single day milestone. Together, we will continue to optimize the Couchbase platform along with Gluu’s robust, industry-proven solution for single sign-on and two-factor authentication to deliver the next billion authentications in a day.”

 

Read the press release:
http://www.globenewswire.com/news-release/2020/10/15/2109259/0/en/UPS-Tesco-and-Infosys-Among-Winners-of-Inaugural-Couchbase-Community-Awards-and-2020-Couchbase-Partner-Awards.html


One World Identity

World Bank: The Inclusion Challenge

Vyjayanti Desai, Practice Manager for ID4D and G2Px, two global, multi-sectoral initiatives of the World Bank Group, joins State of Identity to discuss The Mission Billion Challenge. The solutions-focused challenge highlights the fundamental role that digital platforms can play in helping a country effectively provide assistance to its people.


Jolocom

How KERI tackles the problem of trust

The problem of trust over a network is longstanding. It has been addressed in a variety of ways since the internet blossomed, but never as yet solved.

Take, for example, Certificate Authorities (CAs), the organizations with the power to issue certificates to websites so browsers will recognize them as legitimate. These certificates function as a primitive identification for websites, where the CA is attesting to the identity of the website and the website can use this certificate to identify itself to browsers – because all browsers trust the CAs.

The flaw here though is obvious: what if a CA is compromised and begins issuing false certificates? No set of keys can be considered absolutely immune to compromise, no matter how powerful the controller.

The central problem here is that of trustless key distribution and attribution (more specifically, the implementation of a Distributed Key Management System (DKMS). That is, how do we know that a key is truly controlled by an identifier and, given an identifier, how can we know which keys it controls? (It’s important to note here that CAs are not trustless mechanisms, but that browsers rely entirely on trusting that a CA has not been compromised.)

Decentralized digital identity using the technology of Decentralized Identifiers (DIDs) appears to remedy the issue. However, this collection of related specifications only defines a consistent interface for different DID methods to implement a DKMS in any way they see fit, as well as associated semantics to make them very flexible and useful.

The exact way that key distribution and attribution is addressed differs between DID methods. Distributed ledger technologies (DLTs) and blockchains have proven effective for designing decentralized DID methods, and even centralized trust-requiring systems, such as GitHub or Twitter, have been used.

Most DID methods are still fairly young, but they often share certain characteristics:

Central registry-based: the infrastructure might be decentralized (e.g. a blockchain or other DLT), however the DID method is informationally centralized, i.e. registration and resolution involve the updating or reading of the shared state of a single network or infrastructure.
Canonicity: the central registry is considered to be the single trustworthy source of truth.
Availability provision: the central registry’s entire state can be queried at any time from anywhere (with a network connection).

The logical centralization of these kinds of methods provides an important, arguably more fundamental property: conflict resolution. The underlying consensus mechanism of these DID method implementations (e.g. the Bitcoin blockchain for did:btcr or the Ethereum blockchain for did:ethr) prevents any conflicting states from being propagated on the network. And if this sounds similar to a cryptocurrency, that’s because it is.

Blockchain consensus mechanisms were and are designed to maintain some kind of invariant about the current state of the chain. In Bitcoin and other currency chains, this invariant is the amount of existing coins: the mechanism is designed to solve the double-spend problem. It seemingly makes sense, then, to use this property of blockchains for identity. Or does it?

Interestingly, it actually does not. The invariant of single-spend leads to slow consensus time because it must be applied to the globally-shared state of the chain. In identity, however, we do not have a double-spend problem.

The invariant we care about in identity is different. It is only that an update to the state of the keys and other metadata associated with an identifier be valid. The crux here is that the state of accounts (i.e. identifiers) never overlap or depend upon each other.

So, where does KERI fit into this?

KERI stands for Key Event Receipt Infrastructure. It was initially conceived by Dr. Samuel Smith, distributed systems pioneer and founder of ProSapien, and is now being worked on and implemented under the Decentralized Identity Foundation (DIF). In a nutshell, KERI is a flexible, secure, minimal and privacy-preserving trustless DKMS.

In contrast to blockchain or central registry-based trust systems, KERI is based on a hash-chain data structure called a ‘key event receipt log’ (KERL). Conceptually, it’s similar in some ways to the Peer DID Method specification, except that its data model is a KERL rather than a DID document. And while KERI can be used as a DID method, it is fundamentally not reliant on any of the DID specifications and can be used in many other contexts as well. In particular, it is also useful for Internet of Things (IoT) networks and other security-conscious, low-resource use cases.

Figure: System design trade space (visual rework of the original graphics by Samuel Smith).

Exploring events

A KERL is composed of a list of key events and witness receipts – more on these later. A key event, however, is a transaction signed by the controller of the identifier and indicates an update in the state of the identifier’s keys or other semantics. There are a few basic kinds of key events.

Inception: this event signals the creation of an identifier, and it cryptographically binds some information to the identifier. The result of an inception event is an identifier with two sets of keys: one for signing, another for controlling. The signing key set can be used to authenticate as the identifier, while the control key set (which is hashed and not visible in the event) can be used to create rotation events.
Rotation: this event signals the change of signing and control keys for the identifier. KERI relies upon a rule called pre-rotation for post-quantum security (see the sketch after this list). Rotation events maintain this rule. Upon rotation, the control keys (used to sign the rotation event) become the new signing keys, while a new set of control keys is added. This ensures that no attacker can ever compromise the control keys because they are never used (or perhaps never even existed in the first place) until they sign the transaction which makes them no longer the control keys. A rotation event which has an empty set for the control keys is equivalent to abandoning the identifier (as no new control events are possible).
Interaction: these events simply contain a commitment to a piece of data. Within KERI they can be used by a delegator to commit to – and thus finalize – a delegated inception or rotation event.
Delegated inception: this event signals adding a new set of delegated keys to the identifier. Similar to the inception event, this new key set has an identifier created by cryptographically binding its inception data, and as such can be referenced by a “path” from the main identifier, e.g. identifier#delegated_identifier.
Delegated rotation: this event signals the rotation of a delegated key set, similarly to the rotation event. Much like the main identifier, the delegated identifier does not change when the underlying key set is rotated.
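To make pre-rotation concrete, here is a minimal sketch in Python of the commitment scheme that inception and rotation rely on. The field names, key labels and digest helper are invented for readability and do not follow the actual KERI event format:

import hashlib

def digest(data: str) -> str:
    # Stand-in for KERI's cryptographic digests
    return hashlib.sha3_256(data.encode()).hexdigest()

# Inception: publish the signing keys, but commit to the control
# (rotation) keys only as a digest; the keys themselves stay hidden.
inception = {
    "type": "inception",
    "sequence": 0,
    "signing_keys": ["keyA_public"],
    "next_keys_digest": digest("keyB_public"),
}

# Rotation: reveal keyB (it must hash to the earlier commitment),
# promote it to signing, and commit to the next hidden key set.
rotation = {
    "type": "rotation",
    "sequence": 1,
    "prior_event_digest": digest(str(inception)),
    "signing_keys": ["keyB_public"],
    "next_keys_digest": digest("keyC_public"),
}

# A validator checks the revealed key against the prior commitment:
assert digest(rotation["signing_keys"][0]) == inception["next_keys_digest"]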

State update rules

Along with these event types, there is a set of rules for validating a list of these events. The result of applying these rules is an identifier and a set of keys which are proven to be controlled by the identifier. In simple terms, these rules are:

1. There can only be one inception event, and it must come first.
2. Each rotation event must be signed by the control keys established by the previous events and must set the new signing keys to be the previous set of control keys.
3. Each delegated inception or rotation event must be signed by the signing keys established by the previous events.
4. All events must have a monotonically-increasing counter.
5. All events must contain the hash of the previous event.
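
As a rough sketch of how a validator might apply these rules, reusing the illustrative digest helper and the inception and rotation events from the sketch above (rules 2 and 3, the signature checks, are elided):

def validate_kel(events: list) -> bool:
    # Rule 1: exactly one inception event, and it must come first
    if not events or events[0]["type"] != "inception":
        return False
    if any(e["type"] == "inception" for e in events[1:]):
        return False
    for prev, curr in zip(events, events[1:]):
        # Rule 4: the counter must increase monotonically
        if curr["sequence"] != prev["sequence"] + 1:
            return False
        # Rule 5: each event must contain the hash of the previous one
        if curr["prior_event_digest"] != digest(str(prev)):
            return False
    return True

print(validate_kel([inception, rotation]))  # True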

These rules, when applied to the key events, provide the conflict resolution mechanism which enables KERI to work independently of a central registry. A key event log, or KEL (i.e. a log containing only events, with no receipts), which is being updated will reject an incorrect event. Once the first valid event is appended, all other candidate events for that place in the KEL are rendered invalid (i.e. their counter, backhash or some other semantic detail will be deemed incorrect).

The key event log validation rules provide a way to know the correct key state for an identifier, just as the consensus rules of the Bitcoin blockchain provide a way to know the correct balance of an account. In this sense, a KEL can be thought of as a “single-account blockchain” or micro-ledger. This is perfect for knowing a KEL is valid — but when presented with multiple different KELs for a single Identifier, how can one know which is the “true” KEL?

Witnesses

The witness system is the conflict resolution mechanism for KERLs (remember, a KERL is a KEL plus witness receipts), the same way that the validation rules serve as the conflict resolution mechanism for key events. In an inception event, the creator of the event can list several “witness” identifiers. Upon creation of a key event (every kind, not just inception), the entity must present the event to its witnesses and in return the witness gives a signature of that event. These signatures are compiled into a witness receipt for each event in the KEL. In the case of divergent conflicting KELs being detected, a validator can collect the KEL and the witness receipts into a KERL and prove which is correct via an algorithm called KAACE (KERI’s Agreement Algorithm for Control Establishment, see section 11 of the KERI white paper for more details).

The security mechanism here is subtle but very effective:

The witnesses are cryptographically bound to the identifier in the inception event.
The witnesses are assumed to be trusted (or at least not known to be malicious) by the creator of the identifier and give signatures which cannot be repudiated.
Any attacker must compromise both the control keys of the identifier (very difficult already because of pre-rotation) and all of the signing keys used by the witnesses.
If there is more than one event, they will have to compromise the entire key history of each current and previous witness and the identifier. This is much more difficult than, say, compromising the single key pair used to secure a Bitcoin account.

With these security properties, a KERI-based identifier is at least as hard to compromise as most blockchain accounts, for which an attack is equivalent to compromising a single private key. What’s more, a KERI-based identifier is more difficult to attack: doing so is equivalent to reversing several hash functions and compromising an arbitrary number of private keys held by different parties.

This makes KERI an incredibly flexible and powerful way of managing identifiers and keys without recourse to, reliance on, or vulnerability through any centralized or decentralized service or infrastructure.

KERI can even act as a bridging account, whereby key pairs for different ledger networks can be managed and linked under a single Identifier, due to its cryptographic flexibility. This is because KERI currently supports five signature schemes and can be extended to add more.

Ultimately KERI’s core value propositions are:

Enforced pre-rotation: ensures that the recovery keys are never exposed prior to use, so that even an attacker with a quantum computer cannot crack the recovery key set.
Ambient verifiability: events being verifiable without reliance on a global state which must be kept synchronised allows KERI to be effective at any scale and in any deployment environment, networked or not.
Flexible trust structures: the two-layer multisig combined with the rotatable witness set provides support for any imaginable arrangement of trust, or none at all.

‧ ‧ ‧

KERI is currently being worked on in the Decentralized Identity Foundation’s Identifier and Discovery Working Group, co-chaired by Dr Sam Smith who originated KERI and contributed to this article via his ideas and feedback. There are three community-driven implementations in progress, in Python, Rust and JavaScript. Interested parties are actively encouraged to participate.

The post How KERI tackles the problem of trust appeared first on Jolocom.


Forgerock Blog

E-Voting Is the Future: Busting Myths and Objections

Demand for Online Voter Registration and Voting Doubles During the Pandemic

The COVID-19 pandemic made people – perhaps for the first time – consider whether it is really worth it to leave their homes for many activities. Is picking the right apple at the grocery store worth the risk? As we near the U.S. presidential election, this same question remains, although the stakes are much higher. Constituents shouldn’t have to agonize over choosing between their health and casting a vote, because there’s a better solution. It’s time for the U.S. to bring e-voting to the American people.

So what’s stopping the U.S. from implementing e-voting? Creating a national system of voting online is very much within the realm of possibility today. We have all the tools and technologies at our disposal. Below, we address common myths and objections. 

Myth #1: People don’t want to e-vote.

As the world remains in the grips of a global pandemic, people’s preferences towards digital activities are changing rapidly across industries. In fact, findings from ForgeRock’s New Normal Report show consumer preferences for online voting doubled across all regions. Almost two-thirds of consumers prefer to register to vote online as well. Inarguably, most voters want a modern and secure way to cast their ballot, which means the end of the paper ballot’s exclusivity is likely near. 

At ForgeRock, we believe that digital identity has a huge part to play in this. Digital identity technology can be essential in securing registration, user identification, and authentication – all key steps in ensuring a trustworthy and accurate vote count.

Myth #2: E-voting will enable voter fraud.  

Today, verification of votes is utterly archaic. It relies on polling volunteers to compare signatures on voter cards, which seems absurdly low tech, given the digital world we live in. That said, it is challenging to compromise in-person voting at scale. Fraudsters would have to send pretend “voters” one by one to the polls to pass off the false votes – and that’s a federal felony. For the risk, the reward just isn’t there.  

E-voting, on the other hand, would introduce a much stronger root of trust than we have in the existing voting system. Identity verification technologies, which are widely available on the market today, can quickly validate that people are who they say they are. They use a variety of methods that are much stronger than today’s simple signature match. Technologies such as biometrics, device reputation, behavioral signals, and other digital identity capabilities offer a much more accurate validation of a voter’s identity and help avoid widespread voter fraud. These digital identity technologies would transparently put every voter through multiple layers of validation, providing much greater security without adding friction to the voter experience. The Real ID system that has now been adopted in all 50 U.S. states is one step toward a minimum standard of identity information. Real ID state licenses are required to provide a core set of security and validation features that make state licenses a very strong level of identity validation.

Myth #3: E-voting will create a new attack vector for hackers and invalidate election results. 

One of the reasons that the U.S. voting system is resistant to manipulation today is its decentralized nature. Town to town, state to state, voting methods vary. To create an e-voting system that is resistant to an external digital attack, it, too, must be distributed or decentralized. Blockchain, which is already being utilized for online voting in several countries, is one technology that could be critical. Containerizing the voting information, utilizing encryption, rotating keys, and leveraging distributed ledger technology make individual votes difficult to manipulate, so hackers would face a challenge similar to paper ballots: accessing a single vote takes so much effort that influencing the larger voting pool becomes impractical. 

Myth #4: There is no way to maintain anonymity in voting digitally. 

Identity and access management (IAM) solutions are used every day by the biggest brands, which need to balance both privacy and data integrity. A decentralized, blockchain-based recording of votes could be held as an unchangeable backup, similar to the paper backup approach used today, while the information aggregated and shared outside that blockchain removes personally identifiable information (PII). 

Digital voting would likely need to be decoupled into several steps to maintain security and anonymity simultaneously. A user would need to be strongly authenticated. A record that an individual voted would need to be stored in an immutable way that can’t be linked to their actual vote. The individual’s vote would need to be deposited in a different immutable system so that votes can be easily counted and never changed. Maintaining this strict separation ensures that the vote can’t be traced back and linked to the individual who cast it. 
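As a purely illustrative sketch of that separation in Python (not ForgeRock's design; the in-memory stores below stand in for the immutable systems described above):

import uuid

voted_registry = set()  # records THAT a person voted, nothing more
ballot_box = []         # separate store of anonymous ballots

def cast_vote(authenticated_voter_id: str, choice: str) -> str:
    # The voter is assumed to have been strongly authenticated already.
    if authenticated_voter_id in voted_registry:
        raise ValueError("already voted")
    voted_registry.add(authenticated_voter_id)
    # The ballot carries only a random receipt, never the voter id,
    # so the vote cannot be traced back to the person who cast it.
    receipt = uuid.uuid4().hex
    ballot_box.append({"receipt": receipt, "choice": choice})
    return receipt

cast_vote("voter-123", "candidate-A")
print(len(voted_registry), len(ballot_box))  # 1 1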

Myth #5: Online voting will disenfranchise those who do not have access to smartphones or computers. 

With voters at risk of exposing themselves to COVID-19, or the potential risk of future pandemics increasing the need for voting alternatives, e-voting should replace in-person voting. While the vast majority would benefit, there would still be measures in place for those who can’t cast an e-vote. New approaches should be introduced to ensure that no one is left behind in the voting process, but providing an e-voting option quickly will give U.S. citizens an opportunity to balance the risks they face between health and patriotism without having to stand in historically long lines, which have already become an issue in the 2020 election. 

As we move towards an e-voting future, the disenfranchised cannot be left behind. Rather, our focus should be on re-enfranchising these communities while ensuring alternative solutions are in place so that every voice is heard and the digital divide doesn’t become the civic divide. These efforts will be well worth the investment in the end. 

At ForgeRock, we have a big hammer in our ForgeRock Identity Platform, but voting is a nail-shaped problem that is rooted in identity. The capability to securely identify a person, anonymize and secure their session, and then record their vote is, at its core, digital identity. While identity alone can’t make e-voting viable, as many other factors will contribute to its ultimate success, e-voting can’t exist without identity. To create a safe option for citizens to exercise their right and responsibility to vote, the U.S. must make e-voting a reality, starting with the 2024 election. The good news is that the technology is already here.  

To learn more about how consumer preferences are changing, check out the ForgeRock Consumer Survey: The New Normal

 


Otaka

What's New in Laravel 8

With Laravel 8’s release in September 2020, the popular PHP framework continues to offer new features and improvements. After version 5, Laravel moved to semantic versioning and the more frequent releases have meant smaller changes between each one. That said, there are still several exciting updates in this version of the framework.

While Laravel will continue to offer security fixes for version 7 until early 2021, no more bug fixes will be released after October 2020, so you should upgrade to Laravel 8 as soon as possible. In this article, you’ll see all the new features and changes released in Laravel 8. After going through the new features, I’ll show you how to add authentication to your Laravel 8 application using Okta.

Jetstream

The biggest new feature available in Laravel 8 is an application scaffolding tool called Jetstream. Long-time Laravel users are probably familiar with Laravel Spark, which adds authentication, billing, teams, and improved security options to Laravel for a small fee. Jetstream now offers almost all of Spark’s features (without the billing) for free.

While Jetstream won’t help you much if you’re upgrading an existing Laravel app, as you’ve probably already built the features you need, it will accelerate the process of building new applications. Developers who take advantage of Jetstream’s features won’t have to build user profiles, change password flows, API token generation, or team account creation from scratch. You can also integrate Jetstream into third-party authentication providers like Okta using its various authentication hooks.

That said, Jetstream isn’t going to be right for everyone. It’s opinionated: while it gives you two options for scaffolding your frontend code (Livewire or Inertia.js), you won’t get much out of it if you’re already committed to another popular frontend framework like React or Angular. Jetstream also relies on Tailwind UI, so users of Bootstrap or other styling libraries will have a lot of work to customize all the CSS elements.

Depending on your application workflow and priorities, Jetstream could save you a ton of time. You can also publish the package’s files and edit them, so it’s possible to customize everything it does to suit your needs.

Migration Schema Dumps

If you’ve worked on a Laravel application for a long time, you might have dozens or hundreds of database migration files in your project. Typically, you’ll only run the newest ones each time but, when a new developer joins your team or you want to refresh your database tables, you’ll have to run all those migrations in sequence again.

Laravel 8’s new schema:dump command fixes this problem. After you run the Artisan command, your existing migrations will be “squashed” and saved to a single SQL file. You can opt to set the --prune flag, which will also remove the original migration files from your project. Next time you run all your migrations, Laravel will just run the SQL file, followed by any newer migrations you’ve added since the squash.

To see the schema dump in action, create a new Laravel 8 project and connect a MySQL or Postgres database. Laravel comes with a few default migrations, so once you configure your database, you can run the following:

php artisan schema:dump --prune

You’ll see that Laravel has deleted your migrations in the ./database/migrations directory and created a single SQL file in the ./database/schema directory. Now, you can run all your migrations again, and Laravel will use the SQL file:

php artisan migrate:fresh

Note that Laravel 8’s migration schema dumps only work when using a SQL-based database like MySQL or PostgreSQL. NoSQL databases like MongoDB can’t use this feature, and it doesn’t work for SQLite yet either.

Class-based Factories

Laravel has removed model factory functions in favor of class-based model factories. This means that you can create an instance of a model for testing or seeding purposes using the new factory() method. For example, the following code will create five users and save them to your database:

User::factory()->count(5)->create();

Faker is always available to factory classes, so it’s really easy to generate nice-looking test data. If you need legacy support for factory functions, Laravel released a package that you can use to maintain the old method until you upgrade your code.

Rate Limiting Improvements

Before Laravel 8, the best way to add rate-limiting to your application was to use the throttle middleware. You could customize this middleware by extending it or creating your own class, but it wasn’t easy to do.

In version 8, Laravel added a new method to the RouteServiceProvider called configureRateLimiting(). Here you can use Laravel’s new RateLimiter facade to implement custom logic around rate limiting. For example, you could allow admins to make unlimited API requests while other users are limited to 60 requests per minute:

...
protected function configureRateLimiting()
{
    RateLimiter::for('api', function (Request $request) {
        return $request->user()->isAdmin()
            ? Limit::none()
            : Limit::perMinute(60);
    });
}
...

Support for this kind of complex rate-limiting logic can be especially compelling for API-based Laravel applications.

Improved Maintenance Mode Options

Developers typically put their Laravel apps into maintenance mode while running tasks like upgrading Composer packages or database migrations. In previous versions of Laravel, developers could use their IP address to bypass maintenance mode, but in Laravel 8, this method has been replaced with URL-based tokens.

For example, put your application into maintenance mode using the following command:

php artisan down --secret="12345"

Users won’t be able to access the application unless they navigate to <YOUR_APP_URL>/12345. If they do this, they can bypass maintenance mode and see the application. This allows you to share a link with other developers or stakeholders who might need to bypass maintenance mode.

Another problem with maintenance mode in previous versions of Laravel was that it depended on Laravel being in a working state. In other words, if your composer install command broke your Laravel installation, the maintenance page would be broken too.

To get around this, Laravel 8 added an option to prerender a specific view that users will see while your app is in maintenance mode. For example, if you want to show the default Laravel 503 error page during maintenance mode, you can run the following:

php artisan down --render="errors::503"

This feature ensures that maintenance mode is more robust. While you don’t want to spend too much time in maintenance mode, you want it to work.

Time Traveling Tests

Testing time-based code is always tricky. Applications that rely on time differences relative to now() will have difficulty testing their logic. Fortunately, Laravel 8 includes a new time manipulation feature that allows you to change the application’s perceived time during testing.

For example, you might have a method on your User model that returns true when a user’s account is more than 90 days old:

...
public function isExperienced()
{
    return $this->created_at < Carbon::now()->subDays(90);
}
...

To test this, you can write a test that uses the travel() method:

...
public function testUserIsExperienced()
{
    $user = User::inRandomOrder()->first();

    $this->travel(91)->days();
    $this->assertTrue($user->isExperienced());

    $this->travelBack();
    $this->assertFalse($user->isExperienced());
}
...

This feature dramatically improves your ability to test time-based code and catch edge cases.

Other Improvements

In addition to the significant new features outlined above, Laravel 8 also includes many relatively small improvements you can read more about in the version 8 release notes. For example:

Laravel now puts models into a new directory (called Models) by default.
Tailwind is being used for more of Laravel’s default styling, including pagination.
You can now batch background jobs using the Bus::batch() method.
The php artisan serve command now reloads your app whenever you update your .env file.
Event listening closures can now be run in the background using the queueable() function.

Adding Authentication to a Laravel 8 Application with Okta

If you’ve added authentication to your Laravel application in previous versions, you’ll notice some differences when using Laravel 8. The most significant change is that Laravel’s authentication UI code has been moved to a separate package, so you’ll need to import either this package or Jetstream in addition to the Socialite package.

In the remainder of this article, I’ll walk you through setting up a new Laravel 8 application using Socialite with Okta as your Authentication provider. By the end of this section, you’ll be able to log into your Laravel 8 application using Okta.

Prerequisites: Please ensure you have already installed PHP and Composer. This tutorial assumes you have already created a new Laravel PHP application and have signed up for a free Okta developer account.

Setting Up Your Okta Application

First, log into or create a new Okta account. From the Applications page, click “Add Application” to start the creation process.

Select “Web Application” from the Platform menu.

Give your application a name, enter http://localhost:8000/ as the Base URI, http://localhost:8000/login/okta/callback as the Login redirect URI, and http://localhost:8000 as the Logout redirect URI. This will ensure that the Okta API recognizes and allows requests from your local Laravel application.

Click Done and copy the Client ID and Client secret shown on your app’s settings page. Finally, go to API > Authorization Servers in the Okta admin and copy the Issuer URI (without the /oauth2/default part). You will use this as your OKTA_BASE_URL in the next section.

You’re done setting up your Okta application. The rest of this tutorial assumes you have an existing Laravel application without authentication set up. If you don’t have a Laravel application yet, refer to the installation instructions here.

Configuring Okta in Laravel

Next, you need to install the Socialite package, Okta Socialite provider, and the Laravel UI package using composer:

composer require laravel/socialite socialiteproviders/okta laravel/ui

With the packages installed, you need to register them and your Okta credentials in your Laravel application. Add your Okta Client ID, Client Secret, Base URL, and Redirect URI to your Laravel application’s .env file:

OKTA_CLIENT_ID=**********
OKTA_CLIENT_SECRET=************
OKTA_BASE_URL=https://*****.okta.com
OKTA_REDIRECT_URI=http://localhost:8000/login/okta/callback

You’ll use these environment variables in your application’s configuration. Open up the config/services.php file and add a new array for Okta:

...
'okta' => [
    'client_id' => env('OKTA_CLIENT_ID'),
    'client_secret' => env('OKTA_CLIENT_SECRET'),
    'redirect' => env('OKTA_REDIRECT_URI'),
    'base_url' => env('OKTA_BASE_URL'),
],
...

You also need to register Socialite with Laravel. Add Socialite’s service provider to the $providers array in your config/app.php file:

...
$providers = [
    ...
    SocialiteProviders\Manager\ServiceProvider::class,
    ...
]
...

Finally, you need to make sure the Okta provider knows when Socialite is called to make the appropriate API calls. Open your app/Providers/EventServiceProvider.php file and add the following listener:

...
protected $listen = [
    ...
    \SocialiteProviders\Manager\SocialiteWasCalled::class => [
        'SocialiteProviders\\Okta\\OktaExtendSocialite@handle',
    ],
];
...

Your Okta application is now connected to Socialite and your Laravel application, but you need to update your user model and database migrations before you can test the login flow.

Update the User Model and Migrations

By default, Laravel creates a User model and database table with a password field and a database table for password resets. You won’t need these when you switch to Okta, so you can remove them if you’ve already created them. If not, you can simply remove the CreatePasswordResetsTable migration and update your CreateUsersTable:

<?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreateUsersTable extends Migration
{
    /**
     * Run the migrations.
     *
     * @return void
     */
    public function up()
    {
        Schema::create('users', function (Blueprint $table) {
            $table->id();
            $table->string('name');
            $table->string('email')->unique();
            $table->text('token');
            $table->timestamps();
        });
    }

    /**
     * Reverse the migrations.
     *
     * @return void
     */
    public function down()
    {
        Schema::dropIfExists('users');
    }
}

Run the migrations from your command line using Artisan:

php artisan migrate

Next, update the User model to reflect these changes. Open app/Models/User.php and update the $fillable property:

... protected $fillable = ['email', 'name', 'token']; ...

This ensures that Laravel can write to the token column when a user signs in with Okta.

You can also remove the $hidden and $casts arrays as the password, remember_token, and email_verified_at fields are no longer used. Your User model and database table are now ready to connect to Okta for authentication. The next step is to update your routes and login controller.

Adding the Okta Authentication Routes

When you created your Okta application, you set a callback URL. After a user logs in, Okta will redirect them to this callback URL with a token, so your application needs to save that token and (if not already created) the user. You also need a route that directs users to Okta to login.

Open your routes/web.php file and add the following:

...
Route::get('/login/okta', 'App\Http\Controllers\Auth\LoginController@redirectToProvider')->name('login-okta');
Route::get('/login/okta/callback', 'App\Http\Controllers\Auth\LoginController@handleProviderCallback');

Now that the routes are set up, you need to update the LoginController to handle these new methods. Assuming this is a new Laravel application without authentication installed yet, you need to run the Artisan command to generate the authentication scaffolding. This will publish the authentication controllers and view files so you can edit them:

php artisan ui bootstrap --auth

Next, open the app/Http/Controllers/Auth/LoginController.php file and replace it with the following:

<?php

namespace App\Http\Controllers\Auth;

use App\Http\Controllers\Controller;
use App\Models\User;
use App\Providers\RouteServiceProvider;
use Illuminate\Foundation\Auth\AuthenticatesUsers;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Auth;
use Laravel\Socialite\Facades\Socialite;

class LoginController extends Controller
{
    use AuthenticatesUsers;

    /**
     * Where to redirect users after login.
     *
     * @var string
     */
    protected $redirectTo = RouteServiceProvider::HOME;

    /**
     * Create a new controller instance.
     *
     * @return void
     */
    public function __construct()
    {
        $this->middleware('guest')->except('logout');
    }

    /**
     * Redirect the user to the Okta authentication page.
     *
     * @return \Illuminate\Http\Response
     */
    public function redirectToProvider()
    {
        return Socialite::driver('okta')->redirect();
    }

    /**
     * Obtain the user information from Okta.
     *
     * @return \Illuminate\Http\Response
     */
    public function handleProviderCallback(Request $request)
    {
        $user = Socialite::driver('okta')->user();

        $localUser = User::where('email', $user->email)->first();

        // Create a local user with the email and token from Okta
        if (!$localUser) {
            $localUser = User::create([
                'email' => $user->email,
                'name' => $user->name,
                'token' => $user->token,
            ]);
        } else {
            // If the user already exists, just update the token:
            $localUser->token = $user->token;
            $localUser->save();
        }

        try {
            Auth::login($localUser);
        } catch (\Throwable $e) {
            return redirect('/login/okta');
        }

        return redirect('/home');
    }
}

The redirectToProvider() method sends users to Okta to enter their login credentials, and the handleProviderCallback() method saves the token returned by Okta to the user’s account. It can also create new users or log them in. Your Laravel application is almost ready to authenticate users, but the last step is to update the user interface login link.

Updating the User Interface

Before you can test your new authentication flow, update the login link to point to the new Okta route. Open your Laravel application’s resources/views/welcome.blade.php file, and find the line containing @if (Route::has('login')).

Replace the entire @if block with the following:

...
@if (Route::has('login-okta'))
    <div class="hidden fixed top-0 right-0 px-6 py-4 sm:block">
        @auth
            <a href="{{ url('/home') }}" class="text-sm text-gray-700 underline">Home</a>
        @else
            <a href="{{ route('login-okta') }}" class="text-sm text-gray-700 underline">Login</a>
        @endif
    </div>
@endif
...

Install the frontend packages and run Laravel’s dev build command to create the necessary CSS files:

npm i && npm run dev

To test the entire authentication flow out, start the local development server:

php artisan serve

Visit http://localhost:8000/ in your browser. Click the “Login” link and enter your email and password. You should be taken to your dashboard.

The complete source code for this project is available on GitHub.

Learn More

In this post, you’ve seen all the major new features released in Laravel 8. Many of these new features will impact how you build Laravel apps in the future. While you probably won’t use them all immediately, it’s helpful to keep an eye on where the framework is progressing. Finally, setting up authentication in a new Laravel application has changed in the past two versions. You’ve also seen the most current way to add Okta as an authentication provider for your Laravel 8 applications.

If you’d like to learn more about integrating Okta with your Laravel and PHP applications, be sure to check out some of these resources:

Build a Simple Laravel App with Authentication
Protecting a PHP API Using OAuth
Create and Verify JWTs in PHP with OAuth 2.0

If you like this blog post and want to see more like it, follow @oktadev on Twitter, subscribe to our YouTube channel, or follow us on LinkedIn. As always, please leave a comment below if you have any questions.

Wednesday, 14. October 2020

KuppingerCole

Policy-Based Access Control – Consistent Across the Enterprise

The evolution of cybersecurity protection demands a more nuanced response to providing access to a company’s sensitive resources. Policy-based access control (PBAC) combines identity attributes and context variables to enable sophisticated granting of access to corporate systems and protected resources, based on centrally managed policies that ensure consistent access control decisions across the enterprise. Advancements in both business requirements and technology (such as the growing use of microservices) require a better way to control access: one that is consistent across all silos, dynamic enough to react to changes in risk, and that gives application business owners better control.

PBAC facilitates the application of consistent policy across all applications that use the PBAC authorization service. Furthermore, policies are evaluated in real time against current attributes, rather than waiting for a nightly update of identity attributes before access control policy is correctly applied. PBAC also facilitates a risk management approach to access decisions. If access outside business hours represents a greater risk, the authorization service could prompt for an additional authentication factor before access is granted.
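
To make that model concrete, here is a minimal sketch in JavaScript of a policy decision combining an identity attribute (role) with context variables (time of day, step-up status). All attribute names here are invented for illustration; a real PBAC deployment expresses such rules in a centrally managed policy engine (for example in a policy language such as XACML or Rego), not in application code.

// Illustrative only: a tiny policy decision point.
function decideAccess(subject, resource, context) {
  const inBusinessHours = context.hour >= 9 && context.hour < 17;

  // Policy: managers may read finance reports; outside business hours,
  // access requires a step-up (additional) authentication factor.
  if (resource.type === "finance-report" && subject.role === "manager") {
    if (!inBusinessHours && !context.mfaSatisfied) {
      return { decision: "challenge", obligation: "require-mfa" };
    }
    return { decision: "permit" };
  }
  return { decision: "deny" };
}

// Evaluated in real time against current attributes:
console.log(decideAccess(
  { role: "manager" },
  { type: "finance-report" },
  { hour: 20, mfaSatisfied: false }
)); // -> { decision: "challenge", obligation: "require-mfa" }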




Evernym

SSI Roundup: October 2020

Below is a copy of our October 2020 newsletter, The SSI Roundup. To get the best SSI headlines, events, and resources sent straight to your inbox each month, subscribe below:   The Self-Sovereign Identity Roundup: October 2020 Welcome back to another SSI Roundup. Today, we’re sharing a few inspiring examples of just how far the […]

The post SSI Roundup: October 2020 appeared first on Evernym.


Ontology

How Being On The Road Will Feel Just Like Home With Ontology & Daimler Mobility

There are currently several trends shaping mobility industries around the globe. As a result, the mobility sector has reached a consensus that the future belongs to C.A.S.E.: Connected, Autonomous, Shared, and Electric.

Cars are undergoing a transition to become smart devices for drivers, just as feature phones were replaced worldwide by smartphones. In addition, a decline in car ownership and a surge in shared mobility services have resulted in a call for more user-centric, integrated mobility service offerings for drivers.

According to Yun Xi, Senior UI/UX designer at Daimler Mobility, “We want to enhance trust between different mobility service players. The question was how to provide customized services to end-users based on their authorized data. You need to identify a way to ensure the user’s data privacy and security. That’s challenging as a diverse range of third-party service providers is participating in the process. And all of them provide an integrated and personalized user experience throughout different mobility offerings.”

This is the reason why Daimler Mobility — an industry leader that envisions the automobile as a home on the road — chose to collaborate with Ontology. This partnership, capitalizing on the latter’s blockchain-based digital identity and data life-cycle management technology, aims to create a trustworthy, personalized, and secure experience for drivers. Customers should feel as comfortable and at ease as they would at home.

Drivers with access to the Welcome Home in-car system receive several unique benefits:

Your in-car and mobility service preferences are attached to your profile, not to any one particular vehicle, so you can turn any vehicle into your own, anytime, anywhere across the globe.
You can access third-party service providers integrated into the system with one-time verification, yet stay anonymous to all providers after the services are completed.
You authorize your data for service providers to deliver personalized services, yet you retain full transparency and control over all access and usage of your data, allowing you to stay anonymous.

For a full outline of the six key steps of the “Welcome Home” journey, consider the example of a user named Julia, who is traveling in Germany and needs to rent a car in Stuttgart.

To begin with, Julia would have gone through a one-time KYC (1) when she previously purchased her own automobile. Through Welcome Home, Julia would have bound her personal preferences from her own car with the platform, so that when she rents the car in Stuttgart, it would come tailored to her preferences (2).

Once inside the car, Julia receives third-party provider suggestions through Fleet2Share, a different platform hosted on a different blockchain. This is the primary value-add of Ontology’s partnership, since we offer cross-chain interoperability (3). The good news for Julia is that she doesn’t need to re-register or sign up for anything new, since all of her data is already stored on the Welcome Home platform. All she needs to do is grant authorization and access to the third-party app providers suggested by Fleet2Share. The authorization to access this data is managed on the blockchain, making it both immutable and transparent. Once the journey begins, the car can track data points including wear and tear, any accidents, and more minute details like how hard Julia presses the brakes versus the gas pedal.

At the end of her journey (4), Julia can unpair her profile settings from the car and choose to restrict her personal data from being accessed by third parties — while the blockchain still grants access to the car-related data points she previously allowed, such as the tread on the tires or the amount of charge the battery has left (5). And finally, Julia can choose to share this experience with her social circle, or grant access to her data — either way, it is her decision (6).

In summary, the 6 key steps of a user’s journey through the Welcome Home platform are:

1. User Onboarding & Data Access Control Setting
2. Varied Profiles & Data Management
3. Pairing Profile with Rental & Interoperability
4. Ending the rental and having access to your Data
5. Profile Sharing & trusted Data Sharing
6. Receiving your profile via Social deliveries

At the end of the day, Welcome Home is designed to be a user-centric application granting us full control over when, where, and with whom we share data. Built in partnership with Daimler Mobility, it is designed for the perfect mobility experience. We firmly believe that control should belong to the user, and this is achievable through our decentralized identity (DeID) protocol as well as our distributed data exchange framework (DDXF) — both powered by cryptographic algorithms on the blockchain.

Welcome Home is an excellent case of using blockchain to give us the perks and convenience of the digital economy while keeping our data private and secure. The approach also holds vast potential outside of mobility services: wherever we currently make this privacy trade-off, including on smartphones, in smart homes, and in smart cities.

Data sovereignty is what ultimately safeguards our freedom.

For blockchain enthusiasts: Ontology’s decentralized identity protocol used underneath Welcome Home is the first of its kind in the industry, in that one-click verification and service integration are interoperable for applications on different blockchains, making the solution truly scalable.

Check out the video demo of Welcome Home on Startup Autobahn EXPO Day powered by Daimler Mobility AG here.

Find Ontology elsewhere

Ontology website / Ontology GitHub / ONTO website / OWallet (GitHub)

Telegram (English) / Discord

Twitter / Reddit / Facebook / LinkedIn

How Being On The Road Will Feel Just Like Home With Ontology & Daimler Mobility was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyKey

Announcement: MYKEY Lab adds 1 billion KEY TOKEN to the “multiple chains exchange pool”

Due to demand for KEY TOKEN on multiple chains, MYKEY Lab locked 1 billion KEY TOKEN on Ethereum into the exchange pool on October 14: 0xc4947bf8c74033c7079f6780460e72e82a8df33c

Otaka

Spring Security SAML and Database Authentication

Spring Boot is a ubiquitous and well-supported suite of tools for developing web applications in Java. Database authentication, in which credentials identifying authorized users are stored in a database accessible by the application, is maybe the most common and straightforward method of authenticating users. SAML is a well-supported open standard for handling authentication between identity providers and service providers.

Configuring SAML authentication in Spring Security is a common topic, and examples are easy to come by. It’s also well documented, with straightforward configuration options available, as in this example from the Okta blog.

However, what if you want to combine both database and SAML authentication methods within the same Spring Boot application, so that a user can be authenticated either way? We will discuss and implement a solution in this tutorial!

Prerequisites

Java 11

Acknowledgment: Much of the groundwork for the implementation of SAML 2.0 authentication used in this project was developed by Vincenzo De Notaris and can be found in this project on GitHub. For this project, some changes have been made to support dual DB + SAML authentication and use Okta as the SAML identity provider rather than SSOCircle.

Table of Contents

SAML Authentication with Spring Security
Set Up Your Okta Account with SAML and Run the Application
How to Combine Database and SAML Authentication in Spring Boot
The SAML and Database Auth “Pre-Login” Page
Authenticate with SAML and Spring Security
Authenticate with a Database and Spring Security
Learn More About SAML and Okta

SAML Authentication with Spring Security

There are several benefits to using SAML to handle authentication for your application:

Loose coupling between your application and your authentication mechanism increases independence between the two, allowing for more rapid development and evolution of application logic, with less risk of regression
Shifts the responsibility of authentication, which involves storing and retrieving sensitive user information, to the identity provider (e.g., Okta), which almost always offers less risk since identity management is their business model
Allows for an improved user experience via Single Sign-On while navigating between multiple apps

Okta is a very well established identity provider with robust features and a wealth of support. Managing users, accounts, and permissions with Okta is simple and straightforward. Simultaneously, it is still flexible and extensible enough to support your application no matter how much it grows (even as it grows into several applications). And the friendly, growing community is available to answer any questions you may have!

You’ll need to create a forever-free Okta developer account to complete this tutorial. If you already have a developer account, you should complete this tutorial by switching to the Classic UI in the top-left corner.

Whether you need to support legacy systems or have unusual security requirements, you may need to allow users to authenticate using either SAML or database credentials. Combining SAML 2.0 with DB auth in Spring Boot is what we’ll tackle here!

Set Up Your Okta Account with SAML and Run the Application

Please complete the following ten steps to see a working example.

Step 1: Clone the okta-spring-security-saml-db-example repository:

git clone https://github.com/oktadeveloper/okta-spring-security-saml-db-example.git

Step 2: Sign up for a free developer account at https://developer.okta.com/signup. This is required to create SAML 2.0 applications in Okta.

Step 3: Log in to your Okta account at https://your-okta-domain.okta.com. If you see a developer dashboard like the screenshot below, click on Developer Console in the top left, and select Classic UI.

Click Admin.

Step 4: Create a new application via Admin > Applications > Add Application > Create New App with the following settings:

Platform: Web
Sign On Method: SAML 2.0

Click Create.

Enter an App name like Spring Boot DB/SAML (or whatever you’d like). Click Next.

Enter the following SAML Settings:

Single Sign-On URL: http://localhost:8080/saml/SSO
Use this for Recipient URL and Destination URL: YES
Audience URI: http://localhost:8080/saml/metadata

Click Next.

Select the following two options:

I’m an Okta customer adding an internal app
This is an internal app that we have created

Then, click Finish.

Step 5: Navigate to Assignments > Assign to People.

Step 6: Assign to your account with the custom username samluser@oktaauth.com.

Step 7: Navigate to Sign On and copy the following values to your /src/main/resources/application.properties file:

saml.metadataUrl — Right-click and copy the URL from the Identity Provider metadata link below the View Setup Instructions button.

saml.idp — Click the View Setup Instructions button and copy the value in (2) Identity Provider Issuer.

For example, here are the values I used:

saml.metadataUrl=https://dev-763344.okta.com/app/exk74c26UmANQ0ema5d5/sso/saml/metadata
saml.idp=http://www.okta.com/exkrmibtn3VG9S2Pa4x6

Step 8: Run your Spring Boot application in your IDE or via Maven:

mvn spring-boot:run

Step 9: Navigate to your application’s home page at http://localhost:8080.

Step 10: For database authentication, log in using dbuser@dbauth.com / oktaiscool.

You should see a success message saying you’re logged in.

For SAML authentication, sign in using samluser@oktaauth.com.

You should be prompted to select your identity provider.

Then, you should be redirected to the SAML Okta auth flow and returned to your application following successful authentication.

You’re done! You’ve successfully configured your project to support authentication via both the database and SAML 2.0! 🥳

It’s nice to see everything working, but what about the code that makes it happen? Keep reading for a walkthrough of the code and how it works.

How to Combine Database and SAML Authentication in Spring Boot

To get a better understanding of how DB and SAML auth are combined in this example, clone the repository for this tutorial if you have not already:

git clone https://github.com/oktadeveloper/okta-spring-security-saml-db-example.git

Open the project up in your favorite IDE or editor and take a look at the Maven POM file located at /pom.xml.

This application inherits from the spring-boot-starter-parent parent project. This provides you with Spring Boot’s dependency and plugin management:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.4.RELEASE</version>
</parent>

This project uses the following Spring Boot Starter dependencies:

spring-boot-starter-web provides support for building web applications
spring-boot-starter-security provides support for securing the application (e.g., Basic Auth, Form Login)
spring-boot-starter-data-jpa provides support for the Java Persistence API, which is used to communicate with the database for DB authentication
spring-boot-starter-thymeleaf provides support for the Thymeleaf templating engine, a simple and powerful way to create web pages for Spring Boot applications

The spring-security-saml2-core extension for Spring Boot provides the necessary SAML-related libraries. This extension depends on the opensaml library, which is contained in the Shibboleth repository and is added to the <repositories> block:

<repositories>
    <repository>
        <id>Shibboleth</id>
        <name>Shibboleth</name>
        <url>https://build.shibboleth.net/nexus/content/repositories/releases/</url>
    </repository>
</repositories>

<dependencies>
    ...
    <dependency>
        <groupId>org.springframework.security.extensions</groupId>
        <artifactId>spring-security-saml2-core</artifactId>
        <version>1.0.10.RELEASE</version>
    </dependency>
    ...
</dependencies>

The following dependencies also make life easier:

com.h2database:h2 to provide a simple in-memory database
org.projectlombok:lombok to reduce boilerplate code (e.g. getters, setters, toString())
nz.net.ultraq.thymeleaf:thymeleaf-layout-dialect, a useful add-on for formatting Thymeleaf templates

NOTE: Some IDEs have trouble digesting Lombok-ified code due to version and plugin incompatibilities. If you have difficulty compiling this project, consider removing this dependency and adding the missing boilerplate code, or just use Maven to build and run.

The SAML and Database Auth “Pre-Login” Page

You want to have an initial page in which a user enters their username for login. Depending on the username pattern, you either direct the user to a standard username-and-password page for authenticating against the database, or direct them to the SAML auth flow.

/src/main/resources/templates/index.html

<!doctype html>
<html
    lang="en"
    xmlns:th="http://www.thymeleaf.org"
    xmlns:layout="http://www.ultraq.net.nz/thymeleaf/layout"
    layout:decorate="~{layout}"
>
<body>
<section layout:fragment="content">
    <h6 class="border-bottom border-gray pb-2 mb-0">Please Log In:</h6>
    <div class="media text-muted pt-3">
        <form action="#" th:action="@{/pre-auth}" th:object="${username}" method="post">
            <p>Username: <input type="text" th:field="*{username}" /></p>
            <p><input type="submit" value="Submit" /></p>
        </form>
        <br/>
        <p th:text="${error}" style="color: red"></p>
    </div>
</section>
</body>
</html>

IndexController is the backend @Controller defined to serve this page and handle requests:

/src/main/java/com/okta/developer/controller/IndexController.java

package com.okta.developer.controller;

@Controller
public class IndexController {

    @GetMapping
    public String index(Model model) {
        model.addAttribute("username", new PreAuthUsername());
        return "index";
    }

    @PostMapping("/pre-auth")
    public String preAuth(@ModelAttribute PreAuthUsername username,
                          Model model,
                          RedirectAttributes redirectAttributes) {
        if (StringUtils.endsWithIgnoreCase(username.getUsername(), Constants.OKTA_USERNAME_SUFFIX)) {
            // redirect to SAML
            return "redirect:/doSaml";
        } else if (StringUtils.endsWithIgnoreCase(username.getUsername(), Constants.DB_USERNAME_SUFFIX)) {
            // redirect to DB/form login
            return "redirect:/form-login?username=" + username.getUsername();
        } else {
            redirectAttributes.addFlashAttribute("error", "Invalid Username");
            return "redirect:/";
        }
    }
}

Within IndexController, you are checking whether the username matches a particular pattern and redirecting accordingly.

Authenticate with SAML and Spring Security

The WebSecurityConfig class, which extends the WebSecurityConfigurerAdapter parent, defines much of the security settings, including:

The filter chains to handle SAML requests and responses
How and when to authenticate a user with either the database or SAML and Okta
Required permissions for URLs within the application
Logging out

When redirected to the /doSaml endpoint, the SAML flow is initiated by a custom authentication entry point defined in WebSecurityConfig.configure(HttpSecurity):

/src/main/java/com/okta/developer/config/WebSecurityConfig.java

package com.okta.developer.config;

@Configuration
@EnableWebSecurity
@EnableGlobalMethodSecurity(securedEnabled = true)
public class WebSecurityConfig extends WebSecurityConfigurerAdapter implements DisposableBean {
    ...
    @Autowired
    private SAMLEntryPoint samlEntryPoint;
    ...
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        ...
        http
            .httpBasic()
            .authenticationEntryPoint((request, response, authException) -> {
                if (request.getRequestURI().endsWith("doSaml")) {
                    samlEntryPoint.commence(request, response, authException);
                } else {
                    response.sendRedirect("/");
                }
            });
        ...
    }
}

Here you can see that if the requested URL ends with doSaml, the request is handled by the SAMLEntryPoint defined in your configuration. This redirects the user to authenticate via Okta and returns them to /doSaml upon completion. To handle this redirect, a controller is defined to send the user onward following a successful SAML auth:

/src/main/java/com/okta/developer/controller/SamlResponseController.java

package com.okta.developer.controller;

@Controller
public class SamlResponseController {

    @GetMapping(value = "/doSaml")
    public String handleSamlAuth() {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        LOGGER.info("doSaml auth result: {}", auth);
        if (auth != null) {
            return "redirect:/landing";
        } else {
            return "/";
        }
    }
}

At this point, the user should be successfully authenticated with the app!

Authenticate with a Database and Spring Security

If the username matches another pattern, the user is redirected to a standard-looking form login page:

/src/main/resources/templates/form-login.html

<!doctype html>
<html
    lang="en"
    xmlns:th="http://www.thymeleaf.org"
    xmlns:layout="http://www.ultraq.net.nz/thymeleaf/layout"
    layout:decorate="~{layout}"
>
<body>
<section layout:fragment="content">
    <h6 class="border-bottom border-gray pb-2 mb-0">Database Login:</h6>
    <div class="media text-muted pt-3">
        <form action="#" th:action="@{/form-login}" th:object="${credentials}" method="post">
            <p>Username: <input type="text" th:field="*{username}" /></p>
            <p>Password: <input type="password" th:field="*{password}" /></p>
            <p><input type="submit" value="Submit" /></p>
        </form>
        <br/>
        <p th:text="${error}" style="color: red"></p>
    </div>
</section>
</body>
</html>

The login submission is handled by a @Controller which calls on the AuthenticationManager built in WebSecurityConfig:

/src/main/java/com/okta/developer/config/WebSecurityConfig.java

package com.okta.developer.config;

@Configuration
@EnableWebSecurity
@EnableGlobalMethodSecurity(securedEnabled = true)
public class WebSecurityConfig extends WebSecurityConfigurerAdapter implements DisposableBean {
    ...
    @Override
    protected void configure(AuthenticationManagerBuilder auth) throws Exception {
        auth.authenticationProvider(dbAuthProvider);
        auth.authenticationProvider(samlAuthenticationProvider);
    }
}

DbAuthProvider is a custom component which performs standard DB authentication by checking the supplied password against a hashed copy in the database:

/src/main/java/com/okta/developer/auth/DbAuthProvider.java

package com.okta.developer.auth;

@Component
public class DbAuthProvider implements AuthenticationProvider {

    private final CombinedUserDetailsService combinedUserDetailsService;
    private final PasswordEncoder passwordEncoder;
    ...
    @Override
    public Authentication authenticate(Authentication authentication) throws AuthenticationException {
        if (!StringUtils.endsWithIgnoreCase(authentication.getPrincipal().toString(), Constants.DB_USERNAME_SUFFIX)) {
            // this user is not supported by DB authentication
            return null;
        }

        UserDetails user = combinedUserDetailsService.loadUserByUsername(authentication.getPrincipal().toString());
        String rawPw = authentication.getCredentials() == null ? null : authentication.getCredentials().toString();

        if (passwordEncoder.matches(rawPw, user.getPassword())) {
            LOGGER.warn("User successfully logged in: {}", user.getUsername());
            return new UsernamePasswordAuthenticationToken(user.getUsername(), rawPw, Collections.emptyList());
        } else {
            LOGGER.error("User failed to log in: {}", user.getUsername());
            throw new BadCredentialsException("Bad password");
        }
    }

    @Override
    public boolean supports(Class<?> aClass) {
        return aClass.isAssignableFrom(UsernamePasswordAuthenticationToken.class);
    }
}

The above class calls on CombinedUserDetailsService, another custom component, which provides an appropriate UserDetails object depending on whether the user is authenticated using the database or SAML, by implementing UserDetailsService and SAMLUserDetailsService respectively:

/src/main/java/com/okta/developer/auth/CombinedUserDetailsService.java

package com.okta.developer.auth;

@Service
public class CombinedUserDetailsService implements UserDetailsService, SAMLUserDetailsService {

    private final UserRepository userRepository;
    ...
    @Override
    public UserDetails loadUserByUsername(String s) throws UsernameNotFoundException {
        StoredUser storedUser = lookupUser(s);

        return new CustomUserDetails(
            AuthMethod.DATABASE,
            storedUser.getUsername(),
            storedUser.getPasswordHash(),
            new LinkedList<>());
    }

    @Override
    public Object loadUserBySAML(SAMLCredential credential) throws UsernameNotFoundException {
        LOGGER.info("Loading UserDetails by SAMLCredentials: {}", credential.getNameID());
        StoredUser storedUser = lookupUser(credential.getNameID().getValue());

        return new CustomUserDetails(
            AuthMethod.SAML,
            storedUser.getUsername(),
            storedUser.getPasswordHash(),
            new LinkedList<>());
    }

    private StoredUser lookupUser(String username) {
        LOGGER.info("Loading UserDetails by username: {}", username);
        Optional<StoredUser> user = userRepository.findByUsernameIgnoreCase(username);
        if (!user.isPresent()) {
            LOGGER.error("User not found in database: {}", user);
            throw new UsernameNotFoundException(username);
        }
        return user.get();
    }
}

The resulting @Controller to handle DB authentication looks like this:

/src/main/java/com/okta/developer/controller/DbLoginController.java

package com.okta.developer.controller;

@Controller
public class DbLoginController {

    private final AuthenticationManager authenticationManager;
    ...
    @GetMapping("/form-login")
    public String formLogin(@RequestParam(required = false) String username, Model model) {
        ...
    }

    @PostMapping("/form-login")
    public String doLogin(@ModelAttribute DbAuthCredentials credentials, RedirectAttributes redirectAttributes) {
        try {
            Authentication authentication = authenticationManager.authenticate(
                new UsernamePasswordAuthenticationToken(credentials.getUsername(), credentials.getPassword()));
            if (authentication.isAuthenticated()) {
                SecurityContextHolder.getContext().setAuthentication(authentication);
            } else {
                throw new Exception("Unauthenticated");
            }
            return "redirect:/landing";
        } catch (Exception e) {
            redirectAttributes.addFlashAttribute("error", "Login Failed");
            return "redirect:/form-login?username=" + credentials.getUsername();
        }
    }
}

When doLogin() is called via POST, the AuthenticationManager handles the username and password authentication and redirects the user if successful.

For ease of use, two users are defined: one for DB auth and one for SAML. Both are stored in the database, but only the DB user is authenticated against it:

/src/main/resources/data.sql

INSERT INTO user (ID, USERNAME, PASSWORD_HASH) VALUES
('17e3d83c-6e09-41b8-b4ee-b4b14cb8a797', 'dbuser@dbauth.com', '(bcrypted password)'), /*DB AUTH*/
('17e3d83c-6e09-41b8-b4ee-b4b14cb8a798', 'samluser@oktaauth.com', '(bcrypted password)'); /*SAML AUTH*/

Learn More About SAML and Okta

Much of the complexity of this project comes from the need to combine both database and SAML authentication in one app. Normally you would choose one or the other. If you want to use only SAML for authentication (which is a fine idea, especially using Okta), visit this blog post using the standard Spring SAML DSL extension to integrate with Okta and SAML to secure your application.

The source code used in this example is on GitHub.

See a good primer on how SAML works here: What is SAML and How Does it Work?

Check out some other articles on authentication in Spring Boot:

Use Spring Boot and MySQL to go Beyond Authentication
A Quick Guide to Spring Boot Login Options
Build a Web App with Spring Boot and Spring Security in 15 Minutes
Easy Single Sign-On with Spring Boot and OAuth 2.0

Please provide comments, questions, and any feedback in the comments section below.

Follow us on social media (Twitter, Facebook, LinkedIn) to know when we’ve posted more articles like this, and please subscribe to our YouTube channel for tutorials and screencasts!

We’re also streaming on Twitch, follow us to be notified when we’re live.

Tuesday, 13. October 2020

KuppingerCole

IAM Essentials: Identity Governance and Administration




2020 Is the Year of the Identity Management Revolution

2020 has been the year of dispersed workforces and working environments. The impact on existing infrastructure, strategies and legacy technology has been unprecedented. As a result, we have embarked on a revolution for Identity & Access Management and a mindset change for organizations big and small, global or local. Never before has IAM been more present and up front. In this new world, trusted identities need to be the starting point for all organizational strategies, driving organizational change.




Global ID

The GiD Report#130 — It’s no accident that Big Tech are monopolies

The GiD Report#130 — It’s no accident that Big Tech are monopolies

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

What we have for you this week:

Amazon, Apple, Facebook and Google are monopolies
Justice Department could charge Google this week
Microsoft allows app store users to choose their own payments
Americans have hated monopolies for centuries
How Facebook screwed up by trying to get TikTok banned
The fintech wave
Stuff happens

1. Guess what? Amazon, Apple, Facebook and Google are monopolies according to the House Judiciary Committee.

Photo: Suzy Hazelwood

That much is outlined in their 450-page report, which you can check out here.

“No surprises here,” Greg Kidd tweeted. (Also not a surprise if you’ve been following this newsletter.)

He added:

“It’s the world we live in now. It didn’t happen by accident. We need to change the way antitrust and anticompetition law works so that it works for rather than against innovation, inclusion and neutrality.”

Which is more or less spiritually aligned with the intentions of the report. These are some of the changes to antitrust law proposed — via Axios:

limiting companies’ ability to compete unfairly against third parties on their own platforms by either requiring online marketplaces to be independently run businesses or establishing rules for how such marketplaces can be organized;
blocking online platforms from giving themselves preferential treatment or playing favorites with other content providers;
requiring social networks to be interoperable so that people can communicate across platforms and carry their data over from one platform to another;
directing antitrust enforcers to assume that an acquisition by a dominant tech firm is anticompetitive unless proven otherwise; and
allowing news publishers to team up to negotiate against tech platforms looking to carry their content.

Of interest here, the 16-month investigation outlined specifically how Apple and Facebook achieved and continue to wield their monopoly power — via Axios:

Apple: The report says Apple exerts monopoly power over software distribution to more than half the mobile devices in the U.S. It accuses the company of exploiting rivals by levying commissions and fees and copying apps, and says Apple gives preference to its own apps and services.
Facebook: The social media network has monopoly power in the social networking space, the report finds, and takes a “copy, acquire, kill” approach to would-be rivals such as WhatsApp and Instagram, both of which it bought in the early 2010s.

First, we should just acknowledge that the conclusions of the report and various proposals are all an incredible validation of the work we’ve been doing here at GlobaliD the past few years. There will always be two ways to fix a systemic issue — from the top down and from the bottom up. Ideally, you have both happening in concert.

GlobaliD has been working on the bottom-up solution to the problem. Now the top-down approach is coming, and it’s great to see.

Ideally, regulations are timely but in practice, they are lagging indicators. Consumer preferences and demands are already changing.

(Anecdotal: My sister was telling me this morning that she doesn’t really use Instagram anymore or see as much value in it for her personally. And isn’t that how these things usually go? Long before lawmakers jump through all the hoops required to break up Facebook, the kids will have already found a new shiny toy.)

A bit of reality: This is really just the start of a potentially long and drawn-out process. We’re talking about the titans of industry, and they won’t go down without a fight. (They’ve all already disputed the findings of the report, denying that they are monopolies or violate antitrust law. Unsurprisingly, Big Tech thinks the competitive landscape is healthy.)

But if we do end up achieving the proposals of the report, it will be game changing — there’s no doubt about that.

This is politics, though, so you never know. That being said and given the way the current election buildup is panning out:

If Democrats take the White House and the Senate in November, they could use this report as a blueprint for longer-term legislative and enforcement changes to limit tech giants’ power.

Relevant:

Antitrust and Regulation Over Time | The Regulatory Review
The House Antitrust Report on Big Tech
Antitrust Isn’t the Solution to America’s Biggest Tech Problem
House Lawmakers Condemn Big Tech’s ‘Monopoly Power’ and Urge Their Breakups
The single, simple rule change that could force tech platforms to compete
Google, antitrust and how best to regulate big tech
Congress Gets Ready to Smash Big Tech Monopolies
House Panel to Seek Breakup of Tech Giants, GOP Member Says

2. This will be a long, drawn-out battle, but skirmishes already abound. The Justice Department could charge Google as early as this week. (Top down)

That would amount to the biggest such action against a U.S. tech company in 20 years (since Microsoft).

It could mean forcing Google to sell off Chrome, reports Politico.

A bit of context: This particular action is coming from the Trump administration. The point? This is truly a bipartisan issue.

3. Microsoft’s app store now lets users choose their own payments systems. (Bottom up)

That’s part of 10 app store principles promoting “choice, fairness and innovation” the company just published:

1. Developers will have the freedom to choose whether to distribute their apps for Windows through our app store.
2. We will not block competing app stores on Windows.
3. We will not block an app from Windows based on a developer’s business model or how it delivers content and services, including whether content is installed on a device or streamed from the cloud.
4. We will not block an app from Windows based on a developer’s choice of which payment system to use for processing purchases made in its app.
5. We will give developers timely access to information about the interoperability interfaces we use on Windows, as set forth in our Interoperability Principles.
6. Every developer will have access to our app store as long as it meets objective standards and requirements, including those for security, privacy, quality, content and digital safety.
7. Our app store will charge reasonable fees that reflect the competition we face from other app stores on Windows and will not force a developer to sell within its app anything it doesn’t want to sell.
8. Our app store will not prevent developers from communicating directly with their users through their apps for legitimate business purposes.
9. Our app store will hold our own apps to the same standards to which it holds competing apps. Microsoft will not use any non-public information or data from its app store about a developer’s app to compete with it.
10. Our app store will be transparent about its rules and policies and opportunities for promotion and marketing, apply these consistently and objectively, provide notice of changes and make available a fair process to resolve disputes.

OK, sure, this is mostly a symbolic move (given Microsoft’s market share in the space) in order to get ahead of the news and to take some well-timed jabs at Microsoft’s rivals.

But it also gives you a sense of how companies in general are reacting to the new vibes of the times.

4. A bit of history. TL;DR — Americans have hated monopolies for centuries.

A great longread from Ben Thompson’s Stratechery — essentially a historical take on the American view of monopolies (and in turn antitrust).

First, here’s Ben quoting William Letwin in Law and Economic Policy in America: The Evolution of the Sherman Antitrust Act:

Hatred of monopoly is one of the oldest American political habits and like most profound traditions, it consisted of an essentially permanent idea expressed differently at different times. “Monopoly”, as the word was used in America, meant at first a special legal privilege granted by the state; later it came more often to mean exclusive control that a few persons achieved by their own efforts; but it always meant some sort of unjustified power, especially one that raised obstacles to equality of opportunity.

And boy does that hatred go way back. Here’s a quick teaser from Ben:

As Letwin notes, American distrust of monopolies had its roots in England and 1624’s Statute of Monopolies, which significantly constrained the ability of the King to grant exclusive privilege; colonial and state legislatures similarly passed laws restricting grants of exclusive power by governments, and while the Bill of Rights did not have an anti-monopoly provision (contra Thomas Jefferson’s wishes), one of the most divisive political questions for the first several decades of the United States was over the existence (or not) of a national bank, in large part because it was a government-granted monopoly.

Check out Ben’s full piece: Anti-monopoly vs. Antitrust

5. BONUS: How Facebook screwed up by trying to get TikTok banned

The NYTimes tech newsletter:

Instagram’s boss had a message this week for the White House and the world: It was counterproductive for the United States to try to ban TikTok, the popular video app from China.
It’s bad for U.S. tech companies and people in the United States, Adam Mosseri, the head of Instagram, told Axios, if other countries take similar steps against technology from beyond their borders — including Facebook and its Instagram app. (He and Mark Zuckerberg have said this before, too.) “It’s really going to be problematic if we end up banning TikTok and we set a precedent for more countries to ban more apps,” he said.
Mosseri has a point. What he didn’t say, though, was that Facebook has itself partly to blame. The company helped fan the fears about TikTok that Facebook is now worried will blow back on the company. This is bonkers.
Facebook complaining about a bad policy that Facebook helped initiate might seem like an eye-rolling joke, but it’s more than that. It’s the latest evidence that the company’s executives are incapable of foresight. Facebook not predicting how its own actions might cause harm later on is partly why we have sprawling conspiracies and autocrats harassing their own citizens.
They weren’t wrong. There are reasons to be worried about TikTok and other Chinese technology operating in the United States. But I don’t believe Facebook was bringing up these concerns out of principled commitment to American values. What Facebook was doing was pure short-term self-interest.

That sounds about right.

6. BONUS: The fintech wave:

7. Stuff happens:

Via /vs: Venmo Launches Credit Card Featuring User QR Codes | PYMNTS.com
Via /vs: Class Action: FinTech Middleman Plaid Uses App Login Credentials to Secretly Harvest Private Financial Data
Regulator fines Citigroup $400 million for management failure

The GiD Report#130 — It’s no accident that Big Tech are monopolies was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Safeguarding Your Most Valuable Data: Five Key Criteria to Assess Cloud Provider Security

by Alexei Balaganski

This whitepaper focuses on defining the key security-focused selection criteria to help your company choose a secure platform for current and future cloud projects.


PingTalk

The SSO Practitioner's Introduction to Decentralized Identity

Decentralized identity—also known as self-sovereign identity—is earning a reputation as a silver bullet that can solve all of today’s identity problems. It promises to ensure perfect privacy, informed consent, user independence and control, and the ability to leverage the latest technology and cryptography.

But for those of us who’ve been in the identity industry for some time, these promises might seem far-fetched. We’ve seen too many other purported magical solutions rise and fall over our careers, and evaluating decentralized identity’s true potential is no easy task. Complicating matters is the fact that if you’re coming from a single sign-on (SSO) background of SAML, OAuth or OpenID Connect, you have likely encountered an entirely new glossary of unfamiliar terminology, often conflicting technical descriptions and a distinct lack of actionable standards.

That’s not really too surprising for new technology efforts, though it does make it more challenging to follow. So we thought we’d offer a friendly introduction—mapping concepts you’re already familiar with to the new ones, explaining where there’s a lot of overlap, highlighting some of the differences and demonstrating where decentralized identity is likely to go—without any hype.

Monday, 12. October 2020

Otaka

Build a Modern API using Fastify and Node.js

Fastify is just as the name implies: fast. Not just in terms of development speed—its low overhead means the server is fast as well. When writing APIs, speed on both sides is paramount. Fastify is a web framework for Node.js that was designed for efficiency. Fastify is fully extensible with hooks, plugins, and decorators. It is schema-based, meaning you can define your response and request objects in your routes and have Fastify do the data validation for you. This feature isn’t mandatory, but it is useful. On the request side, it adds easy data validation. On the response side, this means you can shape your data for even lower overhead by not pushing down extraneous data to the client. Fastify is also TypeScript friendly. However, in this tutorial you will use JavaScript.
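
To see the response-shaping idea in isolation, here is a minimal sketch (not part of the tutorial’s code; it assumes the fastify package you will install later in this post). The response schema lists only name, so Fastify’s serializer drops every other field before it reaches the client:

"use strict";

// Minimal sketch: the 200 response schema whitelists `name`, so extra
// fields (like `internalId`) are stripped from the serialized payload.
const fastify = require("fastify")();

fastify.get("/hello", {
  schema: {
    response: {
      200: { type: "object", properties: { name: { type: "string" } } }
    }
  }
}, async () => ({ name: "Ada", internalId: 42 })); // internalId never reaches the client

fastify.listen(3001);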

For this tutorial, you will create a secure API that returns some information regarding employees. To authenticate users, you will use Okta as an authentication server which will produce a JSON Web Token (JWT) after authenticating the user. The client will then send that JWT as part of the request to the server which will handle the validation logic. You will use Okta’s JWT verifier to quickly set up the authentication on your server.

Create your Okta application

The first thing you will need to do is create an application in Okta to act as your authentication server. This highlights the simplicity and streamlined process Okta authentication provides. Navigate to your Okta Developer Console and click Applications. Next, click Add Application. For application type, select Service (Machine-to-Machine). Give your application a meaningful name and click Done. Once your application is set up, you will be presented with a Client ID and a Client secret. Make note of these, as you will need them in your web application.

Next, click on API in the header and then navigate to Authorization Servers. Okta will add an authorization server for you named default. This server is fine to use for development or testing. Click on default and then click on Scopes. Click the button that says Add Scope and name it api. Click on Settings and note your Issuer URL, as this is the base endpoint for requesting tokens.

Create your web server

Open your favorite IDE and navigate to the folder where you wish to store your project. Run the command mkdir *folder* where folder is your desired folder name. Next, run the command cd *folder* to navigate to your newly created project folder. Finally, create a new application by running the command npm init. Follow the wizard to help set up your project.

Now you will want to install your dependencies. You will only need a few for this project. First, you will need Fastify.

npm i fastify@3.3.0

Next, you will need Okta’s JWT Verifier. As I mentioned, this will handle the internal logic of interpreting the JWT provided by Okta’s authentication server and determining if it is valid.

npm i @okta/jwt-verifier@1.0.0

Finally, you will want to get dotenv to store your sensitive data.

npm i dotenv@8.2.0

Once you have installed these, you should add a new file called .env to your root directory. Add the following code to it.

OKTA_CLIENT_ID={yourClientID}
OKTA_ISSUER=https://{yourOktaOrgUrl}/oauth2/default
OKTA_AUDIENCE='api://default'
PORT=3000

Replace {yourClientId} with the Client ID from your Okta application’s settings page. Replace {yourOktaOrgUrl} with your Okta organization URL. This can be found on your Okta Developer Dashboard with the label Org URL.

To provide data to the client you will need data on your server. For this, add a new file called sample-data.json and add the following.

{ "Employees": [ { "userId": "rirani", "jobTitleName": "Developer", "firstName": "Romin", "lastName": "Irani", "preferredFullName": "Romin Irani", "employeeCode": "E1", "region": "CA", "phoneNumber": "408-1234567", "emailAddress": "romin.k.irani@gmail.com" }, { "userId": "nirani", "jobTitleName": "Developer", "firstName": "Neil", "lastName": "Irani", "preferredFullName": "Neil Irani", "employeeCode": "E2", "region": "CA", "phoneNumber": "408-1111111", "emailAddress": "neilrirani@gmail.com" }, { "userId": "thanks", "jobTitleName": "Program Directory", "firstName": "Tom", "lastName": "Hanks", "preferredFullName": "Tom Hanks", "employeeCode": "E3", "region": "CA", "phoneNumber": "408-2222222", "emailAddress": "tomhanks@gmail.com" } ] }

There’s nothing special about this data, but it gives you something to work with.

Next, add a file called server.js. The code for this file follows.

"use strict"; require( "dotenv" ).config(); const jwtVerifier = require( "./jwtVerifier" ); const fastify = require( "fastify" )( { logger: true } ); const fs = require( "fs" ); const util = require( "util" ); const readFile = util.promisify( fs.readFile ); fastify.route( { method: "GET", url: "/employees", schema: { response: { 200: { type: "array", properties: { userId: { type: "string" } } } } }, preHandler: async ( request, reply ) => { return jwtVerifier( request, reply ); }, handler: async ( request, reply ) => { const obj = JSON.parse( await readFile( "sample-data.json", "utf8" ) ); return obj.Employees; } } ); fastify.route( { method: "GET", url: "/employees/:userId", schema: { querystring: { userId: { type: "string" } } }, preHandler: async ( request, reply ) => { return jwtVerifier( request, reply ); }, handler: async ( request, reply ) => { const obj = JSON.parse( await readFile( "sample-data.json", "utf8" ) ); const employee = obj.Employees.find( r => r.userId === request.params.userId ); if ( !employee ) return reply.code( 404 ).send(); return employee; } } ); const start = async () => { try { await fastify.listen( process.env.PORT ); fastify.log.info( `server listening on ${ fastify.server.address().port }` ); } catch ( err ) { fastify.log.error( err ); process.exit( 1 ); } }; start();

Here is the entire code you need to run an API with Fastify. First, you register dotenv with your application. Per their instructions, you should call the config() function at the earliest point possible in the application.

Next, you are defining two GET routes. The first will return all the employees and the second will return a specific employee. Both contain an option for preHandler, where you call your jwtVerifier. You will create this object shortly, but the trick here is that the user is authenticated before the handler is called. In the route for /employees/:userId you are defining userId as a string; however, if your data were an integer or an array, you could define the argument that way instead. Also notice that since this is a GET, the parameters are defined in the querystring option; for POST or PUT requests, there is a body option if needed. The handlers are straightforward. Both read the JSON data from your sample-data file and return the appropriate entity.

Finally, you start the server by calling fastify.listen on your desired port. Here you are using 3000 as defined in your .env file.

To verify the token you need to add a file called jwtVerifier.js. Add the code below to it.

"use strict"; const OktaJwtVerifier = require( "@okta/jwt-verifier" ); const oktaJwtVerifier = new OktaJwtVerifier( { issuer: process.env.OKTA_ISSUER, clientId: process.env.OKTA_CLIENT_ID } ); module.exports = async ( request, response ) => { const { authorization } = request.headers; if ( !authorization ) { response.code( 401 ).send(); } const [ authType, token ] = authorization.trim().split( " " ); try { const { claims } = await oktaJwtVerifier.verifyAccessToken( token, process.env.OKTA_AUDIENCE ); if ( !claims ) { response.code( 401 ).send(); } if ( !claims.scp.includes( "api" ) ) { response.code( 401 ).send(); } } catch ( err ) { console.log( err ); response.code( 401 ).send(); } };

This code is mostly just a wrapper around the @okta/jwt-verifier package. You are splitting out the header and obtaining the token from it. Then pass the token to the OktaJwtVerifier along with the desired audience. The verifier will throw an exception if the token is invalid and then it can be caught and a 401 can be returned to the client indicating it is unauthenticated.

Test your service

Run the command npm start and see your API come to life. You will want to run a few tests to ensure the data returned is correct and that requests are being authenticated properly. Open your favorite REST client. I use Advanced Rest Client; Postman is another popular option.

First, try to call GET http://localhost:3000/employees without any authentication headers. You should receive a 401 Unauthorized as a response.
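
If you prefer to script the check rather than click through a REST client, a quick sketch in Node (assuming Node 18+ for the built-in fetch) looks like this:

// Expect a 401 when no Authorization header is sent.
fetch("http://localhost:3000/employees")
  .then((res) => console.log(res.status)); // 401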

Next, call Okta’s authentication server token endpoint. You will need to set the Content-Type to application/x-www-form-urlencoded and include the following parameters: grant_type=client_credentials, scope=api, client_id={yourOktaClientId}, client_secret={yourOktaClientSecret}. Sending this request should return a JWT along with some additional information about the token.
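
For reference, here is what that token request might look like scripted in Node (again assuming Node 18+; the placeholders are the same ones used in your .env file). The token endpoint lives under the issuer URL at /v1/token:

const params = new URLSearchParams({
  grant_type: "client_credentials",
  scope: "api",
  client_id: "{yourOktaClientId}",
  client_secret: "{yourOktaClientSecret}"
});

// POST the form-encoded parameters to the default authorization server.
fetch("https://{yourOktaOrgUrl}/oauth2/default/v1/token", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: params
})
  .then((res) => res.json())
  .then((data) => console.log(data.access_token));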

Now, with the JWT in hand, you can call GET http://localhost:3000/employees again, but this time include the access_token as an Authorization header using the Bearer prefix. This time your request should be authenticated.
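
Scripted, the authenticated request might look like this (paste the token from the previous step into accessToken):

const accessToken = "<paste the access_token from the previous step>";

// The Bearer prefix tells the server which auth scheme the token uses.
fetch("http://localhost:3000/employees", {
  headers: { Authorization: `Bearer ${accessToken}` }
})
  .then((res) => res.json())
  .then((employees) => console.log(employees));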

The complete source code for this project is available on GitHub.

Learn More

To continue learning about building APIs in Node.js, check out these links:

Build a Simple REST API with Node and OAuth 2.0
Build A Secure Node.js API with KoaJS
Build a REST API with Node and Postgres

If you like this blog post and want to see more like it, follow @oktadev on Twitter, subscribe to our YouTube channel, or follow us on LinkedIn. As always, please leave a comment below if you have any questions.


Smarter with Gartner - IT

How to Market to B2B Technology Buyers

When it comes to today’s B2B technology buyers, how you market your technology has become as important, if not more so, than what you sell. As a seller, you have to navigate customers’ organizational politics and help them make the business case to others for purchasing your technology. 

This means you must understand who your buyers are and what they want. Who holds the buying and decision-making power? What’s the budget? Is there a COVID-19 impact to consider? If so, what does that look like post-purchase? 

“When you can answer those questions you are better able to help your buyers buy,” says Derry Finkeldey, VP Analyst, Gartner. “As one B2B customer put it, ‘I appreciate a company that is able to create a customized experience to meet the needs of my company.’”

Gartner has identified three fundamentals of B2B buying behaviors that sellers must understand and adapt to in order to ensure more high-quality deals — those that leave the customer feeling they have received the value they expected from their technology purchase. 

Read more: Technology CEOs: Create Valuable Buyer Urgency for Shorter Sales Cycles

Organizations buy technology in teams

B2B deals of any significance are bought by cross-functional teams, not individuals. This means organizational practices and perspective override individual stakeholder preferences. Selling technology to such teams can be tough to navigate given that the average tech purchase involves between 14 and 23 people, the majority of whom (80%) are in senior operations or product roles. And, as the spend increases, so will the size of the buying team. 

Appeal to the buying team, not the individuals on it. This is not to say you should discount central IT. That team remains vitally important. While the organization will be focused on outcomes, IT will be focused on how those outcomes are achieved. You have to sell to both dimensions.

“Although counterintuitive, having multiple stakeholders from multiple functions on the buying team makes for more high-quality deals — if you have done your job and satisfied them all,” says Finkeldey. “Your customers might not have figured this out, so encourage them to do so.”

Read more: Technology GMs: Adjust Your Product Strategies and Vision for COVID-19

Nothing significant is bought without a business case

Within the enterprise, any tech purchase usually indicates cost and change. The majority of B2B buyers (93%) require a business case for all technology solutions. Organizations need to see that investments will have a measurable impact on their metrics. And although this has become extremely important amid the current crisis, don’t expect the enhanced level of scrutiny to ease off moving forward. 

Buyers are open and looking for help to develop the business case. In fact, successful B2B tech providers participate in business case development in nearly half of all deals. The easier you make it for the customer to buy, the more likely you’ll secure the win. 

Don’t just focus on winning the business. Focus on how to make the customer successful

Start by helping your customers understand how your offering drives organizational-level objectives and why the investment needs to be prioritized now. You want to sell the value of your offering, not its capabilities. Doing so can also help unite diverse stakeholders on the buying team.

 “Show them that your offering is worth any changes it causes,” says Finkeldey. “Most importantly, don’t just focus on winning the business. Focus on how to make the customer successful.”  

Read more: 3 Actions for Cloud Providers to Support Customers Through COVID-19

Organizations struggle to buy technology effectively 

Organizations continue to increase their tech investments, but most struggle to do so effectively. They have difficulty building teams, managing complexity and dealing with competing priorities. According to Gartner research, 74% of B2B tech buyers found the buying process complex; only 27% reported achieving a high-quality deal.

Their challenges create an opportunity for technology service providers. The most successful help customers overcome their buying challenges by sharing these best practices, which reduce the complexities of purchase decisions: 

Build confidence in the buying process

Visualize the steps to change to capture value from the investment

Encourage best practices in decision making and behave as though you are on the decision-making team 

These best practices not only reduce the complexity of the B2B buying process, they inspire better buying that leads to renewal and growth opportunities. 

The post How to Market to B2B Technology Buyers appeared first on Smarter With Gartner.


MyKey

MYKEY Weekly Report 20 (October 5th~October 11th)

Today is Monday, October 12, 2020. The following is the 20th issue of MYKEY Weekly Report. In the work of last week (October 5th to October 11th), there are mainly 2 updates:

1. MYKEY was recommended in The DeFi List of DEFI PULSE

MYKEY was recommended in The DeFi List of [Wallets] of DEFI PULSE.

2. Three Dapps were listed in MYKEY last week

PDDEX, DeFiner, DeFi Pulse were listed in MYKEY last week, welcome to search and use.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall MYKEY APP, please contact MYKEY Assistant: @mykeytothemoon in Telegram.

!!! Remember to keep the 12-digit recovery phrase properly from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 20 (October 5th~October 11th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 11. October 2020

KuppingerCole

KuppingerCole Analyst Chat: There is More to IAM Processes than JML

When asked to describe IAM processes, managers tend to think first of traditional lifecycle management processes such as Joiner, Mover and Leaver (JML). While these are clearly essential for identity governance in interplay with authoritative sources, a comprehensive process framework for IAM and beyond encompasses many other areas. Martin Kuppinger and Matthias Reinwarth explore some of these additional areas between convenience and compliance.



Saturday, 10. October 2020

Ontology

Ontology Weekly Report (October 1–8)

This week we welcomed the September campaign for special NFT medals, where users can bring home the corresponding Wing NFT medal or Flamingo NFT medal. Furthermore, Ontology continued performance tests with bloXroute, reaping achievements that include the case study on bloXroute’s value for DeFi applications such as UniSwap. As the Founder of Ontology reminded all Wing DAO community members in his letter, now that WING has returned to a governance token, its value will grow increasingly visible to community members.

Back-end

- Completed 10% of the update to the Wasm-NeoVm cross-protocol debugging tool

Product Development

ONTO

- ONTO v3.5.0 released, adding the Wrapper module and Vault module for Flamingo

- ONTO v3.5.3 released, updating the Wrapper module for Flamingo and launching the Wing dApp

dApp

- 85 dApps now live on Ontology

- 6,143,050 dApp-related transactions since genesis block

- 25,317 dApp-related transactions in the past week

Bounty Program

- 2 new applications for the Technical Documentation Translation

Community Growth

- 1,031 new members onboarded across Ontology’s Pakistan, Vietnamese and French communities

Newly Released

- On September 30, the September campaign for special NFT medals was initiated. Users who supply assets to Wing Flash Pool or deposit assets in Flamingo Vault will receive the corresponding Wing NFT medal or Flamingo NFT medal.

- On October 5, Jun LI, founder of Ontology, sent a letter to the Wing DAO community. Wing had a promising start with a high TVL and APY, but the value of the WING token will be more visible to community members as it grows as a governance token. The Wing team will continue to take into account the community’s feedback to further improve the governance rules, simplify the process for member governance participation, perfect the governance protocol and enable the protocol to be applied to more scenarios.

- Ontology continued performance tests with bloXroute. A test comparing Infura/Alchemy to the bloXroute Cloud-API has been completed, as well as a case study on bloXroute’s value for DeFi applications such as Uniswap. The Ontology BDN integration has also been completed.

Global Events

- On September 28, Jun LI, founder of Ontology, made a keynote speech at POW’ER 2020 DeFi Innovation Summit hosted by Mars Blockchain on “What Barricades the Mass Adoption of DeFi?” He said, “The main reason behind the slow growth of DeFi users lies in an astonishingly small number of assets, and the slow speed at which stocks, equities, real estate properties, and other real assets are introduced into the DeFi sector. For DeFi to grow out of the current swamp, more types of assets need to be introduced to DeFi platforms, and the credit mechanism needs to be incorporated into the DeFi field to bridge the gap between decentralized finance and traditional finance.”

- On September 26 and 28 respectively, Jun LI was invited by Suanli Think-tank, a platform dedicated to the digital economy, on two occasions to elaborate on how to design an innovative DeFi platform based on credit and to brief the audience about “Super Oracle” and how oracles, as bridges in the data world, would connect on-chain assets with the real world.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (October 1–8) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 09. October 2020

Forgerock Blog

ForgeTalks: Citizen Identity & Access Management

Welcome back to another episode of ForgeTalks! All around the world public sector organizations are trying to provide better and more secure digital experiences for their citizens. Here at ForgeRock, we believe that digital identity can help enable these experiences. With the rise of security breaches, online services, remote citizen and workforce user demands, digital transformation is a must. In this week's episode of ForgeTalks, I was joined by Tommy Cathey, ForgeRock RVP for Public Sector, to talk about citizen identity and access management.

This week we discussed: 

How can digital identity help public sector organizations modernize their digital experiences for their citizens? What are some exciting recent developments for US public sector organizations? And why are they important? 

I hope you enjoy this episode of ForgeTalks. If you want to check out any of our other episodes, you can do so here.


Self Key

The September Progress Report is here!🔔

SelfKey Weekly Newsletter

Date – 07th October, 2020

September has seen many new updates and releases regarding SelfKey. Read all about it in the September progress report.

The post The September Progress Report is here!🔔 appeared first on SelfKey.


Otaka

Test in Production with Spring Security and Feature Flags

Table of Contents

Get Started with Okta
Register for an Okta Org
Create an OIDC App in your Okta Org
Configure the OIDC App to Return Groups
Get Started with Spring Boot
Integrate with Okta
Create a Basic Thymeleaf Template
Get Started with Feature Flags
Create the Treatment in Split
Integrate the Treatment with Spring Security
Make Your New Functionality Generally Available
How to Repeat the Beta / Release Cycle
Learn More About Building Secure Applications

Okta is an Identity and Access Management platform. The TL;DR: you offload the responsibility for secure authentication and authorization to Okta so you can focus on the business logic of the app you’re building.

Okta and Spring Boot already go together like peanut butter and chocolate. Add in feature flags care of Split, and you can test new capabilities for your app without having to redeploy. That’s testing in production the smart way! And, you can leverage Okta’s groups to easily manage who should see the new stuff and who sees the old stuff.

In this post, I start with setting up an Okta org and configuring an OpenID Connect app. Next, I integrate Okta into a simple Spring Boot and Spring Security app. Finally, I add feature flags to deliver a new UI experience for select users - those that belong to a particular group.

Get Started with Okta

OpenID Connect (OIDC) rides on top of OAuth 2.0 for a modern Single Sign-on, authentication and authorization standard. Okta provides these standards as a service. Don’t know anything about these standards yet? Good news - you don’t have to! By following some simple instructions to provision an Okta org and set up a hosted instance of OIDC, you can easily integrate a Spring Boot app with just configuration. Let’s get started with three easy steps:

If you’re interested in learning more about OIDC and OAuth 2.0, here and here are good places to start. Look for more links to posts on OIDC and OAuth 2.0 at the end of this post.

Register for an Okta Org

Head on over to: https://developer.okta.com/signup. Fill out the form and click Get Started.

You’ll get an email from Okta to confirm your address. Click on ACTIVATE MY ACCOUNT. You’ll next see an onboarding page in your Okta Org. You can skip this for now.

Let’s add a few users and groups for use later in the Spring Boot app.

Add Users to your Okta Org

Click Users on the menu bar at the top. Here, you’ll see the user already created for you with the name and email you submitted to create the Okta org.

Click Add Person. This will bring you to the input form for adding a new user:

Change the Password field to Set by admin and uncheck User must change password on first login. Create the following users by filling out the form and clicking on Save and Add Another for each:

First name | Last name | Username           | Primary email      | Password
Bob        | Belcher   | bob@belcher.com    | bob@belcher.com    | 123456aA$
Linda      | Belcher   | linda@belcher.com  | linda@belcher.com  | 123456aA$
Tina       | Belcher   | tina@belcher.com   | tina@belcher.com   | 123456aA$
Gene       | Belcher   | gene@belcher.com   | gene@belcher.com   | 123456aA$
Louise     | Belcher   | louise@belcher.com | louise@belcher.com | 123456aA$

Next, you’ll add some groups and assign some users to those groups.

Add Groups to your Okta Org

Choose Users > Groups from the top level menu. Here, you’ll see the built in Everyone group. As you might imagine, every current and new user is automatically added to this group.

Click Add Group. This will bring you to the input form for adding a new group:

Enter BETA_TESTER in both the Name and Group Description fields. Click Add Group.

Click the link to the newly created BETA_TESTER group. Click Manage People. Click on each of Tina, Gene, and Louise on the left. Notice that those users are moved from the left to the right, indicating that they will be members of the group. Click Save to finalize these changes.

Now, your Okta org is configured with five users (the Belcher family), three of whom belong to the BETA_TESTER group.

Create an OIDC App in your Okta Org

Next up, you’ll create an OIDC app that the Spring Boot app uses below. A deep dive into how OIDC works is outside the scope of this post, but check the links at the bottom if you’re interested in learning more about OIDC.

Click Applications from the top menu bar.

Click Add Application. Choose Web from the available app types and click Next. Update the Name field. Change the Login redirect URIs to: http://localhost:8080/login/oauth2/code/okta. Leave all the other fields as their defaults and click Done at the bottom.

At the top of the General tab, you’ll see the Client Credentials section. Copy the values for Client ID and Client secret. You’ll use these values below to configure Spring Boot.

Configure the OIDC App to Return Groups

The last Okta configuration step is to make sure that the list of groups a user belongs to is returned when a user authenticates. You’ll see below that this integrates very easily with Spring Security.

Click API > Authorization Servers from the top menu bar. Click the default link under the list of Authorization Servers (it should be the only one right now).

Click the Claims tab. Click Add Claim:

Fill the form in with the following (leave anything not named as default):

Field                 | Value
Name                  | groups
Include in token type | ID Token, Always
Value type            | Groups
Filter                | Matches regex .*

Click Create to finish.

Your Okta org is all ready to go to provide authentication and authorization services to your application. This is the great thing about Okta - with a little bit of configuration, you get all the services you need for auth allowing you to focus on the primary mission of your app.

All of the configuration you did in this section can be done via the Okta Management API. If this is something that interests you, check out the documentation here.

Next up, you’ll wire up Spring Boot with Spring Security to your Okta org. When you see how easy it is, your mind might just get blown.

Get Started with Spring Boot

The Spring team over at Pivotal has done an amazing job of making getting started with Spring Boot super easy. The Spring Initializr project over at https://start.spring.io allows you to select everything you need for the app you want to build.

For our purposes (and to keep things simple), you just need:

Component  | Description
Spring Web | RESTful APIs and Spring MVC
Thymeleaf  | Server-side Java templating engine
Okta       | Easy integration with Okta; includes Spring Security

Spring Initializr even makes it easy to load a pre-configured project from a direct link.

And, you can download the project from the command line with:

curl -G \
  --data 'type=maven-project' \
  --data 'language=java' \
  --data 'bootVersion=2.3.4.RELEASE' \
  --data 'baseDir=okta-split-example' \
  --data 'groupId=com.okta.examples' \
  --data 'artifactId=okta-split-example' \
  --data 'name=okta-split-example' \
  --data 'packageName=com.okta.examples.okta_split_example' \
  --data 'packaging=jar' \
  --data 'javaVersion=11' \
  --data 'dependencies=web,thymeleaf,okta' \
  --data-urlencode 'description=Feature Flags with Okta, Split and Spring Security' \
  https://start.spring.io/starter.zip \
  -o okta-split-example.zip

Expand the downloaded zip archive and open it in the IDE of your choice.

You can also find the completed application over on the Okta Developer GitHub repo.

Integrate with Okta

In order for this Spring Boot app to connect to your Okta org, you need to set up a src/main/resources/application.yml file with just three configuration parameters:

okta:
  oauth2:
    issuer: <yourOktaDomain>/oauth2/default
    clientId: <oidc client id>
    clientSecret: <oidc client secret>

At this point, the Spring Boot app is ready to run! Execute:

./mvnw spring-boot:run

Navigate to: http://localhost:8080. You should immediately be redirected to Okta to authenticate. After authenticating, you should end up with a 404 error, as you haven’t added any content to your app yet. You’ll do that next.

Create a Basic Thymeleaf Template

The last two pieces that are needed for a basic Model-View-Controller (MVC) app are a controller and a view in Spring Boot.

Create a Controller

Add a file called HomeController.java with the following:

import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.security.oauth2.core.oidc.user.OidcUser;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;

@Controller
public class HomeController {

    @GetMapping("/")
    public String home(@AuthenticationPrincipal OidcUser user, Model model) {
        model.addAttribute("username", user.getPreferredUsername());
        model.addAttribute("roles", user.getAuthorities());
        return "home";
    }
}

Using the @AuthenticationPrincipal annotation, the authenticated OidcUser is autowired into the home method. A Model object is passed into the method as well.

The Model object is updated with the authenticated user’s username and a list of the Spring Security authorities associated with that user.

Since the project is configured with the Thymeleaf templating engine, returning a String at the end of the controller method will automatically return a template with that name. In this case, Thymeleaf will look for a template named: home.html.

Create a View

Create a file called src/main/resources/templates/home.html with the following:

<html xmlns:th="http://www.w3.org/1999/xhtml">
<head></head>
<body>
<h2>
    <span th:inline="text">Hello, [[${username}]]!</span>
</h2>
<p/>
Here are your roles:
<br/>
<ul th:each="role : ${roles}">
    <li th:inline="text">[[${role}]]</li>
</ul>
</body>
</html>

This template displays the username from the model. It then uses the Thymeleaf construct th:each to iterate over the list of roles from the model.

Restart the application, navigate once again to http://localhost:8080 and login as: tina@belcher.com.

Amongst a number of default roles and scopes, you should see that she has the BETA_TESTER role.

At this point, the Spring Boot + Spring Security application is fully functional (such as it is).

Next, I talk about feature flags with Split and then I bring Spring and Split together.

Get Started with Feature Flags

Split is a platform for combining feature flags and experiments with data, giving you more confidence in releases of your app.

In the example for this post, you integrate a split treatment such that a select group of users sees a new ‘beta’ interface for your app while ordinary users see the current production interface.

When our imaginary beta testing is complete, you can enable the new interface for all users.

Best of all, you could repeat this process, creating new beta experiences to be tested by your group and eventually making them generally available.

Getting set up with a free developer account for Split is as easy as 1, 2, 3:

1. Go to https://split.io and click Free Account.
2. Fill out the registration form and click SIGN UP.
3. Follow the link you receive in email and set a password.

Create the Treatment in Split

Treatments allow you to define settings and behaviors for what you want to test. For this example, you want to set up a treatment that returns on or off depending on whether or not the user is part of the beta tester group.

To start, click DE in the upper left. Choose Admin Settings and then API Keys. Copy the sdk key value for the prod-default Environment. You’ll need this in the Spring Boot app shortly.

NOTE: If you’re new to using Split and/or on the free tier, the button in the upper left will say DE for default. If you’ve set up multiple workspaces, then the button will be labeled with the first two letters of the workspace name.

Next, click Splits on the left-hand side and click Create Split. Give it a Name (the controller code below targets a treatment named BETA_UI_EXPERIENCE). Leave the other defaults and click Create.

Next, click Add Rules on the Targeting Rules tab. Split automatically adds on and off treatment definitions and sets off as the default.

For the use-case in this example, we want to add a group for which the treatment will return a value of on.

Click Add Rule in the Set Targeting Rules section. Here, we want to have the treatment return on if the user is in the group of beta testers. To accomplish this, enter groups in the Add attribute field. From the Select matcher dropdown, choose Set > has any of and enter BETA_TESTER in the field. Change the serve dropdown to on.

This now makes the rule read like an English sentence: “If the user has an attribute called groups and the groups list contains the value BETA_TESTER, then serve ‘on’ for the treatment.”

Click Save Changes

Click Confirm on the summary screen.

Integrate the Treatment with Spring Security

Edit the pom.xml file in the project. Add the following dependency:

<dependency>
    <groupId>io.split.client</groupId>
    <artifactId>java-client</artifactId>
    <version>4.0.1</version>
</dependency>

You can get the complete application, including the integration with Split over at the Okta Developer GitHub repo.

This brings the Split Java SDK into scope for the project.

Next, add a configuration to the project to make the Split Java Client available to the application. Here’s SplitConfig.java:

import io.split.client.SplitClient;
import io.split.client.SplitClientConfig;
import io.split.client.SplitFactory;
import io.split.client.SplitFactoryBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SplitConfig {

    @Value("#{ @environment['split.api-key'] }")
    private String splitApiKey;

    @Bean
    public SplitClient splitClient() throws Exception {
        SplitClientConfig config = SplitClientConfig.builder()
                .setBlockUntilReadyTimeout(1000)
                .enableDebug()
                .build();
        SplitFactory splitFactory = SplitFactoryBuilder.build(splitApiKey, config);
        SplitClient client = splitFactory.client();
        client.blockUntilReady();
        return client;
    }
}

Add the following to the src/main/resources/application.yml file:

split:
  api-key: <your Split API Key>

Notice that it’s using the @Value annotation to pull in the Split API key from the environment. This is a best practice: you should never hardcode an API key into an application or commit it to a git repo. In this case, application.yml is listed in the .gitignore file to ensure it’s not added to the git repo.
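As a sketch, the corresponding .gitignore entry is the single line below (assuming the file lives at the default location):

src/main/resources/application.yml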

This is the key line that sets up the Split Client for use elsewhere in the code:

SplitFactory splitFactory = SplitFactoryBuilder.build(splitApiKey, config);

Let’s set up a new template called home-beta.html (It’s mostly copypasta from the original template and should be located in: src/main/resources/templates):

<html xmlns:th="http://www.w3.org/1999/xhtml">
<head></head>
<body>
<h1>WELCOME TO THE BETA EXPERIENCE</h1>
<h2>
    <span th:inline="text">Hello, [[${username}]]!</span>
</h2>
<p/>
Here are your roles:
<br/>
<ul th:each="role : ${roles}">
    <li th:inline="text">[[${role}]]</li>
</ul>
</body>
</html>

The last piece of the puzzle is in the HomeController. I want to have the app render the new home-beta template if the authenticated user is in the BETA_TESTER group. Here’s the updated controller:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import io.split.client.SplitClient;
import org.springframework.security.core.GrantedAuthority;
import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.security.oauth2.core.oidc.user.OidcUser;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;

@Controller
public class HomeController {

    SplitClient splitClient;

    public HomeController(SplitClient splitClient) {
        this.splitClient = splitClient;
    }

    @GetMapping("/")
    public String home(@AuthenticationPrincipal OidcUser user, Model model) {
        model.addAttribute("username", user.getPreferredUsername());
        model.addAttribute("roles", user.getAuthorities());

        List<String> groups = user.getAuthorities().stream()
                .map(GrantedAuthority::getAuthority).collect(Collectors.toList());

        String inBeta = splitClient.getTreatment(
                user.getPreferredUsername(), "BETA_UI_EXPERIENCE", Map.of("groups", groups));

        return "on".equals(inBeta) ? "home-beta" : "home";
    }
}

The first thing to notice is that I am injecting the SplitClient using constructor dependency injection.

The real magic happens with the splitClient.getTreatment call. The first parameter is the username provided by Spring Security for the authenticated user.

Note that in much of the Split documentation, this first parameter is referred to as a key. Don’t confuse this with the API Key, which should NEVER be used as the first parameter to the getTreatment call.

The second parameter is the name of the treatment in Split that we want to target.

The third parameter sends a map where the key is groups and the value is the list of group names that the authenticated user belongs to. In the case of our user linda, this will be Everyone (among some other defaults). In the case of our user louise, this will be Everyone and BETA_TESTER.

The last line of the controller method now returns the home-beta template if the result of getTreatment is on and the home template otherwise.

Fire up the app and try to login as linda in an incognito window. You should see the same home template as before. Kill that incognito window, open another one and login as louise. You should see the new beta template.

Make Your New Functionality Generally Available

Now that we have different treatments of our home template for regular users and beta testers, you may be wondering how to make the beta template available to everyone.

Split makes it easy-peasy. Go back to your Split definition for BETA_UI_EXPERIENCE and switch the serve setting in the Set The Default Rule section from off to on. Save and Confirm the change.

Without restarting the Spring Boot application, login as linda again. You should now see the same page that you saw for louise earlier.

Pretty cool, eh?

How to Repeat the Beta / Release Cycle

With this architecture in place, it’s now very easy to set up a new beta cycle. The steps would be something like this:

1. Copy the home-beta.html template to home.html (now that it’s ready for production)
2. Create a new home-beta.html template
3. Set the default rule back to off in Split
4. Redeploy the app
5. Let your beta testers test the new experience
6. When ready, set the default rule back to on in Split. NO NEED TO REDEPLOY.

You could do this over and over again and never touch the controller code. The only things that change are the templates and the settings in Split.

This approach also lends itself to changing who is in your beta test program without having to change your code.

In a real application, you’d be working with a database or an Identity Management system where you could add and remove users from the BETA_TESTER group. With Okta, you can easily manage who belongs to the group from the admin console or via the Okta Management API. Those users would always see the latest and greatest beta while ordinary users would see only the current release.

Learn More About Building Secure Applications

I hope you’ve seen how useful it can be to set up different experiences for different users using Okta, Split and the native functionality built into Spring Security.

To continue learning about authentication, authorization and feature flags and experimentation, check out these links:

Easy Session Sharing in Spring Boot with Spring Session and MySQL
Deploy a Secure Spring Boot App to Heroku
Use PKCE with OAuth 2.0 and Spring Boot for Better Security
Leverage Spring Security to Test in Production
Build a CRUD App with Spring Boot and MongoDB
7 Ways Feature Flags Improve Software Development

If you like this blog post and want to see more like it, follow @oktadev on Twitter, subscribe to our YouTube channel, or follow us on LinkedIn. As always, please leave a comment below if you have any questions.


MATTR

New to JSON-LD? Introducing JSON-LD Lint

JSON-LD, based on the ubiquitous JSON technology, is rapidly gaining adoption on the web. JSON-LD is an innovation relevant to both business minds and developers alike. For those unfamiliar with this technology, this short video is a great introduction. At MATTR we use JSON-LD in a variety of ways. For example, the platform generates credentials using this technology so that they can be inherently understood and referenced.

Despite its growing adoption, the success of standards-based technologies like JSON-LD tends to depend on how quickly and easily developers can understand them. Developers rely on tools such as compilers, IDEs (integrated development environments) like Visual Studio Code, and linters to provide them with guidance and feedback as they code. These tools are essential for facilitating developer productivity and education.

When it comes to JSON-LD, many have observed that there are limited educational tools and resources available. The lack of training wheels in the space creates a barrier to entry, or results in developers breaking things along the way.

Having been on this journey ourselves, we want to make it easier for developers to pick up JSON-LD. That’s why we have developed a linter, which we are open-sourcing today.

Specifically, we are open-sourcing a mono-repo of packages (“JSON-LD Lint”) designed to lint/process JSON-LD documents. These packages are:

JSON-LD Lint Core – A typescript/javascript library containing the core linting engine for JSON-LD documents
JSON-LD Lint CLI – A command line tool for linting/processing JSON-LD documents
JSON-LD Lint VSCode Extension – A VS Code extension aimed at providing an improved development experience within VS Code when dealing with JSON-LD documents (coming soon to the VS Code Marketplace)

We hope that these packages will help more developers to understand and adopt this technology. As always, we appreciate your feedback and welcome your involvement in progressing this project further! Head along to our GitHub to get involved. You can also gain access to MATTR’s sandbox platform to issue your own JSON-LD credentials today.

FAQ

What is JSON-LD and why is it on the rise?

The rise in popularity of javascript (due to its natural monopoly as the language of web browsers) led to a mass exile from XML and a shift over to JSON as the preferred data representation format. In the process, certain valuable features of XML were lost, in particular those that provide a standardised semantic syntax. JSON-LD defines this missing layer of syntax, which improves semantic reasoning around data. This is critical for maintaining data quality and trust in data, which is particularly important as we increase our reliance on digital infrastructure, IoT and AI. 
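To make that concrete, here is a small, made-up JSON-LD document. It is plain JSON, but the @context line maps the keys to the shared schema.org vocabulary so any consumer can interpret them unambiguously:

{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "url": "https://example.com/jane"
}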

What is a Linter?

Developers are renowned for building tools that make their job easier – whether it be through automating previously manual processes or designing tools that help to catch their mistakes. The number of tools available has grown in tandem with the open source movement. 

In general, a linter is a tool that analyzes some input (often source code) and flags errors, bugs, stylistic issues, and suspicious constructs. It provides developers with feedback on detected issues in their code or input and often includes information on how they could be fixed.

The article New to JSON-LD? Introducing JSON-LD Lint appeared first on MATTR.

Thursday, 08. October 2020

KuppingerCole

How to Hunt Threats Effectively With Network Detection & Response Solutions

The number of cyber-attacks globally continues to rise. Attacks are growing increasingly sophisticated. The tactics, techniques and procedures that were once only used by well-funded state actors are being commoditized by cybercriminals. State actors sometimes employ tools that were formerly mostly used by cybercriminals. The threat landscape evolves continuously.




Forgerock Blog

A Leader in the Wave for Customer IAM

We’ve all experienced the turbo-charged acceleration in digital transformation in the past six months of the pandemic. Working from home, banking from home, shopping from home, and eating your favorite restaurant meals at home are the new normal. We are also living this experience at ForgeRock. We’ve moved to a nearly 100% remote work environment and supported our customers who have relied on us for a seamless and secure online experience over the years. Enabling this transformation for our customers is what drives us as a company. It is also why we are all immensely proud to be recognized by Forrester Research as a Leader in The Forrester Wave: Customer Identity and Access Management (CIAM), Q4 2020. 

Innovation and Execution Matter Most 

Forrester evaluated the 13 most significant identity and access management (IAM) companies against 32 different criteria spanning three categories: current solution offerings, strategy, and market presence. The evaluation of each company included in-depth reviews of product functionality, demonstration of capabilities, and customer references. ForgeRock was named a Leader in this CIAM evaluation, which recognizes both the strength of our current offering and the fact that we earned the highest score in the strategy category among all vendors evaluated.

Strong Security and a Great User Experience Are Essential for CIAM

One  prominent theme in this report is the shift in CIAM from “just” a security technology to becoming a key component of the online user experience. Forward-looking organizations are seeking identity partners to help acquire and retain customers while providing them with the security, fraud protection, and personalization capabilities to engage and transact across all consumer channels including web, mobile, call center, or in person.  

As a company strongly focused on CIAM, we’ve invested heavily in designing, building, and continually improving our ForgeRock Identity Cloud to meet the needs of our customers. We’ve emphasized the importance of capabilities such as data orchestration and user management, customer identity verification and registration, and consumer self-service. We’ve also invested in high performance and scale because they directly impact the user experience. We enable our customers to securely manage hundreds of millions of identities – with demonstrated performance in excess of 3.6 million authentication transactions per minute – and ensure a seamless user experience.

Looking Ahead

Forrester emphasized future strategy, investment, and execution roadmap as important criteria in their evaluation. It’s all about cloud choice and enabling hybrid deployments. At ForgeRock, we have known this for a while. It serves as a driving force behind our strategy, product, and go-to-market plan. And, it’s what drove our decision to raise $93 million in the first half of the year to continue to invest in our business and the market. 

We believe recognition by Forrester Research is a testament to our momentum and validation of our future  strategy and direction. Increasingly, our platform is becoming mission-critical to the largest organizations in the world – and we take our responsibility to deliver on their expectations seriously. We are incredibly proud that Forrester has named us a leader in CIAM. 


Download a complimentary copy of the The Forrester Wave report here.

 


Smarter with Gartner - IT

Future of Sales 2025: Data-Driven B2B Selling to Drive Digital Commerce

The Gartner Future of Sales 2025 report reveals that 60% of B2B sales organizations will transition from experience- and intuition-based selling to data-driven selling by 2025. Why? Because B2B buyers now prefer to engage with suppliers through digital and self-service channels, making multiexperience selling a must-have. 

More interdependence of people, processes and technology will render the traditional sales models less reliable over time

To stay relevant and drive revenue in a world where many B2B buyers see little need or desire to engage with a seller at all, sales organizations need to build adaptive systems that incorporate hyperautomation of interactions and transactions between sellers and buyers, digital scalability for sellers and artificial intelligence (AI).

“The growing interdependence of people, processes and technology will render the traditional sales models less reliable over time — something for which most sales organizations are unprepared,” says Tad Travis, VP Analyst, Gartner. “Embracing this change means sales leaders must adopt the principles of hyperautomation — accept they have to meet customers where they already are and bring B2B digital commerce into the fold.”

Hyperautomation and AI-based selling

Gartner positions hyperautomation as the effective combination of complementary sets of tools that can integrate functional and process silos to automate and augment business processes. For sales leaders, that means automating sales process steps that were previously very analog and moving customer interactions and transactions into the digital channel they prefer, such as digital commerce.

Traditionally, sales organizations treat sales processes, sales applications, sales data and sales analytics as four distinct practices. But technology is rapidly transforming how sales organizations operate. In the next five years, there will be no separation between sales process, applications, data and analytics, as all four will merge into one single concept: AI for sales.

As sales leaders and IT leaders look to enable selling through all channels, they will have to rethink their sales force deployment model and focus investments on virtual selling and digital commerce channels. To do this, sales leaders and IT leaders should consider these three actions.

1. Build an advanced sales technology roadmap

The top priority for sales technology programs should be to build a roadmap that includes advanced approaches such as predictive analytics and guided selling. AI, such as prescriptive next best actions, can tell sellers what to do to close deals and prospects as quickly as possible. 

2. Prioritize AI-based guided selling

While developing an advanced sales technology roadmap, sales leaders must prioritize where AI-based guided selling functions would be most relevant by identifying the least efficient parts of the sales value chain. This can include processes that require a lot of human educated guessing, such as what to do next on a complex B2B deal. 

3. Invest in technology that attracts new talent and enables virtual selling

Technology not only can optimize processes, it can help sales leaders attract top talent from the Gen Z demographic, who value remote work opportunities and digital collaboration. In addition, investing in technology allows sales organizations to ramp up virtual selling more quickly and improve the buyer experience. This includes arming sellers with high-quality audio and video hardware, and reliable remote meeting platforms that enable sellers to conduct productive customer interactions. 

It’s no secret that the COVID-19 pandemic is accelerating this transformation from insight-based selling to data-driven selling. Sales organizations now must accept that buying preferences have permanently changed and, as a result, so too will the sales organization and the role of sellers. 

The post Future of Sales 2025: Data-Driven B2B Selling to Drive Digital Commerce appeared first on Smarter With Gartner.


One World Identity

IronNet Cybersecurity

IronNet Cybersecurity SVP for Strategy, Partnerships & Corporate Development joins State of Identity to discuss his storied career in national security, IronNet Cybersecurity's unique mission centered around the concept of collective defense, and why shared cybersecurity threat intelligence is even more critical at this junction in time.    

KuppingerCole

Architecting your Security Operations Centre

by Paul Simmonds

A security operations centre (SOC) is a dedicated team, usually operating 24x365, to detect and respond to cybersecurity incidents within your organisation that potentially affect your people and systems. Architecting your SOC properly in terms of technology, processes, people and a close coupling with the organisation is critical if you are to achieve value from implementing a SOC within your organisation.


Cybersecurity Awareness – Are We Doing Enough?

by Alexei Balaganski

It’s October and it means that we are having the European Cybersecurity Month again. ECSM is the European Union’s annual campaign dedicated to promoting cybersecurity among EU citizens and organizations. To be completely honest, I do not remember it being much of a thing in previous years, but apparently, in 2020, cybersecurity awareness is much more important for the European Commission and not without, ahem, a very big reason.

I have always had mixed feelings about the whole notion of “awareness”. On one hand, raising awareness is basically what we analysts do on a daily basis: spreading the word about new security challenges and innovative products that solve them is a major part of our job. On the other hand, it does not really help, does it? We still hear about major data breaches and ransomware attacks every day: I suspect that many people have long become completely desensitized to that news. Well, perhaps learning about the first death of a patient in a clinic hit by ransomware (in my city of all places!) was a notable and unfortunate exception…

So, what are we doing wrong? Are we not putting enough effort into cybersecurity awareness? Should we do it differently somehow? I wish I had a clear-cut answer to these questions… Alas, I don’t think anybody does. However, there are several points we could address. First and foremost, cybersecurity culture should obviously not be limited to a special month. Sufficient for the day is its own trouble, and if people are not constantly reminded of the dangers, they will forget and focus on more relevant aspects of their daily jobs.

Awareness should be about solutions, not problems

Another critical aspect is that awareness alone never amounts to much. People (and organizations) should learn not just about potential troubles: they need to be given concrete solutions for those. For cybersecurity, this includes not only specific security tools but also giving actionable recommendations: how to improve your computer’s security, how to defend against account takeover, how to prepare for a phishing or ransomware attack and where to seek assistance after being hit… Ideally, cybersecurity hygiene has to become a routine part of your daily life like brushing your teeth or locking the front door.

One simple example: how many times have you heard about the dangers of using the same simple password between multiple online services? I’d argue that the public awareness of the issue is very high, and yet, the worldwide most popular password is still “123456”. How about suggesting using a free password manager instead, like LastPass or Dashlane? In enterprise environments, of course, one should look for solutions with centralized management and additional capabilities like Mateso Password Safe. Using such a tool completely changes your “password routine”, making re-using an old password more cumbersome than generating a fresh strong one each time.

Even better, of course, is activating multi-factor authentication whenever available. Alas, there is still no single convenient tool to support all online services, but a combination of a hardware security key like Yubikey and an authenticator mobile app like Authy will have almost all your bases covered. And by the way, forget about changing your passwords regularly and using security questions – these have long been proven useless and are no longer recommended by reputable organizations like NIST.

Awareness in times of Corona

The COVID-19 pandemic that forced so many people to work from home for months has also completely changed the scope of enterprise cybersecurity. For years, strict segregation of work and personal activities has been enforced by security and compliance policies, even for Bring Your Own Devices. Nowadays, when so many employees resort to using their home PCs for remote work, this approach no longer works.

Even worse: the same devices are often used by remotely schooled children, blending the line between home and work security even further and introducing new, unexpected challenges to corporate security departments. Will raising awareness among elementary school students help? Who is supposed to do this job: Teachers? Parents? Parents’ employers? Governments? Or maybe the companies that develop the software used for remote communications?

One thing is certain though: the situation is not going to sort itself out. I believe, some kind of government-backed incentive is necessary, not just for providing consistent guidance and governance across industries, but for supporting private initiatives and sanctioning negligence. Awareness campaigns like ECSM are a useful first step in that direction but other steps must follow soon.

In the meantime, the only sensible way to secure people working from home (including their families, because malware, like COVID, does not differentiate) is for organizations to expand the focus of their cybersecurity efforts beyond just BYOD. Cybersecurity Awareness Training should be an important and perhaps the first step in that direction. However, it has to focus on concrete, actionable, easily understandable guidance delivered in a way that actually makes people want to participate. A popular example of this approach is gamified phishing training. Not only people learn about the dangers of opening suspicious emails or clicking on unverified links, they are naturally incentivized to compete, learn more and apply their knowledge more often.

Expanding the scope of cybersecurity

However, organizations should not stop there. Securing their employees’ home devices should become a part of the corporate cybersecurity strategy. Of course, old-school tools like firewalls and VPNs are not suitable for such scenarios, but this is where security solutions delivered from the cloud come to the rescue. It is obvious that communications security (including not just email, but videoconferencing and online collaboration tools) should not differentiate between company-owned and personal devices. However, the same approach should apply to other fields as well, such as web security or endpoint detection and response. Even though such solutions are more “invasive” in terms of potential privacy issues, modern cloud-native products from companies like Akamai, Cisco or Zscaler offer a range of privacy-enhancing controls to overcome the compliance challenges.

Last but not least, government-issued guidance does not have to be your only source of expertise and best practices. Independent and strictly neutral research from industry analysts can be quite valuable as well, especially when it comes to selecting the most appropriate product for your specific risks and requirements. Check out KuppingerCole’s own research library and do not hesitate to reach out to us if you have questions.

Wednesday, 07. October 2020

KuppingerCole

Techniques for Securing Transactions With Identity Verification and Verifiable Claims

Consumer and Workforce identities are under assault. Cybercrime and fraud are pervasive problems that have only escalated during the pandemic. Even as the number of online and mobile transactions increases, businesses, government agencies, and other organizations are actively searching for solutions to help them minimize fraud and other kinds of cybercrime.




Evernym

Improvements to Connect.Me and our Mobile SDK

It’s been a busy few months for the mobile engineers at Evernym, and we wanted to take a minute to share a little about what we’ve been working on. A new and improved UI After months of user testing and gathering customer feedback, we’re excited to announce that we are making Connect.Me easier and more […]

The post Improvements to Connect.Me and our Mobile SDK appeared first on Evernym.


Mythics Blog

It’s Time to Upgrade to Oracle Database 19c, Here is Why, and Here is How

Well, it's that time again, when the whole Oracle database community will be dealing with questions around…



Trinsic (was streetcred)

Trinsic & Zapier Partner to Bring SSI to 2000+ Applications


G-Suite, Office 365, Slack, Stripe, Asana, Twilio, Trello, Salesforce, LinkedIn, WordPress, HubSpot, Zoom, Typeform, Discord. What are two things these applications have in common?

 

Odds are, you are a user of one or more of them.

 

They now work seamlessly with self-sovereign identity (SSI).

 

In our eternal quest to make SSI easier to adopt, Trinsic partnered with the leading workflow automation platform Zapier to enable Trinsic’s developer community to integrate self-sovereign identity with 2000+ common applications without coding! While Trinsic specializes in building the world’s best developer toolkit for decentralized identity, we recognize that plenty of non-technical people want to build SSI integrations. Zapier is the best tool we found to connect the APIs of various services behind the scenes, making SSI more accessible than ever before.


If you’ve interacted with Trinsic much, you know we issue credentials for all sorts of things. Many of those workflows, including the Stripe, Calendly, and Zoom integrations, were done using Zapier and took less than 15 minutes to set up. We even issued our employees an Employee ID credential using Zapier! But we’re not the only ones who love this feature—see how other innovators are using Trinsic and Zapier to make self-sovereign identity a reality.

Featured use cases

Internet Identity Workshop XXXI

The Internet Identity Workshop (IIW) is arguably the most important digital identity gathering there is. It has been held twice a year for over 15 years now. Many of the internet’s most important digital identity standards, including OpenID, OAuth, and most recently SSI, were pioneered at this conference.

When IIW moved virtual, they wanted to practice some of what their community has been working on for several years. Phil Windley, co-founder of the workshop, set up an Eventbrite integration to enable digital tickets for IIW in the form of verifiable credentials. 


Sign up for IIW and get a digital, verifiable ticket credential here.

Proof Market

Proof Market is a startup working on MedCreds, a personal health wallet that leverages the Trinsic platform. Through its easy-to-use portal, patients can easily & securely collect their COVID test results, fill out medical questionnaires, and more. MedCreds wanted to keep a pulse on the performance of their application, so they set up a Zapier integration between Trinsic and Slack to notify a channel whenever a new registration is made through the platform.

 

Sign up for your MedCreds digital health wallet here.

VoteChain

Early-stage startup VoteChain, led by Marine veteran Robert Seitzberg, is building its first prototype using various no-code tools. With SSI technology, he can ensure that a given voter is eligible to vote. He is using Trinsic and Zapier to automate the credential verification process, so he can demo the working prototype to prospective stakeholders.

How it works

The magic happens behind the scenes, where the Trinsic API is integrated with the Zapier platform using webhooks. Zapier specializes in creating workflows: series of actions spanning multiple apps. Through this process, Zapier can pass data from other apps into Trinsic and vice versa, enabling completely automated credential exchange workflows.


In Zapier, a trigger is an event that initiates an automated workflow, and actions are the events that occur within that workflow. In the example below, a new registrant in Eventbrite is the trigger of the workflow. The actions that follow are issuing a credential, registering the attendee for a webinar, and sending a confirmation email to the new event attendee.
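If you want to fire a trigger from your own code rather than from one of the built-in apps, Zapier’s generic “Webhooks by Zapier” trigger exposes a Catch Hook URL that accepts a plain JSON POST. The C# sketch below shows the idea; the hook URL and payload fields are illustrative placeholders, not values from the Trinsic integration itself:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ZapierTriggerSketch
{
    static async Task Main()
    {
        // Placeholder URL; use the Catch Hook URL Zapier generates for your Zap.
        const string hookUrl = "https://hooks.zapier.com/hooks/catch/000000/example/";

        // Example event payload; Zapier maps these fields to downstream actions,
        // such as issuing a credential through the Trinsic action described above.
        var payload = "{\"attendeeName\":\"Alice\",\"email\":\"alice@example.com\"}";

        using var http = new HttpClient();
        var response = await http.PostAsync(
            hookUrl, new StringContent(payload, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode); // 200 OK means Zapier accepted the event
    }
}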


With integrations to over 2000 other applications, the sky is the limit. Build an automated credential exchange workflow in less than 15 minutes by following these instructions:

First, plan out your use case. Maybe you’re following the lead of the Internet Identity Workshop and issuing verifiable credentials as event tickets. Whatever you’re doing, make a plan for how you want the integration to work.

Next, head to the Trinsic Studio and create an Organization. You’ll need the API keys for the Organization in order to use Zapier.

Once in the Trinsic Studio, set up a credential template, verification template, or whatever else you need for your use case.

Finally, set up the flow in Zapier and follow the prompts you’re given.


Not sure where to start? Try one of the following integrations or check out our documentation:

Build your own integrations

While this is indeed a huge step forward for self-sovereign identity adoption, the real adoption is coming from the hundreds of developers who use the Trinsic platform for the applications they’re building. While we plan to continually improve the Zapier functionality, it will never match the power and flexibility of using the APIs directly.

 

With Trinsic’s fully-loaded package of 3 APIs, a front-end Studio, robust documentation, and SDKs in popular languages, your team is equipped with the flexibility and functionality needed to build something extraordinary. What integrations will you build?

 

Trinsic Studio: An easy-to-use web interface for managing credential exchange with no code. Also serves as the mechanism to acquire API keys and manage billing for paid plans. Try it for yourself completely free, and issue a credential in less than 5 minutes!

Provider API: Our newest API enables developers to programmatically provision issuer and verifier cloud agents. Learn more about the provider API in the recent launch announcement.

Credentials API: Our core API enables developers to have a turnkey way to issue, verify, and manage verifiable credentials on any Hyperledger Indy network. Check out our documentation or one of our reference applications to get started.

Wallet API: An API for creating and managing cloud wallets on behalf of credential holders. It’s the backend of our Mobile SDK, which you can read more about in our recent post about building your own SSI wallets. Get started with the API by checking out the documentation.

The post Trinsic & Zapier Partner to Bring SSI to 2000+ Applications appeared first on Trinsic.


PingTalk

2020 Identity Excellence Award Winners


Each year, I look forward to reviewing the nominations for the Identity Excellence Awards. It’s a privilege to get a behind-the-curtains look at the many ways our customers and partners continue to raise the bar on identity security excellence. 

 

There’s just one downside: we can’t give everyone an award for their efforts. Trust me when I say it was no easy feat to choose the winners from among many worthy nominations. But after many hours of review and deliberation, our judging committee narrowed the field to identify the top entries across eight different categories.

 

The worthy award recipients were celebrated at our annual user conference, IDENTIFY 2020. Like many events this year, IDENTIFY was held virtually, but that didn’t stop us from rolling out the digital red carpet for these deserving companies. It’s my great honor to announce this year’s Identity Excellence Awards winners as follows. 


Otaka

Deploy a .NET Container with Azure DevOps


When I began programming (in the ’80s), computers weren’t equipped with a network card by default. The internet was almost unknown and modems were slow and noisy. The software was installed from stacks of flexible floppy disks.

Today, computing resources are virtual. The internet is vital and there is a URL for everything. We live in the *aaS (* as a Service) era, where if you want something, there are likely one or more something-as-a-Service providers you can easily find with your favorite search engine.

In the software industry, a new role has emerged and is gaining more and more importance; I am speaking of the *aaS expert, a.k.a. DevOps.

In this post, you’ll put on your DevOps suit and set up a CI/CD automation solution!

The Application

You are going to start by using this already existing ASP.NET Core MVC Web application. This post is not about the application itself though, so here I will only refer to some aspects that are needed to complete the topic. You can find more information in GitHub and in the Okta developers’ blog.

Requirements

You’ll need a machine with Windows 10 (at least Professional edition; I am using version 10.0.18363) and the following resources (all are free or offer free editions):

A local copy of the example application
A GitHub account
An Azure account (Free Tier is ok)
An Azure DevOps account
An Okta Developer account
Visual Studio 2019 (I am using version 16.6.5), with .NET Core and Docker workloads
Docker Desktop
Git

This application is an example of how to extend a basic ASP.NET Core MVC application with Okta’s implementation of Open ID Connect (OIDC).

The building blocks in this integration are as follows:

Call AddOktaMvc() in Startup.ConfigureServices().

services.AddOktaMvc(new OktaMvcOptions
{
    // Replace these values with your Okta configuration
    OktaDomain = Configuration.GetValue<string>("Okta:OktaDomain"),
    ClientId = Configuration.GetValue<string>("Okta:ClientId"),
    ClientSecret = Configuration.GetValue<string>("Okta:ClientSecret"),
    Scope = new List<string> { "openid", "profile", "email" },
});

Add a Configuration Entry Object in appsettings.json.

"Okta": { "OktaDomain": "", "ClientId": "{ClientId}", "ClientSecret": "{ClientSecret}" }

Ensure UseAuthentication() and UseAuthorization() are Called in Startup.Configure().

app.UseAuthentication();
app.UseAuthorization();

These calls are not part of the Okta package (they are .NET Core methods) and are required to tell the framework that you want your app to use the authentication and authorization features. The ASP.NET Core runtime automatically manages authentication and authorization scenarios, in this case using the Okta middleware, as per your instructions in ConfigureServices().
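For orientation, in an ASP.NET Core 3.x application these calls conventionally sit between routing and endpoint registration inside Startup.Configure(). The fragment below is a sketch of that standard ordering, not code taken from the example project:

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseRouting();

    // Authentication runs first so the user principal is established
    // before authorization policies are evaluated.
    app.UseAuthentication();
    app.UseAuthorization();

    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllerRoute(
            name: "default",
            pattern: "{controller=Home}/{action=Index}/{id?}");
    });
}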

Finally, there are multiple ways to trigger the authentication/authorization process from your application. In the example I used for this post, there are two triggers:

Explicit (User Requests Sign In/Out From the UI)

The standard top bar created from the .NET Core Template project (_Layout.cshtml) has been enriched with Sign In and Sign Out hyperlinks:

<div class="navbar-collapse collapse d-sm-inline-flex flex-sm-row-reverse"> @if (User.Identity.IsAuthenticated) { <ul class="nav navbar-nav navbar-right"> <li><p class="navbar-text">Hello, @User.Identity.Name</p></li> <li><a class="nav-link" asp-controller="Home" asp-action="Profile" id="profile-button">Profile</a></li> <li> <!-- Sign Out --><form class="form-inline" asp-controller="Account" asp-action="SignOut" method="post"> <button type="submit" class="nav-link btn btn-link text-dark" id="logout-button">Sign Out</button> </form> </li> </ul> } else { <ul class="nav navbar-nav navbar-right"> <!-- Sign In --><li><a asp-controller="Account" asp-action="SignIn" id="login-button">Sign In</a></li> </ul> } Implicit (User Navigates to Protected Resources)

When the user accesses a protected feature without being logged in, she’s automatically redirected to the authentication flow. To achieve this in our example, it is sufficient to mark the protected resource with the standard Authorize attribute (in this case, an MVC controller action):

public class HomeController : Controller
{
    ...
    [Authorize]
    public IActionResult Profile()
    {
        return View(HttpContext.User.Claims);
    }
    ...
}

Note: To trigger the implicit flow before logging in, you need to manually access the profile page by typing the ../Home/Profile path suffix in the browser address bar. This is because the only link to the page is visible only when the user is logged in.

Prep Your .NET Core application for Azure DevOps

By now, you should have your fresh copy of the application somewhere in your local hard disk. Open the folder in Visual Studio; your solution explorer should appear as in the following picture:

Although not strictly necessary, to avoid confusion I prefer to change the name. The new name is okta-aspnetcore-container-example. Changing the Solution and Project name can be a challenge, especially in big projects. References to files or folders can break, making it impossible for the IDE to load the project correctly. A good way to mitigate this risk is to use the IDE itself to do the renaming. In this case, though, there is only one rename operation that Visual Studio cannot perform automatically: renaming the directory okta-aspnetcore-mvc-example. Even if you do this in Solution Explorer, Visual Studio does not automatically change the solution file. Therefore, you need to manually edit the solution file to make it right:
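For illustration, the line to fix is the project reference inside the .sln file, which after the rename should point at the new directory. It reads roughly like this (both GUIDs shortened to placeholders here):

Project("{<project-type-guid>}") = "okta-aspnetcore-container-example", "okta-aspnetcore-container-example\okta-aspnetcore-container-example.csproj", "{<project-guid>}"
EndProject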

Now, double-click the solution file, and the solution will open as expected; without modifying the solution file manually, this operation would not succeed.

To complete the rename consistently, rename the Solution and the Project in Solution Explorer.

Change the default namespace for the project.

Change Namespace for the Existing Source Code (Using Visual Studio Rename Feature)

Replace all occurrences of okta-aspnetcore-mvc-example and okta_aspnetcore_mvc_example with the new values.

Now, you should be able to build and run the application without any error. However, the Okta auth flows have not been configured yet.

Create the Okta Application and Bind it to the Project

Okta strives to offer a first-class service with tools that make the integration process as easy as possible for your development environment. Specifically, for .NET Core, Okta offers a NuGet package that seamlessly integrates with the middleware pattern popular in this framework. With a few additions to the source files, the full Single Sign-On (SSO) experience is nicely embedded in a purely declarative syntax.

In the Okta console, log in to your developer account and create an application.

Select the .NET Platform.

Finally, provide settings as per the following image:

The TCP port (5001 here) must be the same value present in launchSettings.json.
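For reference, the relevant fragment of launchSettings.json looks roughly like this; the profile name and exact ports may differ in your copy, so treat it as a sketch:

{
  "profiles": {
    "okta-aspnetcore-container-example": {
      "commandName": "Project",
      "launchBrowser": true,
      "applicationUrl": "https://localhost:5001;http://localhost:5000",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}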

Last but not least, the Okta domain assigned to your Okta account, along with the Client ID and Client Secret generated by Okta for the new Okta application, need to be copied into the placeholders in appsettings.json.

With this, you should now be able to sign in, sign out, and see the user profile.

Note that to have Visual Studio launch the correct environment, you must select the correct profile:

Containerize Your Application With Docker

Getting familiar with all the nuances of the Docker CLI and the DOCKERFILE can take some time. Fortunately, Visual Studio offers a scaffolding feature for Docker support. It can be selected when the project is created or added to an already existing project; this is the option to select here:

Since this is a project in .NET Core, Linux or Windows can be selected as the Operating System. I chose Linux here, as normally Linux images have a smaller footprint. As a result of this operation, Visual Studio makes some changes to the project, like adding a DOCKERFILE and a template entry object to launchSettings.json. The only thing you need to do to obtain a locally testable Docker version of your application is to customize the template JSON object.
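For reference, the Docker profile that Visual Studio adds to launchSettings.json typically looks roughly like the following; exact fields vary by Visual Studio version, so this is a sketch rather than the literal generated output:

"Docker": {
  "commandName": "Docker",
  "launchBrowser": true,
  "launchUrl": "{Scheme}://{ServiceHost}:{ServicePort}",
  "publishAllPorts": true,
  "useSSL": true
}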

With this, you can launch and debug your new Docker container in Visual Studio, just by selecting the correct profile.

Deploy Your .NET container with Azure DevOps

Deploying to Azure is also easy with Visual Studio support. Right-click the project in Solution Explorer, and select “Publish…“. You’ll be presented with the Publish Wizard (for good guidance on when to right-click publish your project, check out this article). See below the steps to deploy your container as a Cloud Web Application:

Your Publish page will be populated with the new Publish profile. Note that I did not have to set any value in the wizard forms; I just accepted the default values. Of course, in production scenarios there should be a naming scheme for the resources.

Click the Publish button and, in a short time (roughly 2 minutes on my laptop), your Azure account will be fully instrumented with all the resources needed to have your application up and running:

The resource group containing the newly created resources
The Azure Container Registry
The Docker image
An app service plan
The application instance itself, up and running

Last but not least, remember that you need to set up your Okta application with the newly published endpoint. The easiest way is probably to create another application like the one you created at the beginning of this post and use the Azure endpoint (in the example, https://localhost:5001 becomes https://okta-aspnetcore-container-example20200729101138.azurewebsites.net).

OAuth Redirection Note

If you try your application from the Azure deployment, you could encounter a problem when the security workflow runs. There is a little nuance occurring in a very specific scenario: ASP.NET Core applications hosted in Linux containers seem to not use the https scheme when redirecting the browser in the middle of an OAuth flow. I did actually have this issue, but only in the Azure deployment, not while running locally on my laptop.

Luckily, there is an easy solution. (ASP.NET Core passing HTTP (not HTTPS) redirect URL in OAuth challenge when hosted in Linux container).

Simply add this code in your Startup.Configure() method:

// Note: the ForwardedHeaders enum requires 'using Microsoft.AspNetCore.HttpOverrides;'
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedProto
});

Create an Azure DevOps CI/CD Pipeline For Your Containerised Application

This could be a microservice component of a larger Service Oriented Application solution or a standalone e-commerce web portal. Either way, it would be cool to be able to automate building and deployment. This is what modern DevOps is all about!

At the end of this section, you’ll have automation that—as soon as you push changes to your source control repo—will rebuild and deploy your application! In a few minutes, the changes will be fully deployed and available to your customers.

Note that I didn’t mention testing as part of the pipeline. Good practice in Test Driven Development is to include automated tests in the solution, execute them after a successful build, and establish rules that block deployment if the test results are unsatisfactory. I am not developing any tests in this post, since the scope is to show how to deploy using Azure DevOps.

Visual Studio has a built-in wizard to get you started with a templated pipeline.

The wizard form is fully pre-populated with suitable values. The only missing value is the GitHub Personal Access Token. The wizard flags this omission and provides a handy link to jump directly to your GitHub repo, where you can generate a new PAT and copy-paste it into the form.

Note that, by default, the wizard would create a new App Service. In this case, you have previously published the application and created an App Service for it, so go ahead and select that App Service from the dropdown. To start the process, confirm by clicking OK.

After a few minutes, you can open your Azure DevOps account and explore the freshly scaffolded CI/CD Pipeline.

CI Pipeline (Build)

CD Pipeline (Deploy)

Now, as soon as you push changes to your GitHub repo, Azure DevOps will detect them, build, and then deploy the updated application automatically.

Note: You will likely need to tweak some parameters to align the pipeline to your particular situation.

Recap

In this post you’ve seen the basics of how to apply two modernization patterns to a .NET Core application:

Containerisation
DevOps CI/CD

Okta authentication/authorization support works just fine. The only thing to remember is that the base URL of the deployed application is different from the URL used when testing locally, and consequently the Okta application configuration must support both scenarios.

What you learned:

How to add Docker support to your ASP.NET Core application
Manually deploy your containerized application to Azure App Service
Create an Azure DevOps CI/CD pipeline to automate build and deploy workflows

Learn More About .NET and OAuth

If you are interested in learning more about security and .NET check out these other great articles:

Deploy Your ASP.NET Core Application to Azure
Okta ASP.NET Core MVC Quickstart
Deploy a .NET Container with AWS Fargate
The Most Exciting Promise of .NET 5
Goodbye Javascript! Build an Authenticated Web App in C# with Blazor + ASP.NET Core 3.0
Create a CI/CD pipeline for .NET with the Azure DevOps Project

Make sure to follow us on Twitter, subscribe to our YouTube Channel and check out our Twitch channel so that you never miss any awesome content!

Tuesday, 06. October 2020

KuppingerCole

Multicloud and Digitalization: How to Keep Usage Under Control


The role of the cloud in digitalization can hardly be overstated. Yet as cloud usage has grown, organizations have fallen behind on access control for cloud platforms such as AWS, Microsoft Azure, and Google Cloud Platform. While these platforms give enterprises agility and accelerate innovation through new services such as speech processing or configurable dialog systems, the controls required to properly govern access to all of these cloud resources are often missing.




Martin Kuppinger & Robert Byrne on Privileged Access Management Buzzwords




Martin Kuppinger & Robert Byrne on Identity and Access Management




MATTR

DID Extensibility on the MATTR Platform


At MATTR we’ve been busy building the next generation of solutions for verifiable data and digital trust. Earlier this month we introduced our platform and added experimental support for a new, privacy-preserving method for selective data disclosure. Today, we’ve reached another milestone that gives our users even more choice and transparency by the addition of a new way to use Decentralized Identifiers (DIDs).

Modularity and extensibility are key design principles underpinning our core platform architecture. The MATTR Platform is designed to support a wide range of distinct pluggable components, providing our customers with confidence that their technology choices will continue to evolve with new innovations as they emerge.

When it comes to DIDs, there are currently more than 50 DID methods registered with the W3C. Each DID Method defines a CRUD model that describes how a particular DID scheme works with a specific verifiable data registry such as a distributed ledger or blockchain. The initial group of DID methods was quite small, and has expanded significantly over time as more solutions emerge in this space. While all of these new DID methods theoretically adhere to the DID core specification, each method makes a different set of choices that affect the underlying trust model at play. For instance, DID methods have distinct rules about who gets to add new transactions, what input data is required, where DIDs are anchored, who can view or monitor the DIDs, and more.

In short, there are many factors that affect the choice around which DID method to use, and it’s not a trivial decision.

We believe that DIDs, when deployed responsibly, can be extremely effective at preserving user privacy, enhancing transparency and consent, enabling data portability, and enforcing user control. To learn more about our approach, read our blog, “Intro to DIDs for people”.

In addition to our current support for DID Key (static key-based identifier) and DID Sovrin (ledger-based identifier), we are now proud to add DID Web (domain-based identifier) to our list of supported DID methods. 

DID Web helps to bridge the gap between the way that trust is established on the internet today, namely using domains, and new and emerging ecosystems using DIDs. When using DID Web, rather than anchoring a DID to a decentralized ledger such as a blockchain, the DID is instead associated with a specific domain name, and subsequently anchored to the web host registered with that domain via DNS resolution. Effectively, this allows a DID using this scheme to be resolved as simply as one resolves a web URL, every time they click on a link. For example, we’ve set up a DID Web using our own domain, which can be resolved at did:web:mattr.global.
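Because a bare-domain did:web maps to a well-known HTTPS URL, resolution amounts to a single HTTP GET. Here is a minimal C# sketch of that mapping, using the real DID mentioned above; error handling is omitted, and path-based did:web identifiers (which encode extra colon-separated segments) are not covered:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class DidWebResolverSketch
{
    static async Task Main()
    {
        const string did = "did:web:mattr.global";

        // Per the did:web method, a bare domain's DID document lives at
        // https://<domain>/.well-known/did.json
        string domain = did.Substring("did:web:".Length);
        string url = $"https://{domain}/.well-known/did.json";

        using var http = new HttpClient();
        string didDocument = await http.GetStringAsync(url);
        Console.WriteLine(didDocument);
    }
}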

Users in the emerging world of DIDs can use this mechanism to bootstrap trust by using the reputation associated with public domains. While this solution may not work in every circumstance and lacks some of the resilience and censorship-resistance guarantees afforded by DID methods with fewer centralized dependencies, DID Web provides a practical and useful pathway to adoption, particularly for entities whose data and identity are already public. When used in parallel with more natively decentralized mechanisms, we can help to ensure that the web remains free and open while providing a path for legacy systems to remain interoperable with the emerging distributed web of trust.

By adding support for new DID methods such as DID Web, we are creating optionality and choice for our users. Our products will always be ledger-agnostic. We will also continue to offer support for DIDs which are not anchored to any ledger. We aim to bridge the gap between approaches that are built on top of ledgers and those using domains, key registries, and other methods to establish trust.

We are also actively investigating how to add support for more scalable solutions that use ledgers on an ad-hoc basis, such as DID methods based on the layer two Sidetree Protocol. This open-source protocol provides an abstract layer that sits on top of DLT infrastructure.

The Platform Drivers part of our architecture provides DID Method support in the form of pluggable integrations that prevent vendor lock-in and enable user choice. To find out more about how the MATTR Platform supports a broad spectrum of DID methods, check out our documentation on MATTR Learn and sign up to get started.

The article DID Extensibility on the MATTR Platform appeared first on MATTR.


Gluu | Blog

Interception Scripts


Customize many aspects of your Gluu Server identity and access management service.

Interception scripts can be used to implement custom business logic for authentication, authorization and more in a way that is upgrade-proof and doesn’t require forking the Gluu Server code. Each type of script is described by a Java interface — i.e. which methods are required.

In 4.2, we’ve introduced new interception scripts for Post-Authentication Authorization (more details), UMA2 RPT claims (more details), and application session management (more details).

Post-Authentication Authorization

For example: a sensitive web application may wish to force users to re-authenticate, even if they present a valid session cookie, to reduce the risk that a valid user session at an unattended computer is used by another person to access data inappropriately.

After the browser has a session, if a person visits a website, the RP can obtain a code without the user having to authenticate or authorize. In some cases, it is desirable to insert custom business logic before granting the code or tokens from the authorization endpoint. The Post Authn script makes it possible to force re-authentication or re-authorization (even if the client is “pre-authorized” or client authorization persistence is on).

UMA2 RPT claims

The UMA2 standard is designed to separate the requesting party from the resource owner (OAuth2 considers them to be one and the same person). This differentiation allows us to address more use cases than OAuth2.

RPT claims is a special script for UMA 2. It allows an admin to code logic for gathering additional claims (required by UMA RPT Authorization Policy).
This script can be used in an oxAuth application only.

Application Session Management
Session management is used to facilitate secure interactions between a user and some service or application and applies to a sequence of requests and responses associated with that particular user. When a user has an ongoing session with a web application, they are submitting requests within their session and are providing potentially sensitive information. The application may retain this information and/or track the status of the user during the session across multiple requests. More importantly, it is critical that the application has a means of protecting private data belonging to each unique user, especially within authenticated sessions.
This script allows an admin to get notifications about various session lifetime events. Multiple scripts of this type can be added; the application should call all of them according to their level.


Client Initiated Backchannel Authentication


The Gluu Server now supports CIBA, improving the end-user experience during authentication and authorization.

OpenID Connect Client Initiated Backchannel Authentication Flow is an authentication flow like OpenID Connect. However, unlike OpenID Connect, there is a direct Relying Party to OpenID Provider communication without redirects through the user’s browser. CIBA enables a Client to initiate the authentication of an end-user by means of out-of-band mechanisms.

CIBA allows a client application, known as a consumption device, to obtain authentication and consent from a user without requiring the user to interact with the client directly. Instead, the client application can initiate a backchannel request to the user’s authentication device, such as a smartphone with an authenticator app installed, to authenticate the user and consent to the operation.

This specification does not change the semantics of the OpenID Connect Authentication flow. It introduces a new endpoint to which the authentication request is posted. It introduces a new asynchronous method for authentication result notification or delivery. It does not introduce new scope values nor does it change the semantics of standard OpenID Connect parameters.
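To make the flow concrete, below is a rough C# sketch of the two backchannel calls a client makes in CIBA “poll” mode, following the OpenID CIBA specification. The endpoint URLs, client credentials, and hint values are illustrative placeholders, not Gluu-specific details:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class CibaPollModeSketch
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // Step 1: initiate authentication at the backchannel authentication
        // endpoint (placeholder URL; the real one is published in the OP metadata).
        var initiate = await http.PostAsync("https://op.example.com/bc-authorize",
            new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["client_id"] = "my-client",                // placeholder
                ["client_secret"] = "my-secret",            // placeholder
                ["scope"] = "openid profile",
                ["login_hint"] = "alice@example.com",       // identifies the end-user
                ["binding_message"] = "Approve login #42",  // shown on the auth device
            }));
        // The JSON response contains an auth_req_id for this request.
        Console.WriteLine(await initiate.Content.ReadAsStringAsync());

        // Step 2: once the user approves on their authentication device,
        // exchange the auth_req_id for tokens using the CIBA grant type.
        var token = await http.PostAsync("https://op.example.com/token",
            new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "urn:openid:params:grant-type:ciba",
                ["auth_req_id"] = "<auth_req_id from step 1>",
                ["client_id"] = "my-client",
                ["client_secret"] = "my-secret",
            }));
        Console.WriteLine(await token.Content.ReadAsStringAsync());
    }
}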


“Financial Grade” OpenID Connect


 

Gluu Server 4.2 was certified to conform with the Financial Grade OpenID Provider profile. Called “FAPI” for short, this profile provides detailed requirements for the security features needed to perform payments.

Organizations can use OpenID Connect for both high and low assurance use cases. If you don’t need a ton of security, you don’t need to use all the fancy features OpenID Connect provides. But if you need more security, there are several useful risk mitigations. The FAPI profile uses signing and encryption to protect both the OpenID Connect request and response, adding additional assurance and transport security.

But FAPI isn’t only for banks! If you want a lot of security, and a high level of assurance that the person authenticated is not a hacker, you may want to use the FAPI profile too.
Digital enterprises need to improve the security of their operations and protect customer data. It is common practice for aggregation services to capture data via screen scraping, which requires storing users’ passwords. This insecure practice creates security vulnerabilities, forcing financial institutions to tolerate what amounts to an automated attack against their applications and to maintain a whitelist of aggregators. A new draft standard proposed by this working group would instead use an API model with structured data and a token model, such as OAuth.

FAPI is a working group of the OpenID Foundation, the body responsible for the development and maintenance of a family of protocol standards centered around OpenID Connect. FAPI was initiated in 2017 and sought to bring enhanced security to the new API standards being created to deliver PSD2 regulations across Europe, one of the key drivers of open banking.

The Financial-grade API aims to provide specific implementation guidelines for online financial services to adopt by developing a REST/JSON data model protected by a highly secured OAuth profile. The Financial-grade API security profile can be applied to online services in any market area that requires a higher level of security than provided by standard OAuth or OpenID Connect.

This solution will help enterprises make secure open banking application program interfaces (APIs) available to third parties, which can then use the APIs to seamlessly draw on customer data. Such easy data flow can help expand bank offerings and speed access to information that helps verify applicants’ identities, for a higher level of security.


Global ID

The GiD Report#129 — What we learned about digital identity from OnlyFans

The GiD Report#129 — What we learned about digital identity from OnlyFans

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

What we have for you this week:

Every company is a messaging company
What we learned about digital identity from OnlyFans
Facebook is worried about messaging (and being broken up)
Trump administration to sue Google next week
The EU and China are getting in on it
This week in the app platform wars
Stuff happens

1. The pandemic lockdown and the resulting era of remote work (and play) have brought to the forefront the platforms we use to interact and come together. There was that a16z saying that every company is a fintech company. Today, it feels like every company is a messaging and groups company.

Zoom is expanding into messaging and groups. Venmo experimented with groups but is apparently shelving the initiative (via /VS). Discord is expanding groups and community functionality with community servers.

But none of those initiatives move the needle in terms of addressing lessons learned from this last cycle of the internet because none of these platforms address the issue of digital identity in a material way.

Really, it’s more of the same.

We need an interoperable, portable identity in order to best enable all people economically and socially around the world. We need a persistent identity in order to restructure incentives for better outcomes and prevent fraud and abuse. And we need an open platform so that developers can experiment and create with those messaging and groups (and payments) building blocks so that real innovation can emerge.

Fundamentally addressing that remains a huge opportunity.

2. Incidentally, one platform that has addressed the issue of identity more than any other is OnlyFans. (For those out of the loop, OnlyFans has been trending this year as a premium membership platform for influencers. As you might guess, there’s a lot of “adult”-ish type content there.)

In accordance with Rule 34 of the internet, they’ve also managed digital identity better than most:

First, OnlyFans offers an example of how the desire to get paid for content online smooths the way to validating user identities. There are three major reasons other social services don’t validate the people who participate in their networks. First, the friction of going through the validation process for new accounts prevents people from signing up. Second, it is expensive and time-consuming for services to validate identities. Third, requiring proof of real-world identity is quite exclusionary, as many people can’t easily make that proof. The desire to get paid for content provides a level of motivation that overcomes at least the first two of these hurdles.
At a bigger systems level, the lesson might be that the path toward trusted validation of identities will come through payment systems. I very much hope that we never face a world where someone has to prove their identity in order to share content on the internet; however, the financial system is far more structurally locked down. It seems likely that payments will become the wedge that forces people to validate online identities. That could lead to the creation of an internet environment where it is easier to validate the real-world identity (or at least the valid citizenship) of the people you are interacting with online.

So yeah, it only took OnlyFans to validate GlobaliD’s vision that payments, messaging, and digital identity go hand in hand :)

Relevant:

Lessons From the Rise of OnlyFans
Private Social App Clubhouse Courts Fresh Controversy
The Architecture of Identity System
How self-sovereign identity principles suit the modern world

3. Much has been said about last week’s presidential debate. What’s clear, though, is that the country has never been more divided. What they’re not divided on, however, is the idea that we need to regulate Big Tech.

Sure, each side has its own motivations, but the end result is the same. Big Tech is the new Too Big To Fail. They’ve gotten too powerful, stifling competition and innovation. And they’ve gotten too influential, undermining social cohesion. Not even Apple is safe.

Anyway, Facebook is worried about messaging:

The document was produced by staff at Facebook based on work commissioned from lawyers at Sidley Austin LLP. While light on legal citations and technical language, it offers a window into how Facebook may defend itself if it is sued on antitrust grounds and reflects its lawyers’ sense that any attempts to force a divestiture of WhatsApp or Instagram would be fought in both the public square and the courtroom.
Facebook’s acquisitions of Instagram in 2012 and WhatsApp in 2014 were examined by the Federal Trade Commission, which closed its reviews without issuing an objection. The company made big investments to boost growth on those platforms and they now share numerous operations that are integrated. In the paper, Facebook says unwinding the deals would be nearly impossible to achieve, forcing the company to spend billions of dollars maintaining separate systems, weakening security and harming users’ experience.
“A ‘breakup’ of Facebook is thus a complete nonstarter,” the paper declares.
Tim Wu, Photo: New America

Tim Wu, who I interviewed many years ago (incidentally, also Big Tech monopoly related — yeah, we’ve been on this beat for a while), wasn’t impressed:

Facebook’s contention that past government inaction on the acquisitions should limit current action is “surprisingly weak,” said Tim Wu, a Columbia University law professor, tech critic and author who has said Facebook should be broken up. A government antitrust case against the company would likely rely on the argument that Facebook made serial acquisitions to reduce competition, a question that wasn’t considered when the Federal Trade Commission originally chose not to oppose the Instagram and WhatsApp deals, he said.
“There’s no way a decision on one merger would be preclusive,” he said, noting that the FTC’s reviews of both acquisitions had reserved the right to revisit the deals at a later time.
Facebook’s claim regarding the difficulty of a potential breakup would also be unlikely to carry legal weight. “There is no ‘it’s too hard’ defense,” Mr. Wu said.

The FTC is expected to file a formal complaint by year’s end. Meanwhile, Facebook is racing to integrate its messaging platforms to make it even more difficult to break up.

Say “hi” to Messenger: Introducing New Messaging Features for Instagram — About Facebook
Facebook Says Government Breakup of Instagram, WhatsApp Would Be ‘Complete Nonstarter’

4. Things are happening more quickly for Google.

Maybe as soon as next week:

The Justice Department and state attorneys general are expected to sue Google as soon as next week for alleged antitrust abuses, people familiar with discussions said Friday, marking a dramatic escalation of Washington’s fight to rein in Silicon Valley’s giants.
The Justice Department — which has been probing Google for 16 months — circulated text of a proposed complaint this week, according to three people with knowledge of the discussion. The complaint could come late next week or just after the Columbus Day holiday, two of the people said. All spoke anonymously to discuss an ongoing investigation.

This suit will mainly focus on search. But the government’s commitment to the cause portends more to come.

Trump administration to launch antitrust suit against Google as soon as next week

5. And I know that we usually speak about things here from a U.S. perspective, but this is apparently a sentiment felt worldwide.

The EU, of course, has long been at the forefront of regulating Big Tech.

U.S. tech giants face curbs on data sharing, digital marketplaces, under draft EU rules

But China wants in on it, too.

Exclusive: China preparing an antitrust investigation into Google — sources 6. This week in the app platform wars: Judge suggests Apple vs Epic should go to jury, trial expected in July 2021 | Appleinsider Google Demands 30% Cut From App Developers in Its Play Store Apple will temporarily stop taking a 30 percent cut on Facebook event fees Google defers 30% in-app commission in India to April 2022 after protests 7. Stuff happens Crypto Co-ops and Game Theory: Why the Internet Must Learn to Collaborate to Survive — CoinDesk Permission Raises $50M to Make Ads Personal and Data Private — CoinDesk California Governor Signs Law Bringing State ‘New Tools’ to Regulate Crypto — CoinDesk Real demand for open banking as user numbers grow to more than two million — Open Banking Federal Reserve Bank reveals details of digital dollar research While Momentum Builds for Real-Time Payments, Research Tags the U.S. Market As a Laggard — Digital Transactions The Web Wasn’t Built For Privacy — But It Could Be

The GiD Report#129 — What we learned about digital identity from OnlyFans was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

ForgeRock Updates GSA Schedule


With the Covid-19 pandemic causing a dramatic shift in how public sector organizations and agencies do their work and provide citizen services, digital transformation has become a priority. Easy, yet secure, remote access for both citizens and employees is no longer a goal for tomorrow, it is a must for today. 

Unfortunately, traditional IT environments struggle to accommodate increased access demands. For example, legacy identity and access management (IAM) and identity governance and administration (IGA) weren’t designed to provide real-time, continuous enterprise-wide user access visibility, control, and remediation, or to collect and analyze identity data to identify security access and risk blind spots. These shortcomings result in error-prone and time-consuming manual work, poor user experiences, and increased risk — making it difficult for public sector organizations to successfully implement their digital transformation initiatives. 

What’s needed is a comprehensive IAM and IGA platform capable of not only modernizing and filling the gaps of legacy identity systems, but also unlocking their value with artificial intelligence (AI) and machine learning (ML). 

At ForgeRock we’re ready to help. I’m excited to announce that the ForgeRock Identity Governance and ForgeRock Autonomous Identity solutions are now available on the GSA Schedule, which means ForgeRock’s complete IAM and IGA platform can be purchased on the GSA Schedule with Carahsoft. ForgeRock has a long-standing partnership with Carahsoft within the public sector market across US Federal Agencies and State Governments. In fact, Rich Savage, Sales Director at Carahsoft, noted “We pride ourselves on helping government agencies find the best technology solutions available. ForgeRock’s AI-powered platform is exactly what IT teams need in the public sector for solving complex digital identity challenges.” 

Both Identity Governance and Autonomous Identity are fully deployable in a DevSecOps environment. You can view the new SKUs on our Carahsoft microsite under ‘Products.’ 

ForgeRock Identity Governance and Administration

Identity Governance and Administration (IGA) is the ability to manage and reduce the risk that comes with excessive or unnecessary user access to applications, systems, and data. Users want to have easy and rapid access to all of the applications they need to do their jobs. As a security-conscious organization, you need to balance requests for immediate application access with security, while reducing the risk associated with this process. 

The problem is, many public sector organizations use manual processes or scripts to grant immediate access to users. However, this leads to a failure to implement proper monitoring and governance controls on access in order to determine whether users should continue to have access. When auditors ask for proof of proper detective and preventive controls, organizations often resort to even more manual processes that involve spreadsheets and emails. Imagine the worst-case scenario, when a security team is triaging and they have to rely on searching through emails and spreadsheets in order to understand the chain of events. Fortunately, there’s a better way.

ForgeRock Identity Governance and Administration is an integral part of the ForgeRock Identity Platform. It simplifies the manual access request, access approval, certification, and role mining processes while providing full identity lifecycle management for creating, managing, and restricting identity access to accounts, systems, applications, and infrastructure. With ForgeRock IGA, you can strengthen your security posture and automatically drive regulatory compliance.

ForgeRock Autonomous Identity 

Legacy IGA solutions operate in ‘identity silos’ based on static data, including assignments, roles, and entitlements. Combined with the increasing volume and type of identities within the public sector, this can leave your already overburdened risk and security teams struggling to keep up as they manually provision access privileges and rubber stamp access requests and certifications.

ForgeRock Autonomous Identity is an AI-driven identity analytics solution that can be layered on top of, and integrated with, your existing IGA solutions to provide real-time and continuous enterprise-wide user access visibility, control, and remediation. By leveraging machine learning techniques, Autonomous Identity collects and analyzes identity data, such as accounts, roles, user activity, and entitlements, to identify security access and risk blind spots. As a result, public sector organizations gain wider and deeper insight into the risks associated with user access, as well as remediation recommendations.

As these product descriptions exemplify, there’s a better way to do Identity Governance and Administration: one that improves your overall access and security landscape while reducing manual processes and extending the value of your current investments. These solutions, along with ForgeRock’s comprehensive identity platform capabilities, help you achieve the digital transformation required for today’s remote access demands.

For example, the State of Utah wanted to gain greater reliability and scalability in its identity and access management (IAM) infrastructure to integrate more data and applications, and expand the number of online services available to employees, citizens, and businesses. Using the ForgeRock Identity Platform, the state integrated more than 900 applications and online services, providing the flexibility and scalability to support all 1,400 of the state’s online services and a growing variety of additional applications and services, including those running in the cloud. This and more resulted in projected savings of up to $15 million due to operational efficiencies. Read the full State of Utah customer story.

You can learn more about the ForgeRock Identity Platform, as well as our newly added Identity Governance and Administration and Autonomous Identity solutions on our Carahsoft microsite under ‘Products’. And, as always, please reach out to us directly with any questions. We and Carahsoft are here to serve you.

 


KuppingerCole

Commvault Complete™ Data Protection


by Mike Small

Business continuity planning is essential to the digital transformation process. This requires the use of data backup products and disaster recovery services which must support today’s multi-cloud hybrid IT environment. This report describes how Commvault Complete™ Data Protection meets these challenges.


UNITY: IGA Modernization Framework by Persistent


by Richard Hill

Inevitably, every organization with digital security and governance requirements will go through iterations of IAM and IGA system modernization efforts. Persistent Systems' Unity provides the necessary migration framework to facilitate an IGA modernization transition.


R&S®Trusted Gate - Secure Glocalization by Rohde & Schwarz Cybersecurity


by Matthias Reinwarth

Rohde & Schwarz Cybersecurity enables reliable processing of regulated and sensitive information for collaboration and file exchange on shared SharePoint platforms, for organizations across different countries and regions, while ensuring compliance with differing laws and regulations. Rohde & Schwarz Cybersecurity enables a globally distributed, efficient, and secure infrastructure with central, consolidated administration while preserving compliance and data protection.


Oxyliom Solutions GAÏA Advanced Identity Management

by Martin Kuppinger The GAÏA Advanced Identity Management component of the GAÏA Trust Platform by Oxyliom Solutions integrates the key elements of identity management required for regulatory compliance and a modern digital experience, especially in highly regulated industries such as the financial sector.

by Martin Kuppinger

The GAÏA Advanced Identity Management component of the GAÏA Trust Platform by Oxyliom Solutions integrates the key elements of identity management required for regulatory compliance and a modern digital experience, especially in highly regulated industries such as the financial sector.


Nov 23, 2020: Reinventing Access Management With Artificial Intelligence

In the modern IT world, companies have to manage a multitude of employee identities, user roles, access points, and end devices. For most firms, managing these digital identities and the corresponding access rights is not only complicated but also time-consuming. If unauthorized persons wrongly gain access to sensitive company and customer data, this can lead to compliance problems, fines, and reputational damage.

PingTalk

Meet Your New Chief Identity Champion!

What’s a Chief Identity Champion?

Our Chief Identity Champion (CIC) is someone who embodies all the characteristics, personality and values of Ping Identity. He occupies an executive-level role to help us elevate the idea of identity in people’s minds and highlight how we champion identity for our customers. 

 

Monday, 05. October 2020

Caribou Digital

The race to digitize commerce in sub-Saharan Africa

Photo by Shikoh Gitau / Qhala

How Jumia and Facebook are competing, even though they’re playing different games

Tiffany* sells makeup, and is evaluating a new display shelf where her product would be at eye level with passing shoppers. That visibility could lead to new customers, but she’s not convinced it’s worth the trouble, because almost all of her customers currently find her on Instagram. And actually, that shelf isn’t even in her store — it’s in the front of a retail locker space where she rents a container in the back to hold her inventory. Tiffany keeps some of her most popular products on hand here so that customers can easily drop by and see the merchandise, but otherwise she runs a completely virtual, just-in-time business. And paying for a physical footprint, even if it’s only a few square feet of shelf space, might not give her the return she needs on scarce capital.

Tiffany is among millions of entrepreneurs in sub-Saharan Africa taking advantage of popular digital tools to transform their business. Formal marketplaces such as Jumia or Takealot have successfully established a traditional model of e-commerce, while new platforms from Paystack, Flutterwave, and others provide the underlying tools for independent sellers. Despite these formal offerings, the vast majority of digitally connected small businesses are using Facebook, WhatsApp, and other consumer platforms to fulfill an increasing number of commercial use cases. This “social commerce” is often heralded as a leveler of the playing field, as even the smallest businesses can typically use social platforms for a low cost.

To help make sense of this mix of formal and informal, of purpose-built and appropriated, I developed a framework for comparing the range of digital solutions available to today’s entrepreneurs. For those with a commercial interest in this sector, the framework offers a lens for understanding how different solutions solve distinct parts of the value chain, and what this means for the likely addressable market of each solution. For policymakers and those interested in inclusive economic growth, the analysis highlights the implications of fundamentally different operational structures between formal ecommerce and social platforms, and why we should worry about how previously informal economic activity becomes formalized.

From analog to digital

Small businesses in sub-Saharan Africa looking to sell online have multiple options to consider, which I’ve simplified here into four categories.[1] These are by no means mutually exclusive, but for clarity I have separated them in the analysis.

Social commerce
The most accessible and likely approach for most small businesses is to use those consumer products that are already familiar and ubiquitous, especially Facebook, WhatsApp, YouTube, and Instagram. Entrepreneurs can create a page on Facebook and/or Instagram, post photos or videos of products, communicate with prospective customers, respond to customer reviews, and use built-in advertising tools to boost their page or specific product posts.

E-commerce marketplaces
Businesses may also sell on formal marketplaces such as Jumia, Kilimall, or Konga. These online marketplaces aggregate supply and demand into a single platform, providing sellers with immediate access to a large audience — for a price. Like Amazon, they offer sales support with payment processing, fulfillment, and delivery, as well as back office business tools such as inventory management and analytics; typically they charge a commission on sales.

Standalone domains
More digitally savvy businesses might go a step further and publish their own standalone website using one of a growing number of e-commerce tools, such as Paystack Commerce or Flutterwave Store, two new solutions from Nigerian fintechs that combine payment processing and back office functions with a website builder into a single platform (essentially, Shopify for Africa).

B2B sourcing
Another option for FMCG (fast-moving consumer goods) retailers is B2B platforms that connect businesses with suppliers and financing, such as Trade Depot in Nigeria or Sokowatch in East Africa, both of which aggregate demand from retailers to provide discounted wholesale pricing and deliver consumer products to the retailer.

The retail value chain

To understand how these digital solutions are reshaping retail in sub-Saharan Africa, it’s helpful to examine the business functions that each solves for. Retail businesses like Tiffany’s have a relatively straightforward value chain: retailers buy goods from upstream suppliers (inbound logistics), manage inventory and resources (operations), engage with customers (marketing), and, when they’re lucky, make sales and deliver product (sales/outbound logistics). Looking at each category of the value chain in turn, we can summarize the benefits different digital solutions provide to offline businesses looking to digitize.

Figure 1. Where different digitization solutions impact the retail value chain

Inbound logistics
The B2B marketplaces or supply chain solutions that service this part of the value chain are primarily trying to secure better pricing for retailers by aggregating demand. For example, Sokowatch in East Africa promises not only lower pricing but also financing and complete last-mile delivery to its retailers. These solutions are currently limited to FMCG, where high sales volume and commodity products make it easier to aggregate orders.

Operations
All of the e-commerce marketplaces provide warehousing services to eliminate the need for physical storage facilities. This is especially convenient if the retailer also pays the platform to handle fulfillment and delivery, but these services eat into margins, and many smaller businesses prefer to handle warehousing and fulfillment themselves. Businesses that sell on a marketplace are often also selling directly to customers via physical shops or social media, so the platform’s inventory tracking systems will only cover a portion of their business, reducing its utility.

Marketing
This is where digital options far outperform anything offline. A small retailer with no online presence can only market to those prospects who pass in physical proximity of the shop. Although advertising costs are low, the maximum size of the funnel is limited. But once the business starts marketing online, they can easily increase the reach of their advertising by many orders of magnitude. This is especially true of social media, where the smallest entrepreneur can put up a Facebook page and start bootstrapping eyeballs via their personal networks before transitioning to paid ad spend to “boost” their page or posts.

Marketplace platforms also offer substantial reach given the installed base of customers who use the site, and sellers pay to promote their products higher in the search or browsing results. One could argue that advertising dollars spent on an e-commerce platform should outperform those on social media, as customers on the former are more likely to be closer to a purchase decision, i.e. deeper in the sales funnel. But marketplace platforms can be brutally competitive for many product categories, and the structure of the market doesn’t allow for much retailer branding or other differentiation. You live and die by the transaction, with little opportunity to build a brand and a following — which is exactly what social media excel at.

Because standalone domains exist on the open web, they rely on driving traffic from other sources — search and social — to the domain, and therefore don’t benefit from any installed base of buyers.

Sales/ Outbound logistics
One of the bigger service offerings from both the marketplace platforms and the independent domain platforms, like Paystack Commerce, is the ability to process different kinds of payments. For some product categories and businesses this is important, but the majority of e-commerce payments in sub-Saharan Africa are still COD (cash-on-delivery).[2] And in markets where mobile money is popular, buyers can make non-cash payments (pre- or post-delivery) directly into mobile money wallets. So the payment processing function is most valuable only for customers who are willing to pay upfront and want to pay with a card or bank transfer.

Last-mile delivery to the customer is infamously challenging in most markets, and one of the key services that marketplace platforms offer to their sellers. But typically the delivery price is passed on to the buyer, not built into the product price. Many businesses work with local couriers or contract with new logistics platforms like Sendy in East Africa; many retailers let the customer choose a preferred logistics provider. Because most small retailers have less expensive options for delivery, paying the marketplace to handle delivery only makes sense if the retailer has sufficient volume such that the overhead of managing independent deliveries outweighs the cost savings.

Follow me for top-line growth

One helpful way to think about how each solution solves a different business function is to categorize it as impacting either (a) top-line growth or (b) bottom-line profitability. The latter category includes solutions that streamline procurement; process payments; digitize inventory management; or provide warehousing, fulfillment, and delivery services. Efficiencies in any of these can lower a retailer’s operational costs, but these will be incremental gains, especially for the smallest retailers.

On the other hand, digital advertising channels enable entrepreneurs to grow their business by reaching many more customers than they would otherwise. Social media platforms in particular allow businesses to substantially and cost-effectively increase their potential customer base, thereby scaling top-line growth and revenue.

In the language of unit economics, this is about very large reductions in CAC (customer acquisition costs) vs. incremental improvements to COGS (cost of goods sold). Obviously both are crucial metrics to a business, but they have different implications at different stages of business maturity. For any business, reducing COGS, and thereby improving margins by say 10%, can easily be the difference between break-even and profitability. But for the smallest businesses, where sales numbers are low, improving margins by 10% may not actually translate into anything meaningful in terms of absolute value.[3] On the other hand, if a small business can double or triple sales at a lower CAC, that revenue opens up all kinds of opportunities for staffing up or expanding the business in different ways.

Using the framing of the value chain analysis above, here’s a simple 2x2 illustrating which types of businesses are most likely to benefit from each type of digital solution. In addition to the maturity or size dimension, which as described above is a primary factor in determining the relative value of increasing reach and top-line sales vs. operational efficiencies, it plots average sales price on the other axis to serve as a proxy for the types of products and income level of the customers buying them.

Figure 2. The sweet spot for each type of solution

Social commerce occupies the lower left because it’s essentially free to use (minus paid advertising) and provides the best opportunity for growing the customer base with a lower CAC, the two most important factors for the smallest businesses. Social commerce likely has a ceiling in terms of average sale price, partly because of trust (i.e., consumers tend to place more trust in retailers on formal marketplace platforms, which may have buyer protection programs), but also as a function of how prospective buyers are most likely to search for and purchase products: Higher-income consumers are more likely to use a laptop and web-based search engines to find products, making them more likely to shop from marketplaces and standalone website domains. Lower-income consumers are more likely to only have a mobile device and spend more time within the confines of social media feeds, where social commerce reigns.

Ecommerce platforms like Jumia and Konga are used as sales channels by all kinds of businesses, but are most helpful to those that have grown to have multiple employees and higher sales volumes, where the operational and logistics services the platforms offer will have a greater impact on efficiency and overall profitability. Because of the additional formality and high transaction fees — Jumia sellers pay a sales commission of 10%-20% depending on the product category — vendors are less likely to focus on really low-value items, keeping the average sales prices higher than what you see with social media businesses.

Standalone domains supported by e-commerce platforms like Paystack Commerce are most appropriate for the largest businesses with the highest average sales price. Such firms already have established customer bases and are looking to control more of their brand without giving up margin to a marketplace. Because they exist on the open web, they have to drive traffic from search and social, which typically means a higher CAC and thus requires higher average sales prices.

Finally, B2B sourcing platforms have a hard ceiling on average sale price given that (at least in current incarnations) they are limited to FMCG and other high-volume, commodity products. The economics of bulk ordering and delivery costs mean that smaller shops with lower turnover are less likely to enroll given the expected cost benefits.

The future will be formalized

A natural question to ask is to what extent online commercial activity will migrate away from the social commerce model and toward the more formal paradigms of e-commerce marketplaces and standalone domains that are dominant in the West. I would argue that while the business logic that has the vast majority of African businesses choosing informal activity on social platforms won’t change anytime soon, those social platforms will increasingly push that economic activity into more structured and legible forms.

For the most part, social commerce businesses are appropriating tools designed for personal communication to use for business purposes; as such, the platforms don’t have good ways of knowing about these transactions being coordinated through their ecosystem. WhatsApp Pay and the new Facebook Shops are obvious steps toward improving the platforms’ visibility into commercial activity, and much more will come. The goal isn’t necessarily to compete head-to-head with e-commerce platforms with a fully integrated offering, but to capture enough of the activity to develop a completely new purchase behavior dataset that can be monetized via their ad engines.[4] Imagine advertisers being able to target their ads on Facebook to those users who have “purchased a health and wellness product for $10-$20 within the last 30 days from a seller with 5,000–25,000 followers.”

The result will be an asymmetric, top-down vs. bottom-up competition between the formal marketplaces and social media: Formal marketplaces will try to broaden their value proposition to reach lower to the smallest-scale businesses and spin the network effects flywheel, while social media will attempt to formalize its offerings, not to compete directly as a solution provider, but to develop one of the most comprehensive and valuable datasets on consumer economic activity.

The competition is asymmetric in that the two platform types have fundamentally different business models. Social media platforms, and for all intents and purposes this means Facebook, are ad-based; they make activity free in order to accumulate eyeball-hours. The marketplaces rely on the transactional revenue of the commissions they charge sellers on each sale. Put another way, the marketplaces have to make the unit economics work for each transaction on the platform, which Jumia has estimated to require something like a 10%-20% margin, a significant cost that is prohibitive to many smaller retailers.[5] Facebook, on the other hand, can afford to offer (a more limited set of) services for free because the resulting activity will substantially increase the value of its ad inventory. That data and resulting value may take a while to accumulate, but as I’ve argued previously, Facebook can use its tremendous cash reserves from its high-ARPU user base in the West to cross-subsidize unprofitable initiatives in the Global South for as long as it needs. In any war of attrition, Facebook doesn’t lose.

This disconnect between Facebook’s revenue model and its consumer-facing services presents challenges not only for regulation (e.g., antitrust policy based on consumer welfare in the form of pricing), but also assessments of its net benefit to the resource-constrained populations who rely on it. As with Free Basics, its subsidized internet access initiative, Facebook’s commerce offerings provide very real value to millions of individuals with few truly affordable alternatives. Yet the long-term implications of allowing it to continue to envelop and become the de facto provider of adjacent sectors of the digital economy are immense.

In the context of social commerce, that risk may be under-appreciated, given that most of the commercial activity moving onto the platform was informal and unreported to begin with. The sheer size of the informal sector — an estimated 86% of employment in Africa — makes it a key focus for governments grappling with economic growth policies. And digitized network technology is key in state efforts to make legible the invisible, as large-scale identification and currency initiatives have shown. Yet as more informal trade moves onto and is structured by social media, we may face a future in which the most complete and accurate source of informal economic activity is held not in Abuja, Nairobi, or Pretoria, but in Menlo Park. The future of trade may indeed be formalized, but by a platform logic with a distinctly private agenda.

The unique market conditions of sub-Saharan Africa mean the dominant model of digital commerce that emerges won’t look like digital commerce in Southeast Asia, China, or the West. Savvy and ambitious entrepreneurs like Tiffany will continue to cobble together the services they need to grow their businesses digitally, but widespread adoption will require solutions that are accessible and affordable to a broader base of sellers. For the vast majority of these micro businesses and solo entrepreneurs, the driver of value is the ability to cost-effectively increase their reach to a wider market of customers. And if this is indeed the race, it is Facebook’s to lose.

Thanks for input and suggestions from Sammy Muraya (SM Kollectionz), Winnie Makena (EGD Luggage Center), Sabina Naamwinbong (Glammy Accessories), Bode Oyelami (Hekmore Systems), Mavis Dzedu (Hotfro), Annabel Schiff, Grace Natabaalo, Stephen Deng, and Jonathan Donner.

* A fictional persona. At Caribou Digital we often create composite personas to anonymize interviewees. In a prior version of this article, I used a Biblical name for the persona which I now understand could be perceived as offensive. I very much regret this oversight, and have updated the article with a new name.

Notes

[1] A note on scope: Many other models of digitally-enabled commerce exist, especially in China and Southeast Asia, where platforms like Pinduoduo and Go-Jek have successfully proven different operational models. While there are arguments for why group purchasing, influencer-based revenue models, and other “social” practices may gain traction in sub-Saharan Africa, these are currently not widespread and thus not covered here. For simplicity, I also exclude classifieds marketplaces (e.g., Jiji/OLX), which are lower-cost for the retailer but don’t offer the same range of services as formal e-commerce marketplaces.

[2] Jumia reports 65% to 95% of orders are COD; social commerce orders are even more likely to be cash https://newsroom.mastercard.com/mea/press-releases/delivering-cashless-e-commerce-in-africa/.

[3] This is especially true for solo entrepreneurs, where the nature of the hustle makes it difficult to accurately account for time/labor costs. Yes, opportunity costs are real, but for the smallest shops an inefficient process that takes 50% longer but is free will win out over paid services that automate or streamline the task. The cost calculations change once the business is large enough for paid labor.

[4] Facebook already buys purchase behavior data (including offline) from external data aggregators in order to inform its user profiling, but this would be first-party data of transactions on its platform and concretely tied to the individual’s account, so infinitely more valuable.

[5] For reference, Amazon says it averages 8%-15%; https://sell.amazon.com/pricing.html#referral-fees.

The race to digitize commerce in sub-Saharan Africa was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

Six More Reasons to Love ForgeRock SDKs


The ForgeRock Identity Platform gets better all the time and our focus on delivering software development kits (SDKs) underscores our commitment to helping you build secure apps faster. Earlier this year, we talked about the Six Reasons Why ForgeRock SDKs Make Sense. Today, we are pleased to announce there are now six more reasons you can count on ForgeRock SDKs to make your life easier. 

Let’s take a closer look at SDK 2.0. 

Reason 1: Unlock Intelligent Access

Intelligent Access combines Intelligent Authentication (or authentication trees) capabilities that our customers appreciate with Intelligent Self-Service (self-service trees). Intelligent Access includes new journeys for user registration, password reset, and progressive profiling – to name just a few advanced features. With this new release, our SDKs now support Intelligent Access, effectively doubling the number of supported use cases. Now, developers using SDKs can save time and integrate authentication, registration, and self-service journeys into their apps faster than ever before.

Reason 2: Access Device Context

ForgeRock SDKs can collect contextual information from devices (or browsers) and seamlessly integrate with the new Device Profile Nodes of the ForgeRock IAM Platform. Device context can be used to build sophisticated authentication journeys and detect anomalies such as deviations in previously trusted devices, geo-fence breaches, access from tampered devices, and more. By using the SDKs, you can start building better authentication journeys with device context in no time.

Reason 3: Exceptional User Experiences With Usernameless and Passwordless Authentication


Say goodbye to usernames and passwords with ForgeRock Go while providing great user experiences without compromising security. Our JavaScript SDK now supports FIDO2-based strong authentication with WebAuthn. This enables you to build this secure and seamless login experience into your single-page apps (SPAs) with ease. Our SDK can help you go passwordless faster.

Reason 4: Improve Application Security

Our SDKs do more than just simplify the integration with the ForgeRock IAM Platform. They have native capabilities to improve application security by implementing industry best practices and adopting the latest technologies in the iOS and Android ecosystem. Starting with this release, the ForgeRock iOS SDK uses Apple's Secure Enclave for hardware-backed encryption and storage of tokens. When you use our SDKs, you can be sure that credentials are in good hands. 

Reason 5: Simplify the User Experience With In-App Authenticator

Using one-time passwords generated by soft tokens, or push-notification-based approvals, is a great way to improve security by introducing a second factor into the authentication flow. Traditionally, however, this approach comes at the cost of user experience. End users are forced to download and use a dedicated authenticator app, which introduces a lot of friction. With the latest version of the SDK, you can now embed these capabilities into your own mobile apps and provide your users a superior, branded, and seamless authentication experience.

Reason 6: Secure High-Value Transactions

Great experiences and proper security during login are paramount, but your customers have come to expect that same level of security during each and every transaction. That’s why they need a Zero Trust model or Continuous Adaptive Risk and Trust Assessment (CARTA). With ForgeRock SDKs you can improve security by requiring the user to perform additional verification when engaging in a high-risk transaction or performing an action that deviates from their normal behavior. For example, they must reauthenticate by using a second factor or respond to a push notification on their mobile device.

For more information go to our SDK page or get started today with documents that provide you step-by-step instructions for your next integration project:

SDK Feature Overview
How-To Guides
iOS SDK
Android SDK
JavaScript SDK

Authenteq

What’s Privacy-by-Design: An explainer into what it means to design with privacy as a priority


Too many companies think of privacy as a feature that can be added later. And while it is true these features can always be enhanced and evolved, if privacy isn’t baked into your product, it will never be the default. Privacy by Design is an approach to make sure that it is.

Originally developed by Dr. Ann Cavoukian, Information and Privacy Commissioner of Ontario, the concept of Privacy by Design was formalized in a report on privacy-enhancing technologies in 1995. Having privacy incorporated by design, and by default, means that the product or solution is designed with privacy as a priority—not an afterthought—and that privacy becomes integral to organizational priorities, development objectives, and design processes.

Dr. Cavoukian’s main thesis was that the future of privacy cannot be assured by compliance with regulatory frameworks alone, but that “privacy assurance must ideally become an organization’s default mode of operation.”

The 7 Foundational Principles 

Privacy by Design (PbD) has seven key principles. These are not pick-and-choose options; each element is as important as the next.

1. Proactive not Reactive; Preventative not Remedial

When building any product, service, or tool, companies should anticipate any and all of the ways that privacy breaches could happen before they happen. That means that PbD focuses on prevention rather than resolution; it comes before the fact, not after. This significantly reduces the potential exposure to risk associated with data breaches and ensures that less data is exposed if a breach does occur.

2. Privacy as the Default Setting

The burden never falls to the customer or user. That means that no additional steps need to be taken for them to protect their own personal data, the protection is built into the system by default. They don’t need to worry about taking extra steps because they are automatically protected.

3. Privacy Embedded into Design

It becomes an essential component of the core functionality being delivered. That means the privacy embedded in your product, tool, or service is integral to the system without diminishing functionality.

4. Full Functionality – Positive-Sum, not Zero-Sum

There are no trade-offs in PbD. That means it is never a choice between privacy and security, as if it were not possible to have both. It also means that having both doesn’t come at a cost to the overall functionality.

5. End-to-End Security – Full Lifecycle Protection 

PbD means security, security, security. From start to finish, all data is securely collected, safely stored, and properly destroyed. 

6. Visibility and Transparency in Privacy – Keep it Open 

Accountability, openness, and compliance are required for an effective and secure system. Adding transparency to the where, how, and why of data collection and processing actually improves the overall system. This breeds confidence amongst all stakeholders that the company is following all promised procedures and keeps companies accountable. 

7. Respect for User Privacy – Keep it User-Centric

Above all, the individual’s data security should be the number one concern. People are rightly worried about their data ending up in the wrong hands. It should always be clear to them what personal information is being collected, for what purpose, and how long it will be stored. Keep their needs top of mind.

Did you know? 

The EU GDPR incorporates Privacy by Design. Companies that follow these principles have a far lower risk of exposure to GDPR violations and associated fines.

_____________________________________________________________________________

We will be putting out a comprehensive downloadable version of this guide with what these definitions mean to Authenteq. Follow us on LinkedIn to make sure you get first access! 

The post What’s Privacy-by-Design: An explainer into what it means to design with privacy as a priority appeared first on Identity Verification & KYC | Authenteq.


Self Key

SelfKey Progress Report for September 2020


The progress report for September 2020 is here, an exciting month filled with numerous updates.

The post SelfKey Progress Report for September 2020 appeared first on SelfKey.


MyKey

MYKEY Weekly Report 19 (September 28th~October 4th)


Today is Monday, October 5, 2020. The following is the 19th issue of the MYKEY Weekly Report. Last week (September 28th to October 4th), there were 2 main updates:

1. Firstpool, a third-party application, and MYKEY launched the S03 ETH-DeFi mining financial product on September 29

The principal is 100% safe; the annualized yield of this financial product is 5–30%, and the lock-up period is 30 days. Firstpool provides a full BTC risk reserve as a guarantee, which is managed by MYKEY.

2. The twentieth MYKEY Crypto Stablecoin Report was published

We release the MYKEY Crypto Stablecoin Report every week, sharing our interpretation of the development status of stablecoins and analysis of their development trends to help participants in the crypto market stay up to date. The twentieth Crypto Stablecoin Report was published on September 30th; click to read: https://bit.ly/3n2tsFl

!!! If you encounter any abnormal situation while using MYKEY, do not uninstall the MYKEY app; please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to back up the 12-word recovery phrase via [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 19 (September 28th~October 4th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Access Denied: Token Revocation


Have you ever had a notification about a suspicious login attempt from a digital destination you’ve never been to? Hopefully it came with the option to say it wasn’t you, allowing you to deny access to that specific place and rebuff the user trying to sneak into your account or information. If so, you probably have token revocation to thank for the security measure. 


Token revocation means a token is no longer able to be used for access to a protected digital place, and it often works behind the scenes in the denial process. It’s similar to session revocation in that you’re blocking access at some level, but there’s a key difference: with session revocation, you can revoke one or more sessions and still access protected information, whereas token revocation is more permanent, because it “kills” a token that might otherwise be reused for the duration of its lifetime.
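To make this concrete, here is a minimal sketch of what revoking a token looks like at the OAuth 2.0 level, per RFC 7009. This is my own illustration rather than anything from the Ping post; the endpoint URL and credentials are placeholders, so consult your identity provider’s documentation for the real values.

<?php
// Minimal sketch: revoking an OAuth 2.0 token per RFC 7009.
// The endpoint and credentials below are placeholders, not real PingOne values.
require __DIR__ . '/vendor/autoload.php';

$client = new \GuzzleHttp\Client();
$response = $client->post('https://auth.example.com/as/revoke_token.oauth2', [
    'auth' => ['<client-id>', '<client-secret>'], // HTTP Basic client authentication
    'form_params' => [
        'token' => '<access-or-refresh-token>',
        'token_type_hint' => 'access_token', // optional hint per the spec
    ],
]);

// RFC 7009 servers return HTTP 200 whether or not the token was still active,
// so the call is idempotent and safe to retry.
echo $response->getStatusCode();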


Tokens play a big role in OAuth 2.0, which is designed to protect resources from wandering or malicious hands by using tokens to securely authorize users. You could build your own method of verifying access tokens and get a decent way there with open source packages, but a ready-made solution—token introspection—is easy to use and gives you the ability to offload the work from the app team to the identity platform’s team. PingOne for Customers handles the messy job of building a token verification system by giving you an API endpoint that does the work for you on our backend. In this blog post, I’ll walk you through how to use token revocation in PingOne for Customers.
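For comparison, here is what the verification side, token introspection per RFC 7662, looks like in a generic sketch. Again, the endpoint and credentials are placeholders, not the exact PingOne for Customers API:

<?php
// Minimal sketch: checking a token via OAuth 2.0 token introspection (RFC 7662).
// Endpoint and credentials are placeholders.
require __DIR__ . '/vendor/autoload.php';

$client = new \GuzzleHttp\Client();
$response = $client->post('https://auth.example.com/as/introspect.oauth2', [
    'auth' => ['<client-id>', '<client-secret>'],
    'form_params' => ['token' => '<access-token>'],
]);

$info = json_decode((string) $response->getBody());
// A revoked or expired token comes back with "active": false.
echo $info->active ? 'token is valid' : 'access denied';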



Self Key

Stablecoins: The Underrated Factor in DeFi


To an extent, stablecoins have been effective in eradicating anxieties about the volatility of crypto investments. With the emergence of DeFi, stablecoins will have an even more significant role to play in the future.

The post Stablecoins: The Underrated Factor in DeFi appeared first on SelfKey.


Smarter with Gartner - IT

What Makes Women in Technology Great CIOs


Expectations of the CIO are changing. C-suite executives now expect CIOs to shape the digital business vision and participate in or lead the digital transformation journey.

These new expectations mean that a balance between traditional leadership traits such as strategic thinking, vision and risk taking (often perceived as “masculine”) and traits such as collaboration, coaching and team building (often perceived as “feminine”) is now highly desirable.

Resist the tendency to prove your technical skills

Women should be at an advantage in the new business environment, yet only 11% of CIOs are women. Corporate culture is slow to change, especially in a male-dominated C-suite.


“Women CIOs are often the first female C-level executive beyond the chief human resources officer, and they do face unique challenges,” says Deb Curtis, VP Analyst at Gartner. 

“In addition to wearing the CIO hat, they must be prepared to carry the torch for advancing diversity efforts to influence corporate culture, as well as fulfill the long-standing role model vacancy for other women in tech,” she says.

Read more: 3 Actions for the New CIO

How women in technology build trust

Women in technology can prepare for success by demonstrating both traditional leadership skills and high emotional intelligence. Digital CIOs can delegate day-to-day IT operations to their leadership teams and focus instead on building trust and strong business partnerships.

“I’ve had multiple female CIOs as bosses,” says the male successor to the female CIO of a government agency. “They’ve each been inspirational, empowering, approachable and empathetic. They’ve also consistently been tough as nails, with high expectations for me. That balance of traits helped me grow as a leader, especially in terms of my emotional intelligence to establish relationships and network with peers.”

Remember: Your experience qualified you

First-time women CIOs often feel pressure to return to school for additional technical training to prove their worth to C-suite peers.

“You’ve been hired for your strategic thinking and business vision, so resist the tendency to prove your technical skills,” Curtis says. “Instead, count on your staff to have the answers.”

Read more: The First 100 Days of the Office of the CIO

Gain support from a male colleague

Senior women in technology often view their hiring as a big win for workplace diversity and inclusion. They assume the C-suite wanted a fresh perspective, but the same peers who interviewed them later complain that they’re “pushing changes too hard,” faster than they are ready to accept.

To accelerate your transition to CIO, you may need an ally

“From our client interactions, we conclude that many senior male executives are unprepared to embrace working with a female CIO as a peer,” says Curtis. “To accelerate your transition to CIO, you may need an ally.” 

Ask a male colleague from the hiring committee to play an active role and visibly support your leadership, diverse perspectives and alternative ideas, beyond reinforcing your qualifications and experience, especially in executive meetings and interactions.

Visible and proactive support from a male colleague can help you navigate the interpersonal dynamics of a predominantly male C-suite and ensure long-term cultural fit.

Read more: How to Combat Marginalizing Behaviors in the Workplace

Guide behavioral change at the top

Although diversity is embraced in the hiring of a female CIO, the same behavioral traits that contributed to their selection often trigger resistance from C-level male colleagues in day-to-day interactions. Any outspokenness, questioning and risk taking, seen initially as desirable, may be later perceived as offensive. 

When hitting roadblocks in interactions with C-suite peers, it can be helpful to remind your male peers of the reasons you were hired.

Finally, pick your battles

Take time to learn the history, culture, business and C-suite peer personalities in the company and use what you learn to drive behavioral change, starting in the executive ranks. 

Finally, pick your battles. Although progress may seem slow, behavior changes at the C-suite level set the tone at the top and will accelerate change down through the organization. Support and encourage male C-suite peers to build on the first step they’ve taken by hiring you.

The post What Makes Women in Technology Great CIOs appeared first on Smarter With Gartner.


Otaka

Validating Okta Access Tokens in PHP using AWS API Gateway and Lambda Authorizers


Running REST APIs with AWS Lambda and AWS API Gateway has recently become a very popular option. Although AWS provides its own mechanisms to add an authentication and authorization layer to these APIs, you may want to use your Okta centralized user database and credentials instead.

Today we’ll talk about how you can use Okta as the authentication and authorization layer of your REST API hosted in AWS Lambda, validating Okta access tokens using a Lambda authorizer function implemented in PHP.

About Lambda authorizers

Lambda authorizers are the mechanism provided by AWS API Gateway to manage authorization and authentication features. They are independent AWS Lambda functions that are called by the AWS API Gateway in order to validate the provided credentials and supply information about the authorized access level. Like other AWS Lambda functions, authorizers can be implemented in any language and run in a limited, managed environment.

This is the workflow of an API call when using an AWS Lambda authorizer:

1. The client calls a method on an API Gateway API, passing a bearer token or request parameters.
2. The API Gateway checks whether a Lambda authorizer is configured for the called method. If it is, API Gateway calls the corresponding authorizer Lambda function.
3. The authorizer Lambda function checks the provided token or parameters and determines whether the requested API action will be authorized (just by analyzing the provided token, or by calling external services).
4. If the API action must be authorized, the Lambda function grants access by returning an output object containing at least an IAM policy object and a principal identifier.
5. The API Gateway evaluates the returned policy object. If access is denied, the API Gateway returns a suitable HTTP error status code, such as a 403 error. If access is allowed, the API Gateway executes the method.

You can add authentication and authorization to your API methods without using a Lambda authorizer, but a Lambda authorizer will allow you to separate and centralize responsibilities in your code. This way, if you ever introduce a change in your auth methods, you’ll only have to change and re-deploy the Lambda authorizer.

Additionally, using a Lambda authorizer will allow the API Gateway to cache the auth result for an hour. This cache will be used by any subsequent API call from the same user. This could represent significant savings both in time and money, especially if the implemented Lambda authorizer launches external calls.

Interface of Lambda authorizers

All Lambda authorizers must be implemented so they receive a defined data structure from the AWS API Gateway. This structure depends on the configured type of Lambda authorizer. There are two possible types: token-based and request-based.

Token-based Lambda authorizers (also known as TOKEN authorizers) receive the caller’s identity in a bearer token, such as a JSON Web Token (JWT) or an OAuth token, along with the Amazon Resource Name (ARN) of the called API method:

{
  "type": "TOKEN",
  "authorizationToken": "{caller-supplied-token}",
  "methodArn": "arn:aws:execute-api:{regionId}:{accountId}:{apiId}/{stage}/{httpVerb}/[{resource}/[{child-resources}]]"
}

Request-parameter-based Lambda authorizers (also known as REQUEST authorizers) receive the caller’s identity through a combination of headers, query string parameters, and stage and context variables.

After deciding the result of an auth request, the Lambda authorizer method must return an object with the following structure:

{ "principalId": "{uniqueUserId}", "policyDocument": { "Version": "2012-10-17", "Statement": [ { "Action": "execute-api:Invoke", "Effect": "Allow", "Resource": "arn:aws:execute-api:{regionId}:{accountId}:{apiId}/{stage}/{httpVerb}/[{resource}/[{child-resources}]]" } ] }, "context": { "custom": "1", "data": ["custom", "example", "data"], } }

Where:

- principalId will be a unique identifier for the user (such as the user ID, a unique user name, or a unique email address).
- policyDocument is an IAM policy document describing the action that we are allowing or disallowing. It includes:
  - Version: must be 2012-10-17.
  - Statement: an object including:
    - Action: what we are allowing or disallowing. In our case, it will be execute-api:Invoke.
    - Effect: whether the user will be authorized to carry out the intended action.
    - Resource: the resource for which we want to allow or deny this action.
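As a quick illustration (a hypothetical helper of my own, not part of the original article), the whole response structure can be assembled in a few lines of PHP:

<?php
// Hypothetical helper that assembles the authorizer response described above.
function buildAuthResponse(string $principalId, string $effect, string $resourceArn): string
{
    return json_encode([
        'principalId' => $principalId,
        'policyDocument' => [
            'Version' => '2012-10-17',
            'Statement' => [
                [
                    'Action' => 'execute-api:Invoke',
                    'Effect' => $effect, // 'Allow' or 'Deny'
                    'Resource' => $resourceArn,
                ],
            ],
        ],
    ]);
}

// Example: deny access to an unknown caller.
echo buildAuthResponse('unknown', 'Deny', 'arn:aws:execute-api:us-east-2:123456789012:example/prod/POST/{proxy+}');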

In our example, we’ll focus on building a token-based authorizer that receives a JWT token in order to allow or deny access.

Generating JWT tokens with client credentials in Okta

In this example, we’ll use an app with client credentials to keep things simple. Follow these steps:

1. If you still don’t have one, create your free developer account at developer.okta.com.
2. Create a new application of type Service.
3. Click on API -> Authorization Servers, then click the Scopes tab.
4. Add a new scope called demo.

Now, to obtain a JWT token you can call Okta’s token endpoint, giving it the application’s client ID and client secret.

curl -X POST 'https://<Client-ID>:<Client-secret>@<your-okta-server>/oauth2/default/v1/token' -d grant_type=client_credentials -d scope=demo

This will return a JSON structure including a valid access token generated by Okta that we’ll be able to use when accessing our protected API.

{ "token_type": "Bearer", "expires_in": 3600, "access_token": "eyJraWQiOiJQTnk1OGR(...)e6z-UOv4pGUnbIAMAHxmVsb2h4PXpblaH4", "scope": "demo" }

The access_token field is the one we’ll use as a Bearer token in our requests to the API.
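If you’d rather request the token from PHP than from curl, here’s a sketch of the equivalent call using Guzzle, which this project already pulls in via Composer. The placeholders match the curl example above:

<?php
// Sketch: fetch an access token with the client credentials grant.
// Replace the placeholders with your Okta server and app credentials.
require __DIR__ . '/vendor/autoload.php';

$client = new \GuzzleHttp\Client();
$response = $client->post('https://<your-okta-server>/oauth2/default/v1/token', [
    'auth' => ['<Client-ID>', '<Client-secret>'], // HTTP Basic authentication
    'form_params' => [
        'grant_type' => 'client_credentials',
        'scope' => 'demo',
    ],
]);

$accessToken = json_decode((string) $response->getBody())->access_token;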

In a production environment, you’ll want to use an OAuth 2.0 flow to retrieve a token. I thoroughly recommend this great Illustrated Guide to OAuth and OpenID Connect and all the other articles it references.

Prevent API slowness—Keep things local

It’s very important that we keep the Lambda authorizer quick, as it will be called in every API request. Because of this, our Lambda authorizer should do only local JWT validation.

If we call out to Okta’s token introspection, we add an additional network request to every authorization call. This is not a good idea, especially when gateway requests need to pass through very quickly, and Lambda limits the amount of time that functions can run.

Local JWT validation should be more than enough, as the generated JWT tokens are signed. We just need the Authorizer function to keep a local updated copy of the public part of the key used to generate the signature.

How to write a Lambda authorizer function in PHP

Now that we know the inputs our Lambda authorizer function must process and the outputs it must generate, let’s get it done.

Prepare PHP runtime environment

For the first step, we’ll need to circumvent a little problem: at the time I’m writing this article (September 2020), PHP is not one of the languages AWS Lambda supports out of the box. Because of this, we’ll have to prepare the PHP runtime environment, including the PHP binary that will run inside Lambda’s containers, and its related libraries and extensions. The easiest way to do this is to compile it on the same platform as Lambda, so we’ll use EC2 for this.

In the Amazon EC2 console, choose Launch instance.

Then, when choosing an AMI, you must use the same image that is currently being used by the Lambda Execution Environment (see https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html).

At the time of writing, AWS Lambda can be configured to use a runtime based on Amazon Linux or a runtime based on Amazon Linux 2. Unfortunately, the Amazon Linux 2 runtime is a custom runtime that is not based on any available AMI image.

To avoid problems and build a PHP environment compatible with the Lambda runtime, we need to use the image that is available as an AMI. At the time of writing, this was the Amazon Linux-based AMI image amzn-ami-hvm-2018.03.0.20181129-x86_64-gp2.

You won’t find this image in the default Quick Start list that appears immediately when launching a new instance. Copy the image name and paste it into the search box. You’ll find it as a community AMI.

Launch the EC2 instance and connect to it.

Now we’ll compile the latest version of PHP. In order to do that:

1. Update packages and install the needed compilation dependencies:

sudo yum update -y
sudo yum install autoconf bison gcc gcc-c++ libcurl-devel libxml2-devel git openssl-devel -y

2. Download the latest stable version of PHP. At the time of writing, it was 7.4.10:

mkdir ~/php-7-bin
curl -sL https://www.php.net/distributions/php-7.4.10.tar.bz2 | tar -xvj
cd php-7.4.10

3. Compile PHP with OpenSSL and libcurl support and install it to /home/ec2-user/php-7-bin:

./buildconf --force
./configure --prefix=/home/ec2-user/php-7-bin/ --with-openssl --with-curl --with-zlib --without-sqlite3 --without-pdo-sqlite
make install

Here, we are disabling SQLite3 extension compilation, as the SQLite3 development library is not installed by default in the selected AMI image. Depending on the requirements of your PHP Lambda functions, you may have to customize the ./configure line to add extra libraries (for example, adding --with-pdo-mysql if your functions talk to MySQL). ./configure --help will give you a full list of parameters.

Once these commands are completed, run ~/php-7-bin/bin/php -v to verify everything has worked correctly. If the runtime environment is ready, you should see a message similar to this one:

PHP 7.4.10 (cli) (built: Sep 10 2020 23:14:12) ( NTS )
Copyright (c) The PHP Group
Zend Engine v3.4.0, Copyright (c) Zend Technologies

Validating Okta JWT access tokens in a Lambda function

Now, let’s start building our PHP authorizer. We’ll create a working folder with the needed infrastructure: download the project from GitHub and copy the generated PHP runtime into it.

cd ~
git clone https://github.com/oktadeveloper/php-jwt-okta-lambda-auth-validator.git
cd php-jwt-okta-lambda-auth-validator
cp ~/php-7-bin/bin/php ./bin

Now we’ll install Composer and use it to retrieve the PHP libraries that our bootstrap file uses to call the authorizer, as well as other libraries related to the management of JWT tokens.

curl -sS https://getcomposer.org/installer | ./bin/php
bin/php composer.phar install

After this, you should just configure your settings in the classes/Config.php file:

<?php

namespace OktaLambdaAuth;

class Config {
    // Define here your Okta server hostname, like 'dev-XXXXX.okta.com' or
    // 'xxxxx.okta-emea.com'
    const OKTA_SERVER_HOSTNAME = 'dev-XXXXX.okta.com';
}

Let’s examine the bootstrap file:

#!/opt/bin/php
<?php

// This invokes Composer's autoloader so that we'll be able to use Guzzle and
// any other 3rd party libraries we need.
// Depending on the configured runtime, its location may vary
if ( file_exists(__DIR__ . '/vendor/autoload.php' ) ) {
    require __DIR__ . '/vendor/autoload.php';
} else {
    require '/opt/vendor/autoload.php';
}

// Initialize signing-key manager, and retrieve the current JWT signing keys from Okta
$keyManager = OktaLambdaAuth\KeyManager::instance();
if ( ! $keyManager->updateKeys() ) {
    die();
}

// This is the request processing loop. Barring unrecoverable failure,
// this loop runs until the environment shuts down.
do {
    // Ask the runtime API for a request to handle.
    $request = getNextRequest();

    require_once $_ENV['LAMBDA_TASK_ROOT'] . '/src/authorizer.php';

    // Execute the desired function and obtain the response.
    $response = authorizer($request['payload']);

    // Submit the response back to the runtime API.
    sendResponse($request['invocationId'], $response);
} while (true);

function getNextRequest() {
    $client = new \GuzzleHttp\Client();
    $response = $client->get('http://' . $_ENV['AWS_LAMBDA_RUNTIME_API'] . '/2018-06-01/runtime/invocation/next');

    return [
        'invocationId' => $response->getHeader('Lambda-Runtime-Aws-Request-Id')[0],
        'payload' => json_decode((string) $response->getBody(), true)
    ];
}

function sendResponse($invocationId, $response) {
    $client = new \GuzzleHttp\Client();
    $client->post(
        'http://' . $_ENV['AWS_LAMBDA_RUNTIME_API'] . '/2018-06-01/runtime/invocation/' . $invocationId . '/response',
        ['body' => $response]
    );
}

As we can see, we retrieve the JWT signing keys from the Okta server. These keys will be the ones used to validate the received JWT tokens.

Then we have an endless loop that will execute until the environment is finished. This loop will launch the authorizer function and any other existing functions corresponding to other methods in the API.

Let’s take a look at the KeyManager class (classes/KeyManager.php).

<?php

namespace OktaLambdaAuth;

use CoderCat\JWKToPEM\JWKConverter;

class KeyManager {

    // Static class
    protected static $instance;

    private $base_url;

    /**
     * @return KeyManager
     */
    public static function instance() {
        if ( static::$instance === null ) {
            static::$instance = new static();
        }

        return static::$instance;
    }

This class is designed as a singleton, as it will contain the currently valid keys that will be used in all the API calls; we’ll want to access it from anywhere.

    private $keys;
    private $valid_until;

    public function __construct() {
        $this->keys = [];
        $this->valid_until = 0;
    }

We’ll keep the keys returned by the server in the $keys property, and the Unix timestamp at which the keys expire, as reported by the Okta server’s cache headers, in $valid_until.

Let’s move on to the updateKeys() method. It retrieves the current JWT-signing keys provided by the Okta server and saves them, along with their expiration timestamp, for later use by other functions.

    public function updateKeys() {
        // Build the URL from Okta that we'll use to retrieve the current set of signing keys
        $server = Config::OKTA_SERVER_HOSTNAME;
        $url = 'https://' . $server . '/oauth2/default/v1/keys';

        $client = new \GuzzleHttp\Client();
        $query_response = $client->get( $url );
        $response = json_decode( (string) $query_response->getBody() );

        if ( isset( $response->errorCode ) ) {
            // Error
            fwrite( STDERR, 'Error retrieving JWT-signing key: ' . $response->errorSummary );
            return false;
        }

        // Let's convert the received keys into PEM format, usable from the key verifier library
        $keys = $response->keys;
        $pem_keys = [];
        $jwkConverter = new JWKConverter();

        foreach ( $keys as $key ) {
            $pem_keys[] = $jwkConverter->toPEM( (array) $key );
        }

        if ( count( $pem_keys ) ) {
            // Save both the keys and their expiring moment for future use
            $this->keys = $pem_keys;
            $this->valid_until = strtotime( $query_response->getHeader('expires')[0] );
        }

        return $this->keys;
    }

The method getKeys() returns the current JWT-signing key information. But if the expiration time of the current information is in the past, it will refresh the saved keys. The getValidUntil() method will return the timestamp until which the current keys are valid.

    public function getKeys() {
        if ( count( $this->keys ) && $this->valid_until > time() ) {
            return $this->keys;
        }

        return $this->updateKeys();
    }

    public function getValidUntil() {
        return date('Y-m-d H:i:s e', $this->valid_until );
    }

Now, let’s examine the authorizer() function code, in src/authorizer.php:

<?php

function authorizer($data) {
    $type = $data['type'];
    $jwt = $data['authorizationToken'];
    $method = $data['methodArn'];

The function receives the data from AWS through these three parameters. The authorization token will include the JWT token, possibly with the “Bearer “ prefix. So first, we’ll remove it.

// Remove the "Bearer " prefix from $jwt, if it exists if ( strpos( $jwt,'Bearer ' ) === 0 ) { $jwt = str_replace( 'Bearer ', '', $jwt ); }

Next, the function checks the validity of the received JWT:

    $key_manager = OktaLambdaAuth\KeyManager::instance();
    $keys = $key_manager->getKeys();

    $decoded_token = null;
    $jwt_valid = false;

    foreach ( $keys as $key ) {
        try {
            $decoded_token = JWT::decode( $jwt, $key, array( 'RS256' ) );
            unset( $error );
            $jwt_valid = true;
        } catch ( ExpiredException | BeforeValidException $e ) {
            $error = 'Token expired, or used before its validity';
            break;
        } catch ( SignatureInvalidException $e ) {
            $error = 'Token not valid';
            continue;
        } catch ( Exception $e ) {
            $error = 'Token problem: ' . $e->getMessage();
            continue;
        }
    }

If the token was valid, it’ll allow access. If not, it’ll deny it.

    if ( ! $jwt_valid || ! $decoded_token ) {
        $result = [
            'principalId' => 'unknown',
            'policyDocument' => [
                'Version' => '2012-10-17',
                'Statement' => [
                    [
                        'Action' => 'execute-api:Invoke',
                        'Effect' => 'Deny',
                        'Resource' => $method,
                    ]
                ],
            ]
        ];
    } else {
        $result = [
            'principalId' => $decoded_token->sub,
            'policyDocument' => [
                'Version' => '2012-10-17',
                'Statement' => [
                    [
                        'Action' => 'execute-api:Invoke',
                        'Effect' => 'Allow',
                        'Resource' => $method,
                    ]
                ],
            ]
        ];
    }

    return json_encode( $result );
}

Configuring authorizer Lambda function

So, we have all the code and the environment ready:

We have a PHP binary compiled on the correct supported version of Amazon Linux.
We have a bootstrap file coded in PHP that will handle the authenticator initialization and will invoke the actual authenticator when receiving requests.
We have a vendor directory with all the needed PHP libraries.
We have our own src and class directories including the authenticator method itself, and other auxiliary classes.

Instead of uploading our software as a single, monolithic component, we'll upload it using layers, which will allow easier and quicker updates when we need them. Let's build two layers for the parts that don't depend on our own code: one for the PHP runtime and the other for the Composer vendor folder.

After this, we’ll create another ZIP package including our code for the authorizer:

zip -r runtime.zip bin
zip -r vendor.zip vendor/
zip -r authorizer.zip bootstrap classes/ src/authorizer.php

From the AWS Lambda console, we’ll create the two layers first.

In Additional Resources > Layers menu, select Create Layer.

Create a layer "Runtime", uploading the generated runtime.zip file. Select Custom runtime as the only compatible runtime.

Create another layer "Vendor", uploading the generated vendor.zip file. Again, select Custom runtime as the only compatible runtime.
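The console steps above are all you need, but if you prefer scripting, the two layers can also be published with the AWS CLI. This is a minimal sketch, not part of the original walkthrough: it assumes the CLI is already configured for your account, and uses provided (the identifier for custom Lambda runtimes) as the compatible runtime.

# Publish the PHP runtime layer from runtime.zip
aws lambda publish-layer-version \
    --layer-name Runtime \
    --zip-file fileb://runtime.zip \
    --compatible-runtimes provided

# Publish the Composer vendor layer from vendor.zip
aws lambda publish-layer-version \
    --layer-name Vendor \
    --zip-file fileb://vendor.zip \
    --compatible-runtimes provided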

Once this is done, let’s create the authorizer Lambda function.

Click on the Create function button.
Select Author from scratch.
Assign a name for the function. We'll call it "phpAuthorizer".
As Runtime, select Custom runtime / Use default bootstrap.

Add the two already created layers to the function by clicking Layers in the function designer below the name of the function, and then clicking Add a layer.

The created layers will be visible after selecting the Custom layers option.

Then, in the function code frame, click Actions and select Upload a .zip file.

Select the already generated authorizer.zip file. You should be able to see and edit the code of the bootstrap and our other files from the AWS Lambda Console.

After this, the Authorizer Lambda function should be ready. Let’s test it.

Click Test in the header bar. You'll be able to configure the received test event. You must select a name for it (for example, "idtoken"). Then, use the Event template Amazon API Gateway Authorizer. This will create an event with the following JSON:

{
  "type": "TOKEN",
  "authorizationToken": "incoming-client-token",
  "methodArn": "arn:aws:execute-api:us-east-2:123456789012:example/prod/POST/{proxy+}"
}

Replace the string incoming-client-token in the generated event with the access token you generated earlier, and create the test event.

After this, you’ll be able to test the function by selecting the configured idtoken event and clicking Test in the header bar.

The test should always be successful, returning Deny for invalid or expired JWT tokens, and Allow for valid, non-expired JWT tokens generated by the Okta server.
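Concretely, for a valid token the authorizer's result (before json_encode()) corresponds to JSON along these lines. This is a sketch assembled from the policy-building code above; the principalId comes from the token's sub claim, and the Resource echoes whatever methodArn the event carried:

{
  "principalId": "<sub claim of the decoded token>",
  "policyDocument": {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Action": "execute-api:Invoke",
        "Effect": "Allow",
        "Resource": "arn:aws:execute-api:us-east-2:123456789012:example/prod/POST/{proxy+}"
      }
    ]
  }
}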

Configuring API Gateway to pass through only authenticated requests to the backend API

Once we have our Lambda authorizer, let’s configure an API Gateway to use it.

If you already have an API, you can use it. If not, let’s create a REST example API using the example “PetStore” provided by AWS:

Navigate to the API Gateway AWS service, then click Build under REST API.

Choose the REST protocol, select to use the Example API and the Regional Endpoint Type, and click Import.

Once the API PetStore is created, enter the Authorizers menu, and then click Create New Authorizer.

Select the Lambda type, and use the already configured authorizer Lambda function (phpAuthorizer in our example). Select Token as the Lambda Event Payload. Enter Authorization as the Token Source. Leave the Authorization Caching enabled, with the default TTL of 300 seconds.

Grant all the requested permissions, and the authorizer will be created. Now we just need to configure all the endpoints of the API to use the new authorizer. Before you can choose the authorizer, you need to deploy the API. Deploy the API without authentication first: click on Resources in the API menu, and then select the Deploy API action.

You’ll have to create a new Deployment Stage. Let’s use beta as the Stage name.
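If you script your deployments, the equivalent AWS CLI call is sketched below. The REST API id abc123 is a hypothetical placeholder; you can list your real API ids with aws apigateway get-rest-apis.

# Create a deployment of the API to a "beta" stage (abc123 is a placeholder id)
aws apigateway create-deployment \
    --rest-api-id abc123 \
    --stage-name beta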

Next, you'll need to check all the existing methods you want to protect and set the defined Authorizer as their authorization method. Let's see how this is done for the /pets GET method:

In the created API Resources menu, select the /pets GET method. Then, click on the Method Request definition.

Then click on the pencil icon in the Authorization setting, select the defined Authorizer as the authorization method, and click the check mark icon to save the selection.

You’ll have to repeat all these steps for all the endpoints you want to protect. For this example, I only left the root GET method unprotected, so we can test the difference between protected and unprotected methods.

After this, deploy a new version of the API. This time, you can reuse the previous stage you created in the last deployment.

Congratulations! All the protected API Methods will now require a valid JWT token.

You can test this by making a request from curl using the access token you generated at the beginning of this blog post. To do this, the URL for every API access point can be retrieved from the API Stages section, selecting the stage you just deployed and clicking on the corresponding endpoint. You’ll see the URL in the Invoke URL frame.

If you click on the provided Invoke URL for the / GET method, that we left unprotected, you’ll see the landing page of the Pet Store API which has a short description of the API.

To finish up, let's make a request to the Invoke URL for the /pets GET resource, first without an access token, and then with the access token. (You'll have to retrieve the actual URL from the corresponding endpoint in the last deployed stage.)

curl -i 'https://XXXXXXXX.execute-api.us-west-1.amazonaws.com/beta/pets'

{"message":"Unauthorized"}

curl -i 'https://XXXXXXXX.execute-api.us-west-1.amazonaws.com/beta/pets' -H "Authorization: Bearer eyJraWQiOiI5VzhZV3Qxc2RqO..."

[
  {
    "id": 1,
    "type": "dog",
    "price": 249.99
  },
  {
    "id": 2,
    "type": "cat",
    "price": 124.99
  },
  {
    "id": 3,
    "type": "fish",
    "price": 0.99
  }
]

Learn more about web security and PHP

If you’d like to learn more about integrating Okta with your PHP applications, be sure to check out some of these resources:

Build a Simple Laravel App with Authentication
Protecting a PHP API Using OAuth
Create and Verify JWTs in PHP with OAuth 2.0

If you liked this post, follow @oktadev on Twitter to see when we publish similar ones. We have a YouTube channel too! You should subscribe. 😊

We’re also streaming on Twitch, follow us to be notified when we’re live.


Self Key

The Transition from Centralized to Decentralized

SelfKey Weekly Newsletter

Date – 30th September, 2020

The crypto industry is on a path of transition, a transition from centralized to decentralized.

The post The Transition from Centralized to Decentralized appeared first on SelfKey.

Sunday, 04. October 2020

KuppingerCole

KuppingerCole Analyst Chat: Policy-based and Dynamic Authorization Management

Dynamic, risk-based, attribute- and context-related authorizations are becoming increasingly important for many enterprises. Graham Williamson and Matthias Reinwarth take a look at the market sector for dynamic authorization management and policy-based permissions in light of the recent publication of a Market Compass on this topic.



Friday, 02. October 2020

Ontology

A Letter to the Wing DAO Community from Jun Li, Founder of Ontology

Wing, the first credit-based cross-chain DeFi platform built on the Ontology blockchain, has garnered a lot of attention since its launch in early September, laying the groundwork for the development of Ontology’s DeFi ecosystem and for the integration of Ontology’s decentralized identity framework, data, and credit evaluation systems.

Meanwhile, our team is delighted to have received feedback from the global Ontology community, including suggestions and recommendations for Wing’s future growth.

Now, as we celebrate one month since the launch of Wing, we can analyze the project’s performance so far and see what we can learn going forward. We hope this letter will also help keep community members updated on the milestones Wing has met to date, as well as what’s to come over the next few months.

Off To A Promising Start: High TVL and APY

Supported by Ontology's blockchain infrastructure, Wing got off to a great start, offering lower transaction fees, faster transfer speeds, and reduced costs overall. Wing's TVL (total value locked) rose above 200 million USD on more than one occasion. At present, Wing's TVL is around 150 million USD, a very promising figure.

In addition to retaining a high TVL and providing higher returns to users in terms of APY (annual percentage yield), Wing also provides users with returns on borrowing and lending mainstream coins including ETH, USDT, DAI, and USDC, at some of the highest rates available on the DeFi market.

Currency — Wing Interest Rate

USDT — 11.97%

DAI — 11.96%

USDC — 11.97%

ETH — 11.97%

WBTC — 29.93%

TVL (Total Value Locked) — 134 Million USD

*Data Source: Borrow APYs of the above tokens are recorded from respective websites, as of 2:00 a.m. September 28, 2020 (UTC)

Further, Wing is among the very few DeFi lending projects that has completed cross-chain integration with Ethereum assets, which has the potential to provide high yields for Ethereum assets outside of the current Ethereum ecosystem.

WING, Returning to a Governance Token

A token's price in the secondary market is often regarded as a major indicator of its value. The day WING went live on Binance, the token instantly hit an all-time high of $300, before gradually cooling to $20. Of course, this resulted in some doubting the token's future.

It’s likely that the recent hype in the DeFi market contributed to a $300 peak so quickly. These prices have, indeed, put both the Wing and Ontology teams under some pressure.

Comparisons can be drawn between WING quickly reaching a high of $300 and COMP's experience of reaching an almost instantaneous high of $2,000 upon its release. As newly-launched projects are yet to reach proper transaction depths, an all-time high upon release is not a good indicator of long-term price.

In the long run, the market price of a token should be based on its application value and users' enthusiasm around its growth potential. In relation to WING as a governance token, the governance power and potential returns are substantiated by the ever-increasing TVL (Total Value Locked) on the Wing platform, as well as the transaction fees derived from interest on borrowing, lending, and insuring. The Wing team believes that the value of the WING token will be more visible to community members when WING returns to its original position as a governance token.

Improving The Governance Model and Enriching Value

The Wing team greatly values feedback from the Wing DAO community.

In terms of next steps, Wing’s core development team will increase efforts in protocol implementation, product upgrades and improvements to user experience. In addition to further optimizing the Flash Pool UI, the Wing dApp will be released on ONTO, Ontology’s data wallet, at the end of September. In mid-October, a month after Wing’s official dApp launch, Wing.Finance will go through a structural upgrade that will facilitate the community governance framework and expand WING’s value in more dimensions.

As for product upgrades, in October, Wing’s core development team will submit a proposal to the Wing DAO community for a second lending pool incorporating credit, prompting Wing to venture into credit-based lending.

Empowering Wing with a Wing DAO-led Governance Strategy

As we have learned from Ontology’s positive growth history, a community-driven strategy is the cornerstone to a sustainable development of blockchain-based products. For this reason, the Wing community will remain our primary commitment in the next phase, and beyond.

Upon Wing's launch, a series of mining activities were initiated by the Wing team to ensure the steady growth of the ecosystem. Over the past 15 days, three proposals in the Wing DAO community were initiated with the aim of empowering Wing's growth with a community governance approach of raising public proposals and seeking a vote from the entire community for execution.

As the governance policies formed by the three Wing DAO community proposals came to fruition, it has been great to see that the community has grown accustomed to taking governance power into their own hands. As the governance model has made a positive impact, the Wing project will prioritize a community-driven strategy in the next stage of Wing's development. Meanwhile, we are more than happy to see community forums like https://gov.wing.finance/, where members can express their opinions, and the community can gather together to brainstorm ideas. Every bit of feedback from the community is of significance to Wing's growth.

A recent proposal suggested transaction fees from borrowing on WING should be bought back and burnt. The Wing team is currently looking into whether the proposal is technically feasible. If this is the case, the Wing team will initiate a fourth proposal for the community to make a final decision.

Following the Wing DAO-driven strategy, Wing will continue to adhere to an open and transparent governance structure, taking into account the community’s feedback to further improve the governance rules, simplify the process for member governance participation, perfect the governance protocol, and enable the protocol to be applied to more scenarios. We believe that as we progress, WING will be regarded as an increasingly valuable governance token.

Wing’s Strategic Significance to the Ontology Ecosystem

The success of DeFi projects requires ongoing commitment. Compound, a DeFi project released two years ago, spent two thirds of its life in stealth mode until the unprecedented yield farming spree that took place in June 2020. Uniswap provides us with another example of long-term users working together and remaining committed, ultimately culminating in bountiful returns.

As the first credit-based cross-chain DeFi platform, Wing also serves as the first key project in Ontology’s DeFi ecosystem. Ontology’s decentralized identity framework (ONT ID) and decentralized data protocol (DDXF) can support users in managing their own identities and data verification, while also integrating with smart contracts to enable automatic verification of credit data and credit evaluation. A technical infrastructure such as this combined with OScore, Ontology’s credit evaluation system, makes it intrinsically possible for Wing to integrate credit data into its products.

In the next stage, Wing can be fine-tuned to work as a crucial component of the Ontology DeFi ecosystem. We expect that more premium projects derived from Wing will emerge and enrich the ecosystem, in turn maximizing the project's significance and value in the DeFi sector.

While we continuously improve our underlying infrastructure, Ontology will prioritize the development of our communities around the world. We’d like to extend our thanks to every community member who presented us with suggestions for Wing, and those who participated in Wing DAO governance in the last month.

Ontology is committed to continuing our efforts to optimize the governance framework, product design, and user experience of Wing. We also plan to explore more innovative DeFi products in a bid to build a bridge between DeFi and traditional finance.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

A Letter to the Wing DAO Community from Jun Li, Founder of Ontology was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Otaka

Easy Session Sharing in Spring Boot with Spring Session and MySQL

Session management in multi-node applications presents multiple challenges. When the architecture includes a load balancer, client requests might be routed to different servers each time, and the HTTP session might be lost. In this tutorial, we will walk you through the configuration of session sharing in a multi-node Spring Boot application.

Prerequisites:

Java 8+ Docker Docker Compose

Table of Contents

Session Persistence
Session Sharing with Spring Session
Learn More about Spring Session and OAuth 2.0

Session Persistence

Session Persistence is a technique for sticking a client to a single server, using application layer information—like a cookie, for example. In this tutorial, we will implement session persistence with the help of HAProxy, a reliable, high performance, TCP/HTTP load balancer.

First, let’s create a web application with Okta authentication and run three nodes with HAProxy load balancing using Docker Compose.

Create a Maven project using the Spring Initializr’s API.

curl https://start.spring.io/starter.zip \
 -d dependencies=web,okta \
 -d groupId=com.okta.developer \
 -d artifactId=webapp \
 -d name="Web Application" \
 -d description="Demo Web Application" \
 -d packageName=com.okta.developer.webapp \
 -d javaVersion=11 \
 -o web-app.zip

Unzip the project:

unzip web-app.zip -d web-app
cd web-app

Run the Okta Maven Plugin to register a new account:

./mvnw com.okta:okta-maven-plugin:register

If you already have an Okta account registered, use login instead of register.

Then, configure your Spring application for authentication using Okta:

./mvnw com.okta:okta-maven-plugin:spring-boot

It will set up a new OIDC application for you and write your Okta settings to your src/main/resources/application.properties file.
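For reference, the Okta entries it writes are properties along these lines, shown here with the same placeholder convention used later in this post; your real issuer URL and credentials will differ:

okta.oauth2.issuer={issuer}
okta.oauth2.client-id={clientId}
okta.oauth2.client-secret={clientSecret}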

Create a GreetingController at src/main/java/com/okta/developer/webapp/controller:

package com.okta.developer.webapp.controller;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.security.oauth2.core.oidc.user.OidcUser;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.context.request.RequestContextHolder;

import java.net.InetAddress;

@Controller
public class GreetingController {

    private static final Logger logger = LoggerFactory.getLogger(GreetingController.class);

    @GetMapping(value = "/greeting")
    @ResponseBody
    public String getGreeting(@AuthenticationPrincipal OidcUser oidcUser) {
        String serverUsed = "unknown";
        try {
            InetAddress host = InetAddress.getLocalHost();
            serverUsed = host.getHostName();
        } catch (Exception e) {
            logger.error("Could not get hostname", e);
        }
        String sessionId = RequestContextHolder.currentRequestAttributes().getSessionId();
        logger.info("Request responded by " + serverUsed);
        return "Hello " + oidcUser.getFullName() + ", your server is " + serverUsed
                + ", with sessionId " + sessionId;
    }
}

Run the application with:

./mvnw spring-boot:run

Go to http://localhost:8080 in an incognito window and you should be redirected to the Okta sign-in page.

If you sign in, you will get a 404 error when you’re redirected back to your Spring Boot app. This is expected because there’s no controller mapped to the / endpoint. You can fix this if you want by adding a method like the following to your WebApplication.java.

package com.okta.developer.webapp;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.security.oauth2.core.oidc.user.OidcUser;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@SpringBootApplication
public class WebApplication {

    public static void main(String[] args) {
        SpringApplication.run(WebApplication.class, args);
    }

    @GetMapping("/")
    public String hello(@AuthenticationPrincipal OidcUser user) {
        return "Hello, " + user.getFullName();
    }
}

Now, let’s configure three Docker containers, one for each application node, and an HAProxy container. In the project root folder, create a docker/docker-compose.yml file, with the following content:

version: '3.1'
services:
  webapp1:
    environment:
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp1
    ports:
      - 8081:8080
  webapp2:
    environment:
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp2
    ports:
      - 8082:8080
  webapp3:
    environment:
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp3
    ports:
      - 8083:8080
  haproxy:
    build:
      context: .
      dockerfile: Dockerfile-haproxy
    image: my-haproxy
    ports:
      - 80:80
    depends_on:
      - "webapp1"
      - "webapp2"
      - "webapp3"

Create a docker/.env file with the following content:

OKTA_OAUTH2_ISSUER={issuer}
OKTA_OAUTH2_CLIENT_ID={clientId}
OKTA_OAUTH2_CLIENT_SECRET={clientSecret}

You can find the issuer, clientId, and clientSecret in the src/main/resources/application.properties, after running the Okta Maven Plugin. Remove the \ in the issuer’s URL after you paste the value. Also, make sure to remove the curly braces around the values.

Create a Dockerfile for the HAProxy container, at docker/Dockerfile-haproxy and add the following:

FROM haproxy:2.2
COPY haproxy.cfg /usr/local/etc/haproxy/haproxy.cfg

Create the configuration file for the HAProxy instance at docker/haproxy.cfg:

global
  debug
  daemon
  maxconn 2000

defaults
  mode http
  timeout connect 5000ms
  timeout client 50000ms
  timeout server 50000ms

frontend http-in
  bind *:80
  default_backend servers

backend servers
  balance roundrobin
  cookie SERVERUSED insert indirect nocache
  option httpchk /
  option redispatch
  default-server check
  server webapp1 webapp1:8080 cookie webapp1
  server webapp2 webapp2:8080 cookie webapp2
  server webapp3 webapp3:8080 cookie webapp3

I’m not going to dive deep into how to configure HAProxy but take note that, in the backend servers section, there are the following options:

balance roundrobin sets round-robin as the load balancing strategy.
cookie SERVERUSED adds a cookie SERVERUSED to the response, indicating the server responding to the request (see the example response header below). The client requests will stick to that server.
option redispatch makes the request be re-dispatched to a different server if the current server fails.
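For instance, the first response that passes through the balancer will carry a header along these lines, which the browser then sends back on subsequent requests. This is an illustration; the exact attributes depend on your HAProxy configuration:

Set-Cookie: SERVERUSED=webapp1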

Edit the pom.xml to add the Jib Maven Plugin to the <build> section to create a webapp Docker image.

<plugin>
    <groupId>com.google.cloud.tools</groupId>
    <artifactId>jib-maven-plugin</artifactId>
    <version>2.5.2</version>
    <configuration>
        <to>
            <image>webapp</image>
        </to>
    </configuration>
</plugin>

Build the webapp container image:

./mvnw compile jib:dockerBuild

Start all the services with docker-compose:

cd docker
docker-compose up

NOTE: If you get a URISyntaxException on startup, remove the \ in the issuer in application.properties.

HAProxy will be ready after you see the following lines in the logs:

haproxy_1 | [WARNING] 253/130140 (6) : Server servers/webapp2 is UP, reason: Layer7 check passed, code: 302, check duration: 5ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
haproxy_1 | [WARNING] 253/130141 (6) : Server servers/webapp3 is UP, reason: Layer7 check passed, code: 302, check duration: 4ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
haproxy_1 | [WARNING] 253/130143 (6) : Server servers/webapp1 is UP, reason: Layer7 check passed, code: 302, check duration: 7ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.

Before you can sign in to your application, you’ll need to go to your Okta developer console and add a Login redirect URI for http://localhost/login/oauth2/code/okta. Otherwise, you’ll get a 400 error in the next step. While you’re in there, add a Logout redirect URI for http://localhost.

In a browser, go to http://localhost/greeting. After you sign in, inspect the request cookie SERVERUSED. An example value is:

Cookie: SERVERUSED=webapp3; JSESSIONID=5AF5669EA145CC86BBB08CE09FF6E505

Shut down the current node with the following Docker command:

docker stop docker_webapp3_1

Refresh your browser and wait a few seconds. Check the SERVERUSED cookie to verify that HAProxy re-dispatched the request to a different node, and the sessionId has changed, meaning the old session was lost.

You can stop the services with CTRL+C.

Session Sharing with Spring Session

Storing sessions in an individual node can affect scalability. When scaling up, active sessions will remain in the original nodes and traffic will not be spread equally among nodes. Also, when a node fails, the session in that node is lost. With session sharing, the user session lives in a shared data storage that all server nodes can access.

Next, for a transparent failover with the redispatch option in HAProxy, let’s add session sharing between nodes with Spring Session. For this tutorial, I’ll show you how to use MySQL for storing the session.

First, add the following dependencies to the pom.xml:

<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.springframework.session</groupId>
    <artifactId>spring-session-core</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.session</groupId>
    <artifactId>spring-session-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>com.zaxxer</groupId>
    <artifactId>HikariCP</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>test</scope>
</dependency>

Rename src/main/resources/application.properties to application.yml, change your okta.* properties to be in YAML syntax, and add the following key-value pairs:

okta:
  oauth2:
    issuer: {issuer}
    client-secret: {client-secret}
    client-id: {client-id}
spring:
  session:
    jdbc:
      initialize-schema: always
  datasource:
    url: jdbc:mysql://localhost:3306/webapp
    username: root
    password: example
    driverClassName: com.mysql.cj.jdbc.Driver
    hikari:
      initializationFailTimeout: 0
logging:
  level:
    org.springframework: INFO
    com.zaxxer.hikari: DEBUG

In this example, you are using HikariCP for the database connection pooling. The option initializationFailTimeout is set to 0, meaning that if a connection cannot be obtained, the pool will start anyway.

You are also instructing Spring Session to always create the schema with the option spring.session.jdbc.initialize-schema=always.
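For orientation, the two tables Spring Session JDBC manages look roughly like the following. This is a simplified sketch of the schema that ships with spring-session-jdbc; the exact DDL is database-specific, so treat it as illustrative rather than something you need to run yourself:

-- Simplified sketch of the spring-session-jdbc schema (the actual DDL varies by database)
CREATE TABLE SPRING_SESSION (
    PRIMARY_ID CHAR(36) NOT NULL,
    SESSION_ID CHAR(36) NOT NULL,
    CREATION_TIME BIGINT NOT NULL,
    LAST_ACCESS_TIME BIGINT NOT NULL,
    MAX_INACTIVE_INTERVAL INT NOT NULL,
    EXPIRY_TIME BIGINT NOT NULL,
    PRINCIPAL_NAME VARCHAR(100),
    CONSTRAINT SPRING_SESSION_PK PRIMARY KEY (PRIMARY_ID)
);

-- Each session attribute is stored as a serialized blob keyed by name
CREATE TABLE SPRING_SESSION_ATTRIBUTES (
    SESSION_PRIMARY_ID CHAR(36) NOT NULL,
    ATTRIBUTE_NAME VARCHAR(200) NOT NULL,
    ATTRIBUTE_BYTES BLOB NOT NULL,
    CONSTRAINT SPRING_SESSION_ATTRIBUTES_PK PRIMARY KEY (SESSION_PRIMARY_ID, ATTRIBUTE_NAME),
    CONSTRAINT SPRING_SESSION_ATTRIBUTES_FK FOREIGN KEY (SESSION_PRIMARY_ID)
        REFERENCES SPRING_SESSION (PRIMARY_ID) ON DELETE CASCADE
);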

The application.yml file you just created contains the default datasource properties for the MySQL session storage. As the MySQL database is not up when the tests run, set up an in-memory H2 database so the application tests don’t fail.

Create a src/test/resources/application-test.yml file with the following content:

spring:
  datasource:
    url: jdbc:h2:mem:testdb
    username: sa
    password: passord
    driverClassName: org.h2.Driver

Modify the WebApplicationTests.java class to add the @ActiveProfiles annotation:

package com.okta.developer.webapp;

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;

@SpringBootTest
@ActiveProfiles("test")
class WebApplicationTests {

    @Test
    void contextLoads() {
    }
}

Modify docker/docker-compose.yml to add the database container and the admin application to inspect the session tables. The final configuration should look like the following:

version: '3.1'
services:
  webapp1:
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/webapp
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp1
    ports:
      - 8081:8080
    depends_on:
      - "db"
  webapp2:
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/webapp
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp2
    ports:
      - 8082:8080
    depends_on:
      - "db"
  webapp3:
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/webapp
      - OKTA_OAUTH2_ISSUER=${OKTA_OAUTH2_ISSUER}
      - OKTA_OAUTH2_CLIENT_ID=${OKTA_OAUTH2_CLIENT_ID}
      - OKTA_OAUTH2_CLIENT_SECRET=${OKTA_OAUTH2_CLIENT_SECRET}
    image: webapp
    hostname: webapp3
    ports:
      - 8083:8080
    depends_on:
      - "db"
  db:
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: webapp
    ports:
      - 3306:3306
  adminer:
    image: adminer
    restart: always
    ports:
      - 8090:8080
  haproxy:
    build:
      context: .
      dockerfile: Dockerfile-haproxy
    image: my-haproxy
    ports:
      - 80:80
    depends_on:
      - "webapp1"
      - "webapp2"
      - "webapp3"

Delete the previous containers and previous webapp Docker image with the following commands:

docker-compose down
docker rmi webapp

In the root folder of the project, rebuild the webapp docker image with Maven:

./mvnw compile jib:dockerBuild

Start all the services again (docker-compose up from the docker directory), and repeat the re-dispatch test (go to http://localhost/greeting, then shut down the active node with docker stop docker_webapp#_1). You might see a lot of connection errors until the database is up.

Now the session should be the same after changing the node. How cool is that?!

You can inspect the session data in the admin UI at http://localhost:8090. Log in with root and the MYSQL_ROOT_PASSWORD value that you set in docker-compose.yml.
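Once logged in, a query along these lines against the webapp database (assuming the simplified schema sketched earlier) shows the sessions shared by all three nodes:

-- List the shared sessions and the authenticated user each belongs to
SELECT SESSION_ID, PRINCIPAL_NAME FROM SPRING_SESSION;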

Learn More about Spring Session and OAuth 2.0

I hope you enjoyed this tutorial and could see the advantages of the session sharing technique for multi-node applications. You can find all the code for this tutorial in GitHub.

Know that there are multiple options for session storage—we selected a database because of the ease of setup—but it might slow down your application. To learn more about session management, check out the following links:

Spring Session
What's New with OAuth and OpenID Connect?
Build Single Sign-on in Java
HAProxy

If you liked this post, follow @oktadev on Twitter to see when we publish similar ones. We have a YouTube channel too! You should subscribe. 😊

We’re also streaming on Twitch, follow us to be notified when we’re live.

Thursday, 01. October 2020

KuppingerCole

Martin Kuppinger: Where ITSM is Heading – and the Impact on IAM

ITSM is going well beyond ITIL and IT ticketing these days: It’s becoming the portal and workflow platform

Not that long ago, ITSM (IT Service Management) was what the name means: a technology used within IT to manage IT services, facing the end user when it comes to IT requests. IT requests led to tickets, the tasks to be performed by workers in IT. And yes, there was and is ITIL (IT Infrastructure Library) describing common IT processes, there were and are Service Catalogs, and there were and are CMDBs (Configuration Management Databases).

However, this is changing. ITSM platforms are shifting from IT solutions to business solutions and becoming strategic tools for organizations, for service delivery (and thus service definition, service management, and so on) across a range of business functions. They have become a widely used interface for users to a wide range of services, and they support the workflows and process automation behind these interfaces.

With IAM providing interfaces and with workflows and processes being a vital part of every IAM, it is obvious that there is a logical link between IAM and ITSM (or the other way round).

Martin Kuppinger, Principal Analyst at KuppingerCole, in his talk will look at the journey of ITSM and where ITSM is heading. He will look at the overlaps and links between IAM and ITSM. And he will take a high-level perspective on where integration is expected to become deeper and where IAM capabilities might shift to ITSM, specifically in the context of IAM evolving from monolithic platforms to modern, microservice-based architectures that can well make use of existing ITSM services, microservices, and APIs.




Panel - Integrating IGA and ITSM - Key Benefits and Main Challenges




Warwick Ashford: Don’t Reinvent the Wheel – Align ITSM with IAM/IGA Instead

As ITSM platforms evolve into strategic tools for service delivery across a range of business functions, it is tempting for organizations to build in identity access management, governance, and administration functionality to provide a one-stop shop for all employee requests and eliminate the cost of a separate IAM/IGA system.

Warwick Ashford, senior analyst at KuppingerCole will explain why this is a risky strategy and discuss the benefits of and some use cases for aligning ITSM with IAM/IGA systems instead.




Jackson Shaw: Is it Time for an Identity Revolution?

Why have things like cell phones and automobiles become more advanced, intuitive and cost effective over time while managing Identity, particularly Identity Governance, has remained complex and expensive? The time and resources it takes to implement an identity project hinders the business and slows any hope of digital transformation. The frustration is real and ripping and replacing has not proved to be the answer. So what’s it going to take to truly get IGA right? In this thought-provoking session, Jackson Shaw, an experienced thought leader in IGA will discuss the need to rethink the core of identity and why it’s time for an IDENTITY REVOLUTION.




Jessica Constantinidis: Who Are Your Customers, and Why Do They Need You or Your Product? Find the Real Business Goal You Need to Achieve by Thinking Differently

Today, if you want to respond to new competitors or communicate with people who might buy your product, you must refresh your view of who your internal and external customers are and how you can reach them. Similar questions apply to all business models: What are you doing today to make a difference? How can you optimize in ways that match our new reality? Is agility at your business core? Nowadays, nobody can afford to make assumptions, as time and money are restricted, and most are in a cost optimization phase. However, with the right focus and the right lens, it might actually be the innovation break your company needed. Let's see how to make lemonade in 2020.




Pavel Volosen: An Implementer’s Perspective to Traditional Identity Access Governance (IAG) vs Identity Access Governance (IAG) on ServiceNow

This session will compare and contrast characteristics of Identity Access Governance built on traditional platforms with those built on top of ServiceNow, taken from a field perspective. The session will review implementation costs, common outcomes, and ultimately how to decide which is the most appropriate solution based on business needs.




Interview with Jackson Shaw




Gillan Ward: The Use of Real World Evidence (and Identities) in Support of Identity and Access Management

Central to the ability to identify, authenticate, and authorise individuals and allow them access to resources is validating that someone is who they claim to be, that they possess the requisite academic or professional qualifications, work experience, and skills, and understanding their competency within any given skill. I.e., I may have a driving licence which allows me the right to drive, but if, subsequent to the test, I have never had the opportunity to drive, my competency will be almost non-existent. And of course, ensuring the binding of the identity throughout the lifecycle of an individual to the claimed identity, from onboarding through operation and eventual retirement, along with the credentials I've just highlighted.




Todd Wiedman: IGA with the Power of ServiceNow

With all of the different IGA approaches available these days, have you ever wondered how global companies have success in centrally and seamlessly managing their mountains of requests while still maintaining critical workflows and compliance standards? Get the strategies you need to navigate and win from Todd Wiedman, Chief Information Security Officer, Landis+Gyr. Todd will be sharing insights and learnings from his successful implementation using the Clear Skye IGA solution natively running on the ServiceNow (NOW) Platform in this ‘not to be missed’ session.




Forgerock Blog

ForgeTalks: What are Containerized Directory Services?

Welcome back to another episode of ForgeTalks. This week we tackle how to help organizations prepare for unexpected spikes in consumer demand. I sat down with ForgeRockers Jeff Carpenter, director of product marketing, and Ludovic Poitou, director of product management, to discuss the importance of scalability for millions of identities. They explained how our Containerized Directory Services can help you handle massive transaction volumes and millions of identities at thousands of transactions per second.

We'll be answering questions like: 

What role does Directory Services play in identity?
What are the risks of ignoring Directory Services?
How is ForgeRock enabling a secure and reliable transition to the cloud with Containerized Directory Services?

I hope you enjoyed this episode of ForgeTalks. And if you want to check out any of our other episodes you can do so here.


KuppingerCole

The Story of Edge AI

by Alexei Balaganski

Whether you are a firm believer in the bright future of Artificial Intelligence or somewhat of a skeptic like me – you simply cannot ignore the great strides AI technologies have made in recent years. Intelligent products powered by machine learning are everywhere: from chatbots to autonomous vehicles, from predicting consumer behavior and detecting financial fraud to cancer diagnostics and crop harvesting. There is however a major factor limiting even further applications of AI and ML in almost any industry: AI algorithms are very computationally intensive and until quite recently, prohibitively expensive to operate “on-premises”.

The rise of cloud computing has given a major boost to AI technologies in recent years, and to this day, it is the cloud that powers the majority of AI applications, both for consumers and for enterprise applications. However, what works perfectly well for business analytics, language translation services or, say, virtual assistants, might be not enough for more mission-critical scenarios.

The Challenges of AI in the Cloud

The biggest challenge for cloud-based AI is connectivity: if a device is only occasionally connected to the Internet, it won't be able to utilize cloud-based machine learning services efficiently. For real-time scenarios like self-driving cars, even a few additional milliseconds of network latency can be a deal-breaker. Another major problem of cloud-based AI is the sensitivity of the information that has to be sent to the cloud for processing. Companies operating under strict regulations (which nowadays includes anyone dealing with personally identifiable information) simply cannot afford the risk of a security breach or a compliance violation fine.

These problems are nothing new, of course – they've been the biggest obstacle to adopting any cloud services for over a decade already. To address this challenge and to expand their customer base to these "not cloud-ready" companies, cloud service providers have been pushing the concept of Edge Computing. At face value, edge computing simply means bringing computing and storage resources closer to the customer's geographical location to reduce network latency. This started over 20 years ago with CDNs (content delivery networks) but has expanded to mobile phones, IoT devices, and network gateways.

Nowadays, these devices are powerful enough to implement a substantial part of the cloud service stack locally, creating a distributed computing platform between corporate networks and clouds, dramatically improving response rates and reducing the strain on the core cloud infrastructure. It is important to understand that this trend does not by any means undo the achievements of the cloud model. Edge computing isn’t returning to on-prem data centers; it is simply another phase in the evolution of the cloud. Even the name itself implies it: the edge, of course, refers to the cloud, its proverbial “silver lining” …

In a sense, edge computing is an alternative approach towards hybrid architectures, one that tries to blur the massive divide in on-prem and cloud technology stacks and to expand the reach of cloud service providers even deeper into their customers’ networks. Bringing AI capabilities closer to their consumers is just another logical step in that direction.

The Future of the Intelligent Edge

It is well known that preparing data for machine learning and training of ML models is the part of AI that requires the most computing resources. The actual inference (i.e., putting a model to productive use) is much lighter. Optimizing ML models for running on edge devices is one of the primary methods of decoupling them from the cloud. Solutions like AI-powered antiviruses that only require updates every six months are already a reality. Simple image recognition can run directly on your mobile phone, helping you take better pictures of your food.

More sophisticated models require specialized hardware to run. Major semiconductor manufacturers like Intel and Nvidia and even cloud service providers themselves like AWS have specialized AI chips in their portfolios; smaller companies are offering solutions for developers to simplify and optimize the usage of this hardware. In other words, a whole ecosystem is being born, bringing AI/ML closer to customers and enabling usage scenarios previously impossible.

The field is still far from mature, however. Perhaps the best indication of this is a lack of standardization developments and a very fragmented market, where very few vendors can deliver a full technology and service stack to satisfy different use cases. Will we see the emergence of a standardized abstracted architecture similar to VMware or Kubernetes for traditional computing? Perhaps: companies like Run:AI are already working on AI orchestration platforms.

One thing is clear though: cloud AI isn’t going anywhere. For the foreseeable future, we’re going to deal with a multitude of hybrid architectures. History has a tendency to repeat itself, after all.


One World Identity

Omidyar Network: Good ID

Omidyar Network Beneficial Technology Principal Govind Shivkumar joins State of Identity to share the story of his venture investments in identity, and the future of digital identity itself. Additionally, we discuss the Omidyar Network's Good ID program, the potential for responsible biometrics, and why incorporating privacy and security by design into identity infrastructure is so critical.


PingTalk

How Freddie Mac Overhauled Its Workforce Identity and Access Management

The Federal Home Loan Mortgage Corporation, better known as Freddie Mac, is the backbone of the United States housing sector. The government-sponsored public enterprise has provided more than $10 trillion to help more than 67 million homeowners and 11 million renters establish their homes since its inception 50 years ago, fulfilling its mission to provide liquidity, stability and affordability to the U.S. housing market.

Recently, the Freddie Mac identity team determined it was time to modernize their legacy web access management (WAM) infrastructure. The system was nearing end of life and presented numerous security vulnerabilities that weren’t easily remedied. But given the number of resources involved and the 400+ applications that had to be migrated, the switch to a modern identity and access management (IAM) platform would be a massive undertaking.
 

Wednesday, 30. September 2020

KuppingerCole

How Security and Identity Fabrics Work to Help Improve Security

Many organizations struggle or even fail because they overcomplicate the implementation and extension of their cybersecurity toolset. Most do not have a central approach on security, and often use a set of tools that are not well-integrated with each other.




Ontology

Ontology Monthly Report — September 2020

Ontology Monthly Report — September 2020

September 2020 has been one of Ontology’s most successful months to date, as we continue our endeavor to accelerate the DeFi movement. Some highlights include:

Wing, the first credit-integrated DeFi project based on the Ontology blockchain, released its whitepaper
ONT and ONG are now listed on Uniswap
Three votes completed by the Wing DAO
The launch of Wing's Flash Pool Mining Celebration, through which users can gain up to ten-fold increases in WING rewards
Multiple swaps now listed on Wing

You can find more detailed updates below.

中文

繁體中文

한국어

日本語

русский

Tiếng Việt

Tagalog

বাংলা

slovenský

සිංහල

हिंदी

Español

Development Progress
MainNet Optimization
- Ontology v2.1.1 Alpha released
- Ontology GraphQL interface development completed
- Rust Wasm contract development hub released ontio-std v0.4

Product Development
ONTO
- ONTO v3.3.0 and v3.4.0 launched
- Integration with TRON and Polkadot completed
- Upgraded credential functions and asset score functions, and updated the asset score tutorial and credential tutorial
- Supported access to Wing from ONTO and depositing ONT on Wing through ONTO. ONT deposited from ONTO now takes up 55% of the total amount
- New ONTO users increased 3000% from last month’s newcomers

dApp
- 82 dApps launched in total on MainNet
- 6,120,263 dApp transactions completed in total on MainNet

Community Growth
- We have onboarded 1,253 new members across Ontology’s global communities, with a noticeable growth in our Vietnamese, Trading, Swedish, and Tagalog communities.

Bounty Program
- We are seeking SDK developers from our community.
- We are collecting suggestions for new bounties.
- 518 applications, 5 new additions to existing bounties.
- 38 tasks, 50 teams in total: 31 teams have finished their tasks and have received rewards in return, while 19 teams are still working on tasks.

Latest Release
- Wing, the first credit-integrated DeFi project based on the Ontology blockchain, has released its first whitepaper.
- Ontology’s digital assets, ONT and ONG, are now both listed on Uniswap, having completed mapping to the Ethereum blockchain platform.
- Ontology continued performance testing with bloXroute, with a focus on BDN performance during periods of slow internet. Ontology improved the speed of block propagation, block recovering speed, and transaction stream speed, and we also announced a partnership with Chainstack.
- Starting from September 8th, ONT can now be deposited in Wing Flash Pool. Flash Pool started releasing WING tokens as incentives for mining in the genesis pool on September 12th. Users can participate in Flash Pool by using Cyano Wallet or ONTO Wallet.
- By September 15th, ONT worth over 100,000,000 USD had been deposited in Wing. The Wing project offers a collateral rate that is significantly lower than similar platforms.
- On September 17th, Wing DAO, the Wing community, initiated its first vote. As a result, the WING reward distribution rate changed to two times, compared with the previous ten times, effective from 00:05, September 18th.
- From September 18th to 29th, through the Flash Pool Mining Celebration, users who deposited their WING in the Wing DAO Supply Pool would receive a ten-fold increase in rewards between September 18th and 24th, and a five-fold increase in rewards between September 25th and 29th.
- On September 20th (UTC), the WING/EWING swap was launched in Flash Pool of Wing, with the aim of increasing the project’s liquidity. Also, multiple swaps were listed on Wing, including ETH/pETH, DAI/pDAI, and WING/pWING.

Events

From September 2nd-4th, Ontology co-organized the event 'Cointelegraph China's DeFi Marathon'. On September 2nd, Jun LI, Founder of Ontology, addressed the audience during "Ontology: Empowering DeFi with Credit" and joined a panel discussion themed "Public Chain's Choice amid DeFi Pressure". He stated, "By empowering DeFi with credit, we are poised to build the next 'Super-Oracle'". In addition, during the event, Erick Pinos, Americas Ecosystem Lead, engaged in a discussion with the lead of Aave, as well as other renowned projects.

On September 4th, Jun LI participated in a live AMA panel hosted by HyperPay on the topic of "Braving the DeFi-spurred Storm: Where Will Public Chains Head". During the panel discussion, he unveiled Ontology's blueprint for DeFi and briefed community users on the innovative and transformative features of Wing.

On September 18th, Jun LI took part in a panel discussion themed "Prospects of Decentralized Finance" at the 2020 International Fintech Innovation Conference (IFIC) in Shanghai. He stated that if Ontology wishes to stand out in the DeFi industry while avoiding the transaction-fee issues faced by Ethereum, it needs to work harder on its strategy to support different fees in scenarios of different technical complexities.

On September 3rd, Kendall MAO, Dean of Ontology Institute, discussed the dilemma posed by DeFi to public chains at a SheKnows live panel organized by 8BTCnews, along with representatives from Bytom Blockchain. Kendall mentioned, "Wing, Ontology's latest cross-chain DeFi project, is integrated with the element of credit. We believe that a dynamic integration with the OScore system will create new paradigms for DeFi innovations, thus opening up new opportunities for Ontology."

Recruitment
- Solution Architect
- Business Sales
- Global Development Manager
- Global Marketing Manager
- Social Media Associate
- Product Manager
- Product Operations

New Team Members
- 1 Marketing Intern

Contact Us
- Contact us: contact@ont.io
- Apply now: careers@ont.io

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Monthly Report — September 2020 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Self Key

KuCoin Hack: SelfKey Official Statement

SelfKey’s official statement regarding the recent KuCoin security incident.

The post KuCoin Hack: SelfKey Official Statement appeared first on SelfKey.


Smarter with Gartner - IT

How to Combat Marginalizing Behaviors in the Workplace

From taking credit for someone’s idea to pet names, marginalization can take many forms in the workplace. A set of small actions that individually are annoying, but not egregious, can create a “death by 1,000 cuts” phenomenon — they collectively add up to an environment that does not feel inclusive to some members.

Kasey Panetta, Gartner Senior Content Marketing Manager, interviews Christie Struckman, VP Analyst, to talk about confronting behaviors that marginalize underrepresented employees. Christie explains that any employee can marginalize or be the target of marginalization and once these behaviors are recognized, they must be confronted.

Listen to interview

For the full interview, listen to the podcast below or read the transcript that follows, which has been edited for clarity and length.

What is marginalization, and what does it look like in the workplace?

Marginalization is when someone feels that their contribution, idea, or specific recommendation is not valued, or that they as an individual are being devalued because of a demographic characteristic they cannot change. An example is when women feel that simply because they are female, they are not valued by their peers or are treated differently. I'll give an example of three types of marginalizing behaviors.

Unequal personality trait assessment

Let's say that we were in a meeting with three male cohorts, and as a group getting really frustrated because we were struggling to make a decision. It felt like we were caught in this sort of endless debate. And one of our male cohorts slammed his fist on the table and said something like, ‘Come on, we have to get our act together and make a decision. Let's move forward.’

The assessment would be that he is trying to take charge and move forward. But if a woman had done that exact same thing, she would be assessed as exhibiting the behaviors of a “B word.” So think about how uncomfortable it is when you feel that people think that you are a B when you're trying to move the conversation forward, like your other male cohorts have done on occasion.

Lack of confidence assumption

This occurs when there is a difference between what we think confidence looks like from men and from women. A woman who chooses to be a little bit more quiet in how she approaches things might come across as somebody who's not as competent. So there's this linkage between how confidence is exhibited and therefore somebody's competence, but confidence can look different.

Confidence can look different between men as well, of course, but there tend to be natural differences between men and women. And so that difference, where how we exhibit confidence is correlated to our perceived competence, can be very frustrating.

Taking credit

For example, in a meeting someone volunteers an idea that doesn't get much discussion, but five minutes later another team member offers up the exact same suggestion, not crediting the person who made it first, but instead presenting it as their own and taking the credit.

Read more: Making the Case for Diversity, Equity and Inclusion During Disruption

Do people see this as an HR problem to solve? Do they distance themselves from it at all?

Yes, and I think organizational processes encourage that because it then gets tied to their performance assessments. But when you do that, there's this huge separation between when the behavior happens and then the HR processes.

And I want to be really clear: The HR processes are necessary, but there's a lag. I refer to marginalization as a death by 1,000 cuts — no one instance in and of itself is going to inspire someone to reach out to HR and say, “I feel like I'm being marginalized.” But collectively, they create an environment where people don’t feel supported.

What is psychological safety? Why is it so important?

Psychological safety means creating an environment that encourages, recognizes and rewards individuals for their contributions and their ideas and makes individuals feel safe enough that they'll take interpersonal risks. The biggest contributor to psychological safety is the relationship between employees and their managers.

Those employees who feel they can have an open and honest conversation may be more comfortable bringing up work challenges. And it is an important part of the management role to create that environment, which is why I've put together five steps managers can take in the moment when they see a marginalizing behavior.

What are the 5 steps to combat marginalization?

1. Recognize the behaviors. And here's the hardest part. If you're not being marginalized, you might not recognize it. So part of the research was going through and creating those 12 behaviors. If you can pay attention to these, you will be making a difference. I gave the examples of taking credit, the unequal personality trait assessment and the lack of confidence assumption, but there are also pet names, tokenism, sexist statements and “mansplaining,” or what I call overexplaining.
2. Address those behaviors publicly. You have to do it in the moment. When the situation occurs, quickly call it to attention. It doesn’t need to be a long lecture, but it signals to the marginalized employee that it was noticed.
3. Coach privately. If you're managing the employee it’s easier because you can have a conversation. If not, perhaps approach the employee’s manager to share the situation. Don’t assume why the employee behaved that way — it could have been an oversight. Often, once a manager starts paying attention to the behaviors, they'll realize that it has actually been happening all along, so it’s important to coach the employee privately to change the behavior.
4. Support the employee who was marginalized privately. Managers need to demonstrate empathy and spend a little time with the employee so that they feel recognized. There is a chance that the employee didn’t feel marginalized, but it’s better to have the conversation so they feel that the manager is looking out for them.
5. Affirm the commitment to inclusion publicly. This doesn't have to be done in the moment, but reaffirm that you care about inclusivity at some point shortly thereafter. This could take the form of reminding people about employee resource groups.

I'm a big proponent of what I call diversity and inclusion norms. These norms are the socially accepted ways that we're going to work with each other. The value of norms is letting people know what is expected and giving us a language for calling somebody out when they're not following them. Diversity and inclusion norms can be ones like seek to understand, take turns, listen generously and remember that words matter.

Learn more: Embed Greater Diversity, Equity and Inclusion in Your Organization

What other mistakes do you commonly see when it comes to dealing with marginalization?

I find two big mistakes. One is making an attribution about the person who exhibited the marginalizing behavior. Keep it to what you heard. For example: ‘Christie stated that she's worried we're not listening to each other.’ That's not an attribution about why the behavior happened; it's an observation about the behavior. I find that when you keep the conversation to observed behaviors that really can't be debated, it helps the conversation move forward.

The second mistake is assuming the impact to the woman. That's why the coaching privately is important to understand how that woman is feeling. Did she feel marginalized? Is this a pattern? Maybe this is happening in many more places. That lets you know where you need to spend more time so that you can help to confront and ferret it out. Or maybe the woman doesn't feel marginalized. And so it’s important to keep the conversation open and allow employees to have the freedom and the safety to open up.

Are there any final thoughts that you would like to share with our audience?

When I use the language “confront behaviors,” I think that's very uncomfortable for leaders, especially if it's around marginalizing. So I expect that this is an uncomfortable proposal, but I've had many conversations with clients who told me that it was amazing how impactful confronting a behavior once or twice was. And my assumption is that most of these behaviors are a lack of attention versus an intention to hurt somebody's feelings.

The good news is that while it might be uncomfortable, I haven't come across clients who said that they felt like they had to continually confront. And then frankly, what it also does is it narrows down where you might have some issues where you need to get HR involved.

The post How to Combat Marginalizing Behaviors in the Workplace appeared first on Smarter With Gartner.


MyKey

The market capitalization of stablecoins reached $20.62 billion, Stablecoin regulatory policy of the United States

The market capitalization of stablecoins reached $20.62 billion, Stablecoin regulatory policy of the United States

Original link: https://bihu.com/article/1915489812

Original publish time: September 30, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We publish the MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and our analysis of their development trends, helping participants in the crypto market stay up to date. The MYKEY Crypto Stablecoin Report is published every week, and we look forward to maintaining communication with the industry and exploring the development prospects of stablecoins together.

Quick Preview

- The market capitalization of major stablecoins has increased by $757 million to $20.62 billion.
- Last week, the circulation of USDT, USDC, and DAI increased by 150 million, 334 million, and 213 million respectively.
- The OCC issued regulations on stablecoins for the first time, confirming that national banks can provide custody services for fiat-collateralized stablecoins. Banks have the right to hold reserve deposits for certain ‘stablecoins’, meaning stablecoins collateralized 1:1 by a single fiat currency. The bank verifies at least once a day that the balance in the reserve account is greater than or equal to the amount of stablecoins outstanding with issuers.
- FinHub encourages all parties involved in the construction and sale of digital assets to contact FinHub through its official website to help ensure that such digital assets comply with the federal securities laws. Where appropriate, a ‘no action’ position will be considered.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins over the past week (September 19, 2020 to September 25, 2020; the same applies below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market capitalization of major stablecoins has increased by $757 million to $20.62 billion.

Source: MYKEY, Coin Metrics

In the past week, Tether issued an additional 150 million USDT on Ethereum. The circulation of USDC, DAI, BUSD, TUSD, and GUSD increased by 334 million, 213 million, 64.98 million, 490,000, and 570,000 respectively. The circulation of PAX and HUSD decreased by 1.09 million and 4.51 million.
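As a quick sanity check, these per-coin changes net out to the $757 million increase in total market capitalization reported above. A minimal sketch (all figures in USD millions, taken from the paragraph above):

```python
# Reconcile the per-coin weekly circulation changes with the reported
# ~$757 million net increase in total stablecoin market capitalization.
# All figures are in USD millions.
weekly_changes = {
    "USDT": 150.0,   # additionally issued on Ethereum
    "USDC": 334.0,
    "DAI": 213.0,
    "BUSD": 64.98,
    "TUSD": 0.49,
    "GUSD": 0.57,
    "PAX": -1.09,    # circulation decreased
    "HUSD": -4.51,
}

net_change = sum(weekly_changes.values())
print(f"Net weekly change: ${net_change:,.2f} million")
# Net weekly change: $757.44 million, in line with the reported $757 million
```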

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of main stablecoin holding addresses on Ethereum increased by a total of 12,892.

Source: MYKEY, DeBank

The number of holding addresses of USDT, USDC, TUSD, and DAI increased by 4,714, 4,790, 329, and 3,311 respectively. The number of holding addresses of PAX decreased by 232.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week increased by an average of 8.98% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins increased by an average of 3.1%.

The Number of 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins last week increased by an average of 0.74% from the previous week.

2. Stablecoin Regulatory Policy of the United States

As early as June, the OCC issued an ‘advance notice of proposed rulemaking’, asking the public to weigh in before August 3 on how cryptocurrency and other financial technology tools could be used in the financial sector. Several banks indicated that they may be interested in providing crypto custody and other services to customers. In response, Dominic Venturo, the chief digital officer of the National Association of American banks, said that the OCC should distinguish between utility tokens, stablecoins, and exchange platform tokens.

Last week, the OCC issued regulations on stablecoins, confirming that national banks can provide custody services for fiat-collateralized stablecoins. This is the first time the OCC has issued regulations regarding stablecoins. The OCC is a federal agency responsible for overseeing the implementation of laws relating to national banks and for supervising national banks and the federal branches and agencies of foreign banks. Subsequently, the SEC FinHub staff stated that it could consider adopting a ‘no action’ position on specific digital asset activities under appropriate circumstances.

The OCC released regulations on stablecoins for the first time

On September 21, the OCC issued a press release and an accompanying interpretive letter stating that banks have the right to hold reserve deposits for certain ‘stablecoins’. The stablecoins mentioned here are specifically those collateralized 1:1 by a single fiat currency, not stablecoins backed by commodities, cryptocurrencies, or other assets. The demand for stablecoins is increasing day by day, with multiple applications, such as large-scale transfers, that have great potential. The two technologies underpinning cryptocurrency, encryption and distributed ledger technology, are developing rapidly.

National banks have clear authorization to accept deposits, which is also a core banking business. Certain stablecoin issuers may wish to deposit the cash reserves backing their stablecoins with national banks. As long as stablecoin issuers can effectively manage risks and comply with relevant laws (including the Bank Secrecy Act and anti-money laundering laws), banks can provide services for their legitimate businesses, including cryptocurrency businesses. Therefore, banks may take deposits from stablecoin issuers, including stablecoin reserve deposits related to custodial wallets.

Like other deposits, stablecoin reserves are subject to laws and regulations on deposit insurance coverage. A stablecoin reserve account can hold the deposits of a stablecoin issuer or of individual stablecoin holders. Banks and federal savings associations should provide accurate and appropriate disclosures on deposit insurance coverage and ensure compliance with the BSA and other regulations, including but not limited to customer due diligence under the BSA, customer identification requirements under the Patriot Act, requirements under the federal securities laws, and identification of the beneficial owners of the accounts opened.

Reserves related to stablecoins may carry significant liquidity risks. New banking activities should be developed and implemented under sound risk management principles, and banks should establish appropriate risk management procedures when developing new businesses. The OCC expects all banks to manage liquidity risks at a level commensurate with the risks they assume. The agreement between the bank and the stablecoin issuer may include restrictions or requirements on holding assets in the reserve account, and may stipulate the responsibilities of the parties. For example, the bank should sign an appropriate agreement with the issuer to ensure that the balance in the reserve account is greater than or equal to the amount of stablecoins outstanding with the issuer. The agreement should include a mechanism for the bank to periodically verify the amount of outstanding stablecoins.

To summarize, national banks are authorized to accept stablecoin reserve deposits, subject to the following restrictions:

Only stablecoins collateralized 1:1 by a single fiat currency are supported.
The bank verifies at least once a day that the balance in the reserve account is greater than or equal to the amount of stablecoins outstanding with issuers (see the sketch after this list).
Banks should establish appropriate risk management procedures for the development of new businesses.
Banks must comply with all applicable laws and regulations.
Banks must establish appropriate control measures and conduct adequate due diligence.
The bank must confirm the beneficial owners of the accounts opened.
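To make the daily verification requirement concrete, here is a minimal sketch of the balance check described above. It is illustrative only: the function and the figures are hypothetical, not from the OCC letter, and a real implementation would pull the reserve balance from the bank's ledger and the outstanding supply from the issuer's attestation or on-chain data.

```python
from decimal import Decimal

def reserves_cover_supply(reserve_balance: Decimal, outstanding_supply: Decimal) -> bool:
    """Return True if the reserve account fully covers the outstanding stablecoins.

    Mirrors the requirement summarized above: for a 1:1 fiat-collateralized
    stablecoin, reserves must be greater than or equal to the outstanding supply.
    """
    return reserve_balance >= outstanding_supply

# Hypothetical daily check with made-up figures (USD).
reserve_balance = Decimal("20620000.00")      # cash held in the reserve account
outstanding_supply = Decimal("20500000.00")   # stablecoins in circulation

if not reserves_cover_supply(reserve_balance, outstanding_supply):
    raise RuntimeError("Reserve shortfall: outstanding stablecoins exceed reserves")
```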

The Statement of SEC

After the OCC issued the interpretive letter, the SEC FinHub staff issued a statement.

Under the federal securities laws, whether a specific digital asset (including a stablecoin) should be considered a security depends on the facts and circumstances. The nature of the asset needs to be carefully analyzed, including the rights it conveys to holders and how it is offered and sold.

SEC FinHub believes that, as long as the relevant registration, reporting, and other requirements of the federal securities laws are met, market participants may structure and sell digital assets in a way that does not constitute a securities offering. However, the terms used to describe digital assets or financial activities involving digital assets may not be consistent with the SEC’s definitions in relevant laws and regulations. FinHub encourages all parties involved in the construction and sale of digital assets to contact FinHub through its official website to help ensure that such digital assets comply with the federal securities laws. FinHub staff will always be ready to assist participants. Where appropriate, a ‘no action’ position will be considered.

Tips

To better communicate with industry insiders, we have decided to add two sections: reader questions and guest opinions. If readers have questions about stablecoins, please contact us; we will pick meaningful questions to answer in the next issue. At the same time, we welcome guests from the industry to share their views on stablecoins. Contact information: jianghb@mykey.org.

That’s all for this MYKEY Crypto Stablecoin Report; stay tuned for follow-up reports. Starting from the next report, the format will change from weekly to monthly, meaning the 21st stablecoin report will cover the development of stablecoins in October 2020. In follow-up reports we will provide more interpretation of the development status of stablecoins and analysis of their trends to help readers stay up to date.

PS: MYKEY Lab retains the final right of interpretation over the content of this article. Please indicate the source when quoting. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

Crypto Stablecoin Report 16: The connection between stablecoins and real assets

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 billion, Decentralized payment protocol Celo

The market capitalization of stablecoins increased to $18.53 billion, The rise of CBDC

The market capitalization of stablecoins reached $19.86 billion, The latest development of several decentralized stablecoin protocols

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

The market capitalization of stablecoins reached $20.62 billion was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 29. September 2020

KuppingerCole

The Evolution of Encryption: Getting Ready for the Quantum Watershed

The relentless move towards digital transformation seems unstoppable. Organizations must secure their trust and reputation in the face of increasing attacks, advances in technology, increased regulation and compliance, and the continued shift to the cloud and cloud services. Unfortunately, not a single week goes by without another large-scale data breach or leak reported in the media – it seems that a company of any size or industry can fall victim to insufficient or ineffective data protection controls.




Global ID

The GiD Report#128 — What the Apple App Store backlash tells us about this moment in time

The GiD Report#128 — What the Apple App Store backlash tells us about this moment in time

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

What we have for you this week:

The era of David v. Goliath
The era of moderation
Stuff happens

1. I’ll just talk about two main things this week — both of which speak to the moment that we are currently in. The first is the Apple App Store backlash.

It’s worth noting — it’s still pretty unclear how any of this plays out. Apple is in a strong position in terms of its loyal user base, the quality of its products and ecosystem, and also possibly in the eyes of the law (disclaimer: I am not a lawyer). It’s Apple’s App Store and Apple gets to call the shots.

Plus, I’ve been an iPhone user since the 5S and I don’t see myself switching anytime soon. (Not that it would make a huge difference since Google — aside from allowing you to access apps outside the Play Store, essentially follows the same type of model.)

And Apple has taken criticism before from disgruntled developers.

But what all of this shows is that this time, it’s different.

Photo: Fortune Global Forum

Whatever you want to call it — the establishment, the status quo, Big Tech monopolies — people have never been more distrustful and/or critical of the large institutions that rule over us. As one of the world’s largest multinational corporations with a market cap of ~$2 trillion, Apple easily falls into that camp.

That distrust has been brewing for some time, but for the most part, I’m not sure people had a real sense of belief that things could change. It’s Apple. What are you going to do about it?

That’s what makes this moment different — the growing sense of belief that change is suddenly possible. It’s not just about Apple and the App Store, either. You don’t have to look around very much to see that this moment applies to everything — our platforms, our politics, our society in general.

Epic Games founder Tim Sweeney certainly senses the moment. His latest coalition is a call to arms.

The NYTimes reports:

For months, complaints from tech companies against Apple’s and Google’s power have grown louder.
Spotify, the music streaming app, criticized Apple for the rules it imposed in the App Store. A founder of the software company Basecamp attacked Apple’s “highway robbery rates” on apps. And last month, Epic Games, maker of the popular game Fortnite, sued Apple and Google, claiming they violated antitrust rules.
Now these app makers are uniting in an unusual show of opposition against Apple and Google and the power they have over their app stores. On Thursday, the smaller companies said they had formed the Coalition for App Fairness, a nonprofit group that plans to push for changes in the app stores and “protect the app economy.” The 13 initial members include Spotify, Basecamp, Epic and Match Group, which has apps like Tinder and Hinge.
“They’ve collectively decided, ‘We’re not alone in this, and maybe what we should do is advocate on behalf of everybody,’” said Sarah Maxwell, a spokeswoman for the group. She added that the new nonprofit would be “a voice for many.”

What the coalition is fighting for:

At the heart of the new alliance’s effort is opposition to Apple’s and Google’s tight grip on their app stores and the fortunes of the apps in them. The two companies control virtually all of the world’s smartphones through their software and the distribution of apps via their stores. Both also charge a 30 percent fee for payments made inside apps in their systems.
App makers have increasingly taken issue with the payment rules, arguing that a 30 percent fee is a tax that hobbles their ability to compete. In some cases, they have said, they are competing with Apple’s and Google’s own apps and their unfair advantages.
On Thursday, the coalition published a list of 10 principles, outlined on its website, for what it said were fairer app practices. They include a more transparent process for getting apps approved and the right to communicate directly with their users. The top principle states that developers should not be forced to exclusively use the payments systems of the app store publishers.
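To put that 30 percent in concrete terms, here is a quick back-of-the-envelope sketch; the comparison rate for conventional card processing (roughly 2.9% plus $0.30 per transaction) is an assumed benchmark, not a figure from the article:

```python
# What a developer keeps on a $9.99 in-app purchase under the app stores'
# 30% fee versus a typical card processing fee (assumed ~2.9% + $0.30).
price = 9.99

app_store_cut = price * 0.30           # about $3.00 to Apple or Google
card_cut = price * 0.029 + 0.30        # about $0.59 to a card processor

print(f"After the 30% app store fee: ${price - app_store_cut:.2f}")  # $6.99
print(f"After card processing:       ${price - card_cut:.2f}")       # ~$9.40
```

The gap between those two numbers is the “tax” the app makers are complaining about.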

Reality check: At best, this is simply the beginning of a movement that is fundamentally a story of David v. Goliath. Despite the success of Fortnite, Apple is still 200x the size of Epic Games. And more recently, Apple has centered the narrative of the company’s future around “services.” Expect Tim Cook to stand his ground.

And the Goliaths of the world have been steadfast and reaping the rewards for some time.

But like I said, this time, it’s different:

Scrutiny of the largest tech companies has reached a new intensity. The Department of Justice is expected to file an antitrust case against Google as soon as next week, focused on the company’s dominance in internet search. In July, Congress grilled the chief executives of Google, Apple, Amazon and Facebook about their practices in a high-profile antitrust hearing. And in Europe, regulators have opened a formal antitrust investigation into Apple’s App Store tactics and are preparing to bring antitrust charges against Amazon for abusing its dominance in internet commerce.

Grab some popcorn. Whatever happens, it’s a bit thrilling that the Davids of the world are starting to believe again, no?

In the meantime, you can check out the coalition’s site here, where they outline their 10 principles.

Relevant:

Facebook Opens New Fight With Apple Over Messaging
Justice Dept. Case Against Google Is Said to Focus on Search Dominance
Coalition for App Fairness
To Fight Apple and Google, Smaller App Rivals Organize a Coalition
Briefing: Epic Games, Spotify Form Coalition for App Fairness
Payments giant Paytm says Google’s Android monopoly is of grave concern to Indian start-ups

2. The second thing we’ll highlight this week is really about the challenges of operating a platform in this current climate. Which is sort of a catch-22, seeing as the nature of our platforms has itself deeply contributed to where we’ve ended up. Just look at Facebook.

And I’m not talking about the Facebook that you and I use. I’m talking about the internal Facebook that Facebook employees use.

Ina Fried:

Driving the news: As political arguments on Facebook’s employee discussion boards have grown more heated and divisive, the company ordered new restrictions on the forums earlier this month, which run on Facebook’s Workplace platform.
Last week, the company banned employees from replacing their profile photos on the system with activist images.
Why it matters: Facebook’s difficulty managing its own employees using its own social network suggests that the company’s problems might be rooted more in the conception and design of its products than in the fiery state of today’s public discourse.
Catch up quick: Facebook, like most Silicon Valley companies, has long prided itself on an open culture embracing free debate, epitomized by regular staff meetings where CEO Mark Zuckerberg answers employee questions.

Mark’s solution has been to create “dedicated spaces … for discussing charged topics, with clear rules and strong moderation.”

Which makes you wonder — how will that make Facebook think about actual Facebook? The difference, of course, is that divisive engagement internally is bad because it makes the company dysfunctional. But divisive engagement on Facebook proper, despite contributing to a more dysfunctional society, is super aligned with their core ad-based business model.

Because what we’re learning — and frankly, what people have intuitively known for some time — is that the engagement that these platforms promote is probably not that great:

Between the lines: What’s bugging so many Facebook workers is the same problem some of Facebook’s 3 billion users identified a long time ago.
By shunting so many different dimensions of our online communications onto the same platform — from school connections to family members and workplace acquaintances to people with shared interests — Facebook makes it impossible to keep different facets of our lives from overlapping.
The scholar Danah Boyd popularized the phrase “context collapse” to describe the tendency of social networks to flatten our lives.
Now that dynamic is pushing Facebook, which has staked out a content moderation policy that treads lightly on limiting political speech in public, to curtail the political speech of its own employees.
The catch: Facebook’s internal moderators will now have to figure out how to draw a clean line between “charged topics” and everything else.
Everything Facebook touches today has a political dimension, and everything from user interface design to machine-learning code to marketing materials has social and political dimensions that can trigger “charged” exchanges.

I’m not going to put too much blame on Facebook for this one. They’re figuring things out along with the rest of us. But it certainly raises a bunch of important questions as we look to build the future.

And in an age when every company is also a messaging company (see: Zoom, below), these are questions that every platform will need to ask itself (and in a way, they’ll be able to come at it from a different perspective since companies like Zoom and Discord aren’t built on ad-based business models.)

Because how do we want to shape the incentives that define our communities such that we can constructively evolve as a society?

In any case, the road ahead won’t be easy or straightforward.

Certainly not for Facebook:

A group of high-profile Facebook critics on Friday announced the launch of what they are calling the “Real Facebook Oversight Board.” As Sara Fischer reports, it’s an effort that aims to counter an independent board established by Facebook last year to oversee its decisions on content moderation.
Why it matters: The opposing effort represents how political the fight between Facebook and its critics has become in the lead-up to the presidential election.
Driving the news: The group includes leaders from the Stop Hate for Profit boycott, like Rashad Robinson, president of Color of Change, and Jonathan Greenblatt, CEO of the Anti-Defamation League, as well as prominent Facebook critics like Roger McNamee and some journalists and pundits.

So yeah, tomorrow’s presidential debate will be interesting.

Relevant:

Zoom Invests in Big Messaging Upgrade in Challenge to Slack
America’s Tech Billionaires Could Help Protect the Election. If They Wanted To.
A Guide to Litigating Identity Systems: The Right to Privacy and National Identity Systems
‘We Blew It.’ Douglas Rushkoff’s Take on the Future of the Web — CoinDesk

3. Stuff happens:

Jack Dorsey Details Twitter’s Blockchain Strategy at Oslo Freedom Forum — CoinDesk
Chapter 7: US-China Relations and Wars
How Twitter Survived Its Biggest Hack — and Plans to Stop the Next One
The World Is Losing the Money Laundering Fight
An LG executive dumped Google’s Chrome for a surprising reason | ZDNet
The Currency Cold War: Four Scenarios — CoinDesk
PayPal and Mastercard Expand Debit Card Offering to More European Businesses

The GiD Report#128 — What the Apple App Store backlash tells us about this moment in time was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Evernym

Two days, 11 hacks: A recap of the NHS Staff Access Hackathon

Last Tuesday, we wrapped up a two-day hackathon focusing on the role of verifiable credentials in digital staff access and other innovations across the healthcare space. The event was organized by INTEROPen and the United Kingdom’s National Health Service (NHS), with Evernym providing technical support through training workshops, developer office hours, free access to Verity, […]

The post Two days, 11 hacks: A recap of the NHS Staff Access Hackathon appeared first on Evernym.


COMUNY

How insurers improve customer interaction with the help of verified data

29.09.2020. InsurLab Germany: comuny featured as Startup of the Week in a video clip.

comuny is taking part in the accelerator #Batch20, whose participants are introduced each week by InsurLab Germany. This week it is comuny's turn. We are very proud to be part of it, because we have a helpful product for insurers that want to offer consistently mobile-accessible services while having to meet regulatory requirements. With comuny, legal certainty and customer-friendly processes are achieved with optimized costs and short project timelines. For trust at the right moment!

In the video, for everyone who doesn't know us yet, we explain in 1 minute 30 seconds what comuny does. Watch the video at https://insurlab-germany.com/de/batch-20/

About InsurLab Germany e.V.: InsurLab Germany e.V. is a joint initiative of industry, academia, and public institutions to strengthen Germany as an insurance location. It is where insurance companies meet InsurTechs. The objective is the sustainable development of innovative solutions through close networking of startups and universities with established insurers, leading service providers, and industry experts.

The initiative's goal is to give insurance companies simple, direct access to national and international startups and to actively support them in planning and implementing joint projects, thereby transforming and digitalizing the German insurance industry. The accelerator program is aimed both at InsurTechs and at startups that can generate added value along an insurance company's value chain. Renowned members, exciting events, and exciting startups!

More about InsurLab: https://insurlab-germany.com

More comuny product information: https://www.comuny.de/produkt/

#insure #eHealth #versicherer #operator #future-of-trust #ecosystems #trust-services #mobile-health #regulatorik #rechtssicherheit #kundeninteraktion #ilg#20 #InsurLabGermany


Forgerock Blog

ForgeRock Consumer Survey: The New Normal

Suddenly, everybody was home. You became a homeschool teacher and you learned how to host a corporate happy hour over Zoom. Your new puppy started making guest appearances in your team calls. You downloaded a new app to deliver your groceries. Covid-19 impacted the entire world as online apps and services became our primary way to get things done.

Businesses had to figure out how to serve their customers and employees remotely as much as possible, through new channels and at unprecedented scale, regardless of age and geography, and they quickly realized the importance of digital experiences.

Now the question is when will things get back to “normal?” What will normal even be? And will our new digital habits stick?

The New Normal – Living Life Online

ForgeRock just completed a global survey that captures how the pandemic is affecting consumer behavior. Here are four key findings:

1. Nearly half of all consumers polled say they will use more online services even when things return to normal.
2. More than one-third (35%) say a difficult log-in process would cause them to cancel their account, while 32% said they would look for another service. This should be a warning for any business.
3. Consumers 65 and older are embracing new digital lifestyles, with 31% saying they will only shop online when this is all over.
4. The shift isn't just among retirees: a third of consumers ages 18-24 say they won't go back to stadiums or theatres; instead they'll keep watching sports, concerts, and movies online.

So, about that new normal: while no one knows exactly what's next, it's clear that we have very low tolerance for poor digital experiences and will, without hesitation, switch to apps and services that deliver easy, productive user experiences. Meeting that bar is the greatest opportunity for businesses to thrive through any disruption.

Check out our report for great insight into what consumers have to say about their online experiences and how their behavior is changing now and in the future. And, don’t forget to keep an eye out for more insights from ForgeRock. 

 


PingTalk

Ensuring the Security, Compliance and Agility of Digital Initiatives with Dynamic Authorization

Even before 2020, digital transformation initiatives were monopolizing the attention of most enterprises. Lines of business were attempting to outmaneuver the competition with experiences that would win customer loyalty and steal market share. IT departments were focusing on access and data security and technical enablement of the initiatives. And compliance departments were striving to ensure adherence to GDPR, CCPA and other regulatory requirements. 

 

Then earlier this year the pandemic appeared. Suddenly, everyone was working from home and any customers still physically interacting with brands shifted to 100% digital—or very close to it. Organizations with five-year digital transformation plans have been forced to get their digital infrastructure in order much faster than previously anticipated.


Smarter with Gartner - IT

Prioritize Digital Business Initiatives to Accelerate Into the Future

The events of 2020 have caused many organizations to push digital initiatives forward at unprecedented speed. This acceleration of digital business means making choices about which initiatives move forward or slow down.

Kasey Panetta, Gartner Senior Content Marketing Manager, interviews Hung LeHong, VP and Gartner Fellow, to discuss the framework for digital business acceleration and how to use the five lanes to prioritize your digital initiatives for better results.

Listen now

For the full interview, listen to the podcast or read the transcript that follows, which has been edited for clarity and length.

We're going to start with the basics. Can you explain the general concept of accelerating digital and why it's so important now?

Probably the best way to think about it is what we've actually been through, what a lot of us would call the lockdown. I mean, the speeds at which we moved digital forward were incredible. I've talked with enterprises that have set up e-commerce in two weeks, which is incredible. We've never seen that kind of speed and acceleration before. So we're starting with a base point of incredibly fast acceleration for digital. Gartner recently surveyed boards of directors, and 69% said that they intend to accelerate digital business.

And so the concept of digital business acceleration says that, “Can I keep that momentum going? Can I keep that same level of speed and urgency? Digital helped me during the lockdown, everything from making things virtual, to reducing costs and reaching out to customers. Now customers are used to these things. Can I keep that going?”

Gartner created a framework to help clients think about this concept of accelerating digital. Can you explain it?

The framework has two parts. The first one is about setting the organization up for speed, to actually accelerate. For example, funding may be very slow, with a lot of approvals to go through, or the decision-making process may be really slow. So how can organizations streamline to remove those drags and actually make and act on decisions faster? Can we apply what we call force multipliers? Can we shift to a more agile way of thinking, not just within IT but in the rest of the business, so that we can accelerate?

It’s also imperative to redirect resources, coming to terms with the fact that if we want to focus on speed, we must focus on certain initiatives because we can't do everything. Organizations have to be willing to take resources, like people, time, or partners, and actually put them toward the areas to accelerate. For example, taking resources away from upgrading some kind of technology system and putting them toward customer-facing digital interactions.

Preparing for speed is not so much a methodology or a process. It's actually a mindset, and we call it rethinking value. Organizations and leadership teams must consider the way that they deliver value or what value looks like in the eyes of a customer in the future.

The second part of the framework is the immediate direction. If you're going to accelerate, if you're set up for speed, what directions should you take? We use a car analogy with different lanes on a highway. Imagine dividing digital investments into five lanes on a highway, each having a specific purpose.

Let’s discuss the first lane — the fast lane. You mentioned work from home, but what other kinds of initiatives should live in this lane?

You want to look at everything that you did during lockdown as an enterprise at lightning speed. And the reason we call it the fast lane is because your leadership already granted the funding, right? Your leadership already said it's okay to reach out to all customers using digital. And you want to take advantage of that funding, that acceptance from leadership and of course the investments in the technology, and essentially take them to the finish line.

Stabilize them. As you mentioned, work at home or the hybrid variation of that is a great example: What is that percentage of work from home versus hybrid versus work in the office? A second example is the nature of customer interactions during the lockdown. We surveyed a number of enterprises and some of them answered that 100% of their interactions with customers recently have actually been digital and they've set it up that way.

Again, that was the lockdown version. What is it going to look like in the new normal? It's probably not going to be 100%, but go ahead and figure out what that will be and actually stabilize it. So that's another example of something that's in the actual fast lane.

The second lane is called the growth lane. So what kind of initiatives should digital leaders be looking for in this area?

The growth lane is for enterprises that are perhaps more aggressive. They want to leapfrog others in the pandemic. They want to take advantage of opportunities that exist because of the pandemic. A simple example is changing customer needs: maybe curbside pickup becomes really important, maybe drive-through becomes really important. Maybe consumers don't necessarily want to go into a restaurant as much as they used to.

And the concept of what some people call cloud kitchens, which are really shared kitchens where you make the meal, but you don't actually have customers come in and sit, becomes more popular. Or maybe if you're in the public sector, the notion of renewing your driver's license by going into office just no longer exists and we can get a lot more volume, a lot more efficiency by doing it remotely. These are all examples of what we would call the growth lane.

And so it's not just about customer needs changing. I think that's the more obvious thing. What we've also seen is regulations starting to change. For example, in the U.K., the land registry is now taking e-signatures. So aggressive companies in the real estate market can take advantage of that: “Hey, now we're allowed e-signatures, so we can make the process of buying a home a lot smoother, a lot faster.”

Some stock markets around the world have removed the requirement for physical stock trading, opening the door to many different kinds of things. Some of our clients, whose buyer happens to be the government, have for the first time ever been given the okay to inspect something that's being built via video, instead of coming in and visually looking at it.

These are cases where regulations or roadblocks against digital that existed in the past have been removed. A company that's quite aggressive, that wants to take advantage of digital, can really leapfrog. So those are examples of what we would classify within that growth lane.

How do you make a business case for a risky investment in this kind of climate?

This is why I actually use the word aggressive. It has to do a lot with what your board of directors, your investors, your CFO think about future growth. A lot of companies that are aggressive have those reserves to make those leapfrog actions. So for organizations that have the cash, or reserves, the business case is fairly straightforward. I think that's a very best case scenario.

If you're in a situation where the cash reserves are fairly limited, or you have a lot of problems actually funding it, it actually becomes quite problematic. To continue momentum, we encourage more of a portfolio-style thinking.

For example, maybe 80, 90, or 95% of your investment is in the fast lane or the fix-it lane, but you hedge your bets a bit and keep a little bit in that growth lane. That kind of portfolio thinking allows companies that have a hard time finding the funding and resources to pursue the growth lane to keep a foot in that door.

How do you identify what initiatives would be best for your organization? It seems like there would be a lot of different options and opportunities.

I'd say the first thing, usually the first thing in all cases, is following the customer, where's the actual customer going? For aggressive companies, that may mean being in places before the customer wants or knows to want to be there.

The second thing, frankly, and I know this sounds a little bit loosey-goosey for me to say, is to think opportunistically about mergers and acquisitions. Think about situations involving other companies, competitors, or even fintechs and startups; it's a tough climate for everybody, and some of them are of course in a financially bad situation. So the opportunity comes up to be merged, acquired, converged, whatever language you want to use.

Because the opportunities are not defined, because you don't know ahead of time, we recommend reserving some capabilities or capital essentially to do these kinds of things.

Read more: Use COVID-19 Downtime to Upskill for Digital

The third lane is the fix-it lane. It seems like it'd be very relevant to businesses right now. Can you explain a little bit about who should be thinking about these types of investments and initiatives?

A lot of organizations in various stages of pandemic recovery are thinking as an enterprise, “We're just not set up correctly for this new world.” For example, the cost structure may be too high, and digital tools, such as automation and self-serve tools that help reduce cost, might not be there. That's where the fix-it lane starts to come in.

But what's also interesting is the mirror image of that: enterprises that are actually going through the exact opposite. Instead of big reductions in operations or sales, they're seeing increases in volumes and growth. A very simple example of this is the digital commerce business unit, or some parts of healthcare. Their volumes went through the roof and will probably continue on that route. So the fix-it lane goes both ways. Are you too big? Are you not big enough? Can you scale up, can you scale down?

The fourth lane is the slow lane. What kind of initiatives belong in the slow lane?

First of all, why is there a slow lane? The slow lane is there because, to go fast, our clients often have to slow some things down. The slow lane is about looking at the list of digital initiatives that you were working on before the pandemic, and of course through the pandemic itself, and deciding where resources should go. For those initiatives that shouldn’t be killed, consider treading water with them, meaning keep them going, but at a minimal level so that resources can be redirected toward other things.

So for example, maybe you were looking at something like an ERP upgrade and on top of your ERP upgrade, you had all these bells and whistles that you wanted to pursue. By putting the ERP upgrade in the slow lane, maybe you do the upgrade, but you hold off on the bells and whistles.

I'm not saying that you're going to stop them. What you're going to do is you're going to meter them so that they get implemented over a much longer period of time, because they're not as critical as working on a digital initiative with customers like creating a customer portal, mobile app or customer-facing things.

You mentioned completely stopping and killing, which is not the idea in the slow lane, but is the idea for the final lane — the exit lane. So what kind of initiatives are you seeing in this area?

That's exactly the right understanding of the exit lane. You will kill some projects. The first category of projects that you're likely going to kill are those risky bets that you made pre-pandemic. Maybe you were going to move into a new adjacent industry with a new digital product of some kind, but because of the pandemic, you put it on hold.

And then as you're emerging out of the pandemic, you're seeing that the market is just not ready to absorb that new digital product or service. So you actually stop it and take those resources that you've now canceled and move those people, that funding and the time toward other projects.

Do you find executives really struggle to identify areas that they can put in the exit lane?

Absolutely. I think all of us, as leaders, get attached to our pet projects and so that becomes very complicated. To be able to do this whole portfolio-type exercise, you must consider the bigger picture. If you go into the weeds and ask middle management, of course they're going to say we can't kill that project. You actually have to bring it up a notch for the better of the company, or even more specifically for the better of the customers or citizens and look at it within that light.

What about companies that want to embrace digital acceleration, but they might not be able to afford investing in new initiatives at this point?

For those organizations, there are a couple of alternatives. The first one, of course, is they can do nothing. In other words, go very, very slowly. The danger in that is that they upset their customers. They're not in tune with what the market wants, which in a private sector situation means competition gets a lot harder for them and bad things happen.

The second thing is to look for funding elsewhere. We've had companies where the culture, and even the way funding works, is not pro-digital. Creating a separate group — and in some cases, literally a separate legal entity — allows you to do two things:

1. Contain the risk to this small, separate entity alone.
2. Allow the separate entity to move forward unimpeded by the culture, the ways of funding, or whatever else was slowing digital down before, so it can move on its own.

And then after that, the organization can take a look at that separate entity and say, yes, here are some things we can take into our own organization. Or you can even fold the entire separate entity back in to kick-start the “legacy” organization.

So those are rather extreme ways to get digital going for a company that's really not into digital, but frankly it has to be extreme. And I need point no further than lockdown to show that an extreme situation got a lot of companies going.

Read more: Lead Through Volatility With Adaptive Strategy

Is there any final thought that you want to share?

Remember the acceleration that happened during lockdown. Some of our clients even call lockdown the honeymoon period because, for the first time ever, projects and initiatives were pursued in a matter of days, if not a week or two: the leadership team was aligned, everybody knew they had to do it, and the funding was there. At no other point in history has digital had it that easy. So remember that, and try to carry some of it into the near-term future so that you can actually accelerate. It's that kind of speed and urgency that you want to capture.

The post Prioritize Digital Business Initiatives to Accelerate Into the Future appeared first on Smarter With Gartner.

Monday, 28. September 2020

Forgerock Blog

Myth Busting at Identity Live: Cloud Edition


This month we announced some exciting enhancements to ForgeRock Identity Cloud. All of the updates we make to our platform are done with your realities and requirements in mind. The year 2020 has taken a toll on many businesses all over the world, and this has put increasing pressure on IT teams. Our customers are seeing online traffic like never before. The journey towards digital transformation has been turbo-charged as we move through the pandemic and prepare for what’s next. 

Organizations are doing everything they can to go digital while prioritizing the delivery of  exceptional user experiences. At the same time, security and trust remain critical to keep customers, partners, and employees safe online and in person. And while there is a big rush to the cloud, firms may be struggling with how to do it safely, securely, and without disrupting business. 

As companies weigh their cloud decisions, they are starting to raise critical questions about commonly held myths regarding cloud migration: Is the cloud really less secure and compliant? Does everything have to go to the cloud? Is it more expensive? 

Spoiler alert: The answer is absolutely not. We busted these myths last week during Identity Live: Cloud Edition.

To kick off our event, ForgeRock CEO Fran Rosch and retired U.S. General and KKR partner David H. Petraeus discussed the CIA’s journey to the cloud, which began in 2013. At one time, security was one of the main reasons that many organizations elected not to adopt cloud solutions. Today, security is one of the many benefits of the cloud due to the scale of investment in security that cloud service providers have made – investments far beyond the scope of individual organizations. Knowing that the CIA has trusted the cloud for nearly 10 years reinforces this point.

While many companies have cloud-first strategies, we recognize that not all workloads are created equal. We were excited to chat with Amol Kabe, senior director of product management at Google Cloud, about the need for choice and flexibility. We’re here to help our customers embrace the power of the cloud and also work within their own hybrid realities. We polled the Identity Live audience and found that 80% expect to remain in a hybrid cloud world for at least five more years. On premises, your cloud, or our cloud – we will make it work for you.

We will keep working to enable excellent digital experiences. General Petraeus believes the login experience will be a differentiator, and we could not agree more. Personalization matters. Ease of use matters. Security matters. The outcome for a great user experience? It’s pretty simple. Do it fast, do it right, and do it now. We understand the need for a superior experience and, at the same time, ensure that this will not diminish security in any way. 

We also had the pleasure of hearing from Daryl Robbins, senior director of global architecture at Calabrio, about their journey to the cloud. One of the reasons Calabrio chose ForgeRock to manage their 1.3 million digital identities is our full tenant isolation security capability. With ForgeRock, they never have to worry about their data being commingled with other customer data. From the administrator login screen to API endpoints and from the data to the application stack itself, there is no data traversal across those planes. We live, breathe, and sleep security, and we pledge to do that for you and all our customers. While we securely manage their IAM, Calabrio can focus on building incredible experiences for their customers. 

ForgeRock delivers simple truths – with no surprises. Moving to the cloud does not have to be more expensive. Having an unplanned uptick in traffic should not be costly. In times of uncertainty, we’ll provide you with more certainty. If you’re experiencing Black Friday-like numbers every day, we are here to help you scale up  – at no additional expense to you.

Cloud without compromise. Great experiences. No surprises. That’s what ForgeRock delivers. 

We know that you’re facing immense pressure to adapt and respond to a new normal. We are here to help you plan your IAM future along the way. 

Thanks to all of our customers and partners for attending Identity Live: Cloud Edition!  Want to revisit the action from our event? Watch each of the replays here.


KuppingerCole

The Role of Data-Centric Security in the Cloud

As modern businesses across all verticals continue their rapid digitalization, the need to store, process and exchange data securely is becoming an essential factor for any company. However, this is particularly challenging for high-tech companies dealing with highly-sensitive R&D data.





MyKey

MYKEY Weekly Report 18 (September 21st~September 27th)


Today is Monday, September 28, 2020. The following is the 18th issue of the MYKEY Weekly Report. Last week (September 21st to September 27th), there were 5 main updates:

1. MYKEY Lab issued 100 million TRC20-KEY on TRON

KEY is the utility token of BIHU (the world’s largest Chinese crypto community) and MYKEY (the largest multi-chain smart wallet). MYKEY Lab issued 100 million KEY on TRON via migration, hoping to bring more scenarios to users in the TRON ecosystem. For details, please click: https://bit.ly/301e16g

2. The nineteenth MYKEY Crypto Stablecoin Report was published

We release the MYKEY Crypto Stablecoin Report every week to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay updated. The nineteenth Crypto Stablecoin Report was published on September 24th; click to read: https://bit.ly/3cpw8I6

3. Ricky and Justin Sun were live on bihu.com on September 24

Ricky, the co-founder of MYKEY, and Justin Sun, the founder of TRON, were live on bihu.com and talked about the entire ecological layout of DeFi on TRON at 8:00 p.m. (UTC+8) on September 24.

4. MYKEY smart contract has passed the audit of Trail of Bits

The MYKEY smart contracts on Ethereum and EOS have recently passed a security audit by the top security team Trail of Bits. For details, click to read: https://bit.ly/2EvWQ5m

5. MYKEY Lab added 1 billion KEY TOKEN to the “multiple chains exchange pool”

Due to demand for KEY TOKEN on multiple chains, MYKEY Lab locked 1 billion KEY TOKEN on EOS into the exchange pool on September 27. For details, click to read: https://bit.ly/30d211F

!!! If you encounter any abnormal situation while using MYKEY, do not uninstall the MYKEY app; please contact MYKEY Assistant (@mykeytothemoon on Telegram).

!!! Remember to keep the 12-word recovery phrase safe, found under [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 18 (September 21st~September 27th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 27. September 2020

KuppingerCole

KuppingerCole Analyst Chat: Access policies as the Common Language for Defining Access

Access management and access governance in many companies are still largely based on traditional authorization concepts. As a result, thinking about and defining access management is often rooted in a rather one-dimensional paradigm. Martin and Matthias talk about access policies as a common language for defining and maintaining rules for access, independent of the actual implementation of access control.


MyKey

Announcement: MYKEY Lab adds 1 billion KEY TOKEN to the “multiple chains exchange pool”

Due to demand for KEY TOKEN on multiple chains, MYKEY Lab locked 1 billion KEY TOKEN on Ethereum into the exchange pool on September 27: 0xc4947bf8c74033c7079f6780460e72e82a8df33c

Meeco

FinTech Australia announces Meeco as a Finnies 2020 finalist

On September 23, 2020, FinTech Australia announced the finalists for this year’s Finnies. The Finnies recognise and reward the Australian FinTech community by honouring innovation, growth and collaboration. For Meeco, collaboration is at the heart of everything we do. We’re humbled (and a little excited) to be a finalist in the “Excellence in Industry Collaborations & Partnerships” category. Our submission was based on our partnerships in Belgium with KBC Bank, KBC Brussels and CBC Bank. As an Australian FinTech, we’re thrilled to have forged this partnership while expanding our operations into Belgium and establishing a team on the ground. KBC was recently voted the most valuable brand in Belgium for the fifth consecutive time and was also voted the best Digital Bank for 2019. Known for digital innovation, KBC has created an ecosystem of value-added services for customers, all available from within their mobile banking app.

In December 2019, three of KBC’s retail bank brands added the “Digital Safe powered by Meeco”, a secure data enclave that enables customers to capture, store and share personal data and documents. It is available in four languages (Dutch, French, German and English) and leads the way in customer data privacy and personal data mobility. Despite the global challenges of 2020, our engineering teams in Australia and Belgium have been focussed on making Meeco’s technology available to more partners. Our APIs and SDKs will soon be available in Australia to support the Consumer Data Right, in the same way we have been helping our European partners with GDPR and Open Banking. We would also love to give a shout out to our newest partners developing amazing applications powered by Meeco.

Starting with mIKs-it, “the safe multimedia app for kids”. mIKs-it is a protected space for kids to experience play and joy through photos, videos and audio, enabling kids and their trusted grown-ups to safely connect to daily life, always with complete privacy. mIKs-it is being developed in partnership with Heder VZW in Belgium and will launch in December 2020. Closer to home, Meeco is privileged to have been selected by Pam Moorhouse, founder and CEO, to provide the privacy, security and consent platform for My Life Capsule. My Life Capsule empowers family connection and organisation across multiple generations, relieving day-to-day and time-critical administration. It helps families capture and share memories, important information and data, assisting with the emotional and logistical requirements at every stage of life. The Finnies category winners will be announced at the virtual ceremony on Wednesday 28 October (AEST). Also, Stone & Chalk in Sydney will be hosting a special in-person event on the night.
“We are thrilled with the quality of applications and the finalists, which are a testament of the growth, and strength of the fintech ecosystem. We are excited to celebrate the hard-fought achievements of the ecosystem at this year’s Finnies” 

Rebecca Schot-Guppy, CEO, FinTech Australia
This will be the first virtual Finnies event, starting at 18:00 AEST on October 28. People will be able to watch from home, which means our team in Belgium can also join in the celebrations. From all of us at Meeco, huge congratulations to all the wonderful finalists.
It’s an honour to be counted in such great company, thank you!

The post FinTech Australia announces Meeco as a Finnies 2020 finalist appeared first on The Meeco Blog.

Saturday, 26. September 2020

COMUNY

comuny startup story in the 5-HT Digital Hub Chemistry & Health

26.09.2020. For secure and trusted data exchange in the healthcare sector.

In a blog post, Dominik and I describe the challenge health insurers face with the introduction of the electronic patient record (ePA) on January 1, 2021: protecting their members' sensitive health data as well as possible.

Those who implement access to their ePA via comuny receive innovative support. Our Trust Data Operator verifies the identity attributes of users of digital services in a legally compliant way; the data is then delivered as a service into the required use case, across applications and processes.

Dominik and I will present our solution in October at the "Insuring Digital Health" program, through which 5-HT brings startups and their innovative digital solutions together with insurers and health insurance funds.

In the interview, we explain how comuny enables trusted data exchange between users and companies in the healthcare and insurance sectors.

Published at https://www.5-ht.com/2020/09/25/fuer-einen-sicheren-und-vertrauensvollen-datenaustausch-im-gesundheitsbereich/.

 

The 5-HT Digital Hub Chemistry & Health is part of the Digital Hub Initiative (de:hub) launched by the German Federal Ministry for Economic Affairs and Energy to promote digital innovation in Germany. Its goal is to build an international ecosystem of startups, investors, and companies that drives digital innovation in the chemistry and health sectors.

 

#insurance #eHealth #operator #future-of-trust #ecosystems #trust-services #comuny  #eIDAS #eSignatur #Personenverifizierung #Authentifizierung #mobil #Identität

Friday, 25. September 2020

Trinsic (was streetcred)

Trinsic Basics: The Three Models of Digital Identity


Digital identity has advanced over time, most recently culminating in self-sovereign identity (SSI). In this Trinsic Basics post, we are going to briefly cover the different models of digital identity and how SSI is the next step in the digital identity evolution. The content in this post is inspired by a blog post written by digital identity expert Timothy Ruff in 2018.

Model #1: Siloed identity

The first model of digital identity is the siloed model, and its name describes exactly how it works. In the siloed approach, in order to interact with another company or institution digitally, you must open an account, typically with a username and password. Through this process, the company becomes your identity provider. Over time, you’ll end up managing hundreds of accounts from hundreds of identity providers.

 

Your identity provider stores your username and password and other personal data in large data silos. In other words, your personal data is stored with all of the other customers’ personal data in one central location controlled by the company.
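
To make the silo concrete, here is a minimal sketch of the kind of credential record a siloed provider typically keeps, assuming a standard salted-password scheme; it illustrates the pattern, not any particular company's implementation:

# A minimal sketch of siloed credential storage: every provider keeps
# its own table of records like this, one silo per company (illustrative only).
import hashlib, os

def make_account(username: str, password: str) -> dict:
    salt = os.urandom(16)
    # Salted, slow hashing limits the damage of a leaked silo, but the
    # silo itself remains a single high-value target.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"username": username, "salt": salt.hex(), "hash": digest.hex()}

def verify(account: dict, password: str) -> bool:
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(account["salt"]), 100_000
    )
    return digest.hex() == account["hash"]

acct = make_account("alice", "correct horse battery staple")
print(verify(acct, "correct horse battery staple"))  # True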

The siloed model is broken because it is:

- Centralized. You are not in control of your digital identity; the company or institution is. If they choose to block your account or if they shut their doors, your account, along with all of your data, would be gone.
- Not secure. Data silos are gold mines for hackers, since they essentially only have to crack one silo to gain access to the personal data of a lot of people. Besides this, account security usually relies on a username and password, and research shows that people do not typically create secure passwords and often reuse the same username and password across many accounts.
- Poor user experience. How many times have you forgotten your password and had to recreate it? Having so many different accounts with (supposedly) unique usernames and passwords has led to a frustrating customer experience when trying to access services digitally.

Model #2: Federated identity

Digital identity advocates saw the flaws of siloed identity and sought to create a more user-centric approach.

 

Instead of every company needing to be its own identity provider, anyone could be an identity provider! You could, theoretically, be your own identity provider! But ultimately, this approach to “federated identity” was taken over by big tech companies.

 

Although we may not be familiar with the term, we are all very familiar with how federated identity works. The most common federated identity examples are “Login with Google” and “Login with Facebook.” With federated identity, large tech monoliths act as middlemen between you and the company you are interacting with digitally. Instead of the company managing your identity data, the middlemen do.
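
Under the hood, "Login with Google" is typically an OpenID Connect flow: the site redirects you to the identity provider, which authenticates you and sends back a signed assertion. Here is a minimal sketch of that first redirect, using Google's public authorization endpoint; the client_id and redirect_uri values are placeholders:

# A minimal sketch of the first step of a "Login with Google"-style
# OpenID Connect flow: redirecting the user to the identity provider.
from urllib.parse import urlencode

params = {
    "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",  # placeholder
    "response_type": "code",              # authorization-code flow
    "scope": "openid email profile",      # identity claims requested
    "redirect_uri": "https://example.com/callback",  # placeholder
    "state": "opaque-anti-csrf-token",
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(auth_url)
# The provider authenticates the user, then redirects back with a code the
# site exchanges for tokens, so every such login is visible to the middleman.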

Although the federated identity model improves the user experience, it too suffers from some problems:

- Reliance. In this model, technology monoliths provide us a digital existence. Instead of being independent and in control, we subject ourselves to the whims of the middleman. This has led to problematic anti-trust and anti-competitive behavior.
- Universality. Logging in with Facebook might work for buying shoes or signing up for a newsletter. But business tools, regulated industries (financial, healthcare, government), and privacy-focused industries will never accept your personal social media account as authentication because it carries almost no assurance; anyone can create multiple Google accounts with ease.
- Surveillance. When you log in with Facebook, Google, or some other service, you give that middleman more data points they can use to track and spy on you so that they can increase their revenue. Interests are fundamentally not aligned between people and the middleman.

Model #3: Self-sovereign identity

Various approaches have been tried over the years to put individuals in more control of their identity, but they’ve struggled to get adopted. Finally, self-sovereign identity appears to be the solution.

 

Various technical innovations including blockchain, verifiable credentials, digital wallets, DIDs, and more gave rise to a new, decentralized identity that not only improves the user experience but also the security of the internet. SSI can be summed up as a movement that claims digital identity should be just as legitimate and nuanced as a person’s human identity, while being accessible to all, privacy-preserving, and not reliant on a single government or corporation.

With SSI, you’re your own identity provider because, as an individual, you already have a human identity. Nobody gave you your real-world identity, so likewise, you don’t need someone to give you a digital one. Instead, you control the information that makes up your identity, as well as whom you share that information with. For the first time ever online, SSI lets you interact as a “peer” with other people, organizations, and more.
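
To make the peer model concrete, here is a minimal sketch of a W3C-style verifiable credential, the kind of data structure SSI wallets exchange; the field names follow the Verifiable Credentials data model, while the DIDs and values are placeholders:

# A minimal sketch of a W3C-style verifiable credential, the data
# structure at the heart of SSI. All DIDs and values are placeholders.
import json

credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "UniversityDegreeCredential"],
    "issuer": "did:example:university-registrar",   # the issuer's DID
    "issuanceDate": "2020-09-25T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder-alice",           # the holder's DID
        "degree": "Bachelor of Science",
    },
    # In practice a cryptographic proof signed by the issuer goes here,
    # letting any verifier check the claim without calling the issuer.
    "proof": {"type": "Ed25519Signature2018", "jws": "..."},
}
print(json.dumps(credential, indent=2))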

Trinsic is the easiest way to implement SSI

At Trinsic, we are trying to make this third model of digital identity a reality. Our company’s mission is to make the world and its services more accessible by making self-sovereign identity more accessible. We’ve done that by creating the world’s most advanced SSI platform. Experience SSI firsthand by creating a free Trinsic Studio account today. For a deeper explanation of SSI, check out this blog post.

The post Trinsic Basics: The Three Models of Digital Identity appeared first on Trinsic.


Mythics Blog

Why Oracle Cloud Infrastructure (OCI)? - Part 1


A question I get every day from customers is, "Why should I move to Oracle's Cloud?"

There…


IDnow

Why you should sign your documents digitally.


Printing and signing documents takes time and resources. Especially in the current situation, a digital process is more important than ever.

Since 2016, electronic signatures have gained more and more traction. In July 2016, the qualified electronic signature (QES) entered the mainstream with the EU regulation known as eIDAS. Since then, it has carried the same legal significance throughout the entire European Economic Area.

The benefits of signing contracts digitally speak for themselves:

- Saving time and transport costs.
- No storage space needed.
- Reduced risk of incomplete, inaccurate, or lost documents.

Besides purely monetary advantages, other aspects are also convincing:

- Higher customer satisfaction.
- Improved compliance through fast response times for legal audits and revision requests.
- A more agile and flexible work environment.

Studies have shown that companies save between 5 and 36 euros in costs and 1.5 hours of processing time per contract. Time to digitize your contract management!
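
As a back-of-the-envelope illustration of those figures, here is a quick calculation; the contract volume and hourly rate are assumptions, not part of the cited studies:

# A back-of-the-envelope sketch of the savings figures cited above;
# contract volume and hourly rate are illustrative assumptions.
contracts_per_year = 10_000
cost_saving_per_contract = (5 + 36) / 2      # midpoint of the cited range
hours_saved_per_contract = 1.5
hourly_rate = 40                             # assumed fully loaded euros/hour

direct = contracts_per_year * cost_saving_per_contract
time_value = contracts_per_year * hours_saved_per_contract * hourly_rate
print(f"{direct:,.0f} euros direct + {time_value:,.0f} euros in staff time per year")
# -> 205,000 euros direct + 600,000 euros in staff time per year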

Interested in learning more about electronic signatures? Download our Quick Guide to electronic signatures.


Self Key

Synthetic Identity Theft: Latest Scam in Identity Thefts


SelfKey Weekly Newsletter

Date – 23rd September, 2020

Identity theft has been an ever-evolving crime, and the latest of the lot is synthetic identity theft.

The post Synthetic Identity Theft: Latest Scam in Identity Thefts appeared first on SelfKey.


MyKey

MYKEY Smart Contract has Passed the Audit of Trail of Bits


Official news: MYKEY smart contracts on Ethereum and EOS have recently passed a security audit by the top security team Trail of Bits. The details of the report are as follows:

https://github.com/mykeylab/keyid-eth-contracts/blob/master/reports/Trail%20of%20Bits%20Verification%20Report%20for%20MYKEY(2020-09-14).pdf

Trail of Bits is a top security team established in 2012 with the best foundational tools for evaluating the security of smart contracts and deep expertise in reverse engineering, cryptography, virtualization, malware, and software exploits.

MYKEY is a multi-chain smart wallet and the first application built on the KEY ID protocol. Each MYKEY account exists as a smart contract address and features separated permissions, account recovery, and meta-transactions. For the MYKEY team, protecting the security of smart contracts is a top priority, which is why we chose Trail of Bits for the contract security audit.
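
To give a flavor of what "separated permissions" and "recoverable account" mean, here is a generic sketch of a contract-account model with those two properties; it is an illustration of the concepts, not KEY ID's actual design:

# A generic sketch of a contract account with separated permissions and
# recovery, as named above. Illustrative only; not MYKEY's contract code.
class ContractAccount:
    def __init__(self, admin_key, operating_key, emergency_key):
        self.keys = {"admin": admin_key, "operating": operating_key,
                     "emergency": emergency_key}

    def transfer(self, signer):
        # Day-to-day actions need only the lower-privilege operating key.
        assert signer == self.keys["operating"], "operating key required"
        return "transfer executed"

    def recover(self, signer, new_operating_key):
        # A lost operating key can be rotated with the emergency key,
        # so losing one key does not mean losing the account.
        assert signer == self.keys["emergency"], "emergency key required"
        self.keys["operating"] = new_operating_key

acct = ContractAccount("adm", "op", "emg")
print(acct.transfer("op"))
acct.recover("emg", "op2")
print(acct.transfer("op2"))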

Besides, MYKEY released the first version of the MYKEY protocol specification, which explains the features and technical architecture of the KEY ID protocol in detail.

https://github.com/mykeylab/keyid-eth-contracts/blob/master/specifications/MYKEY%20Specification.pdf

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Smart Contract has Passed the Audit of Trail of Bits was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Daimler Mobility Partners With Ontology To Leverage High-Performance Technologies and Transform…

Daimler Mobility Partners With Ontology To Leverage High-Performance Technologies and Transform the Driving Experience

With Ontology’s decentralized blockchain solutions, drivers can enjoy a suite of new, highly-secured digital driving services!

Singapore — 24 September 2020 — In a move that will transform the experience of drivers all over the world, the Daimler Mobility AG Blockchain Factory, which was set up to bring the benefits of blockchain innovation and application to the automotive finance and mobility industry, is partnering with Ontology, the high performance, open-source blockchain specializing in digital identity and data.

Today, at Daimler’s Startup Autobahn EXPO Day, a live event bringing together the most innovative minds in mobility solutions, the companies unveiled the fruits of their first collaboration — ‘Welcome Home’, a self-sovereign in-car personalization and management solution.

Daimler Mobility AG’s Blockchain Factory and Ontology have joined forces to develop MoveX, a first-of-its-kind blockchain-based mobility platform for the automotive and mobility industry. Utilizing ONT ID, Ontology’s decentralized identity framework, MoveX will empower user roaming, bundling, and sharing, as well as facilitate the integration of services across providers — one of the main barriers for adoption.

Welcome Home, the first product on the MoveX platform, solves the trade-off for users who want to have highly personalized experiences and diverse service integrations while also preserving their data privacy and control. The Welcome Home solution will not only be applicable to in-car experiences but to any smart device as they are becoming increasingly intelligent, connected and data-rich.

Commenting on the partnership, Andy Ji, Co-founder of Ontology stated, “In choosing to partner with Ontology, Daimler Mobility has demonstrated its commitment to innovation and to leveraging advanced emerging technologies such as ONT ID which deliver the best possible experience for drivers around the world. Our decentralized identity solution can offer a range of benefits that focus on security, verification, and trusted data management for not just drivers, but the entire automobile ecosystem.”

He continued, “To date, Daimler Mobility’s efforts to implement blockchain-based solutions have focused primarily on improving efficiencies in back-end departments and in the supply chain. However, what makes this partnership exciting is that Ontology’s ONT ID integration with Daimler Mobility on MoveX will open doors to allow for inherent interoperability that encompasses not only offline data like parking tickets or claims, but also digital data like access tokens or driving history.”

The news comes as Ontology continues to strengthen ties with the German enterprise and moves forward with plans to open an office in Berlin.

Harry Behrens, Head of The Daimler Mobility AG Blockchain Factory, said, “Welcome Home combines mobility with social networking. By using Ontology’s unique framework for DID and highly secure data access we have combined usability with the highest standards for data sovereignty and privacy.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Daimler Mobility Partners With Ontology To Leverage High-Performance Technologies and Transform… was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 24. September 2020

KuppingerCole

Application Access Governance for SAP Environments and Beyond


For many enterprises, SAP systems are an essential part of their corporate IT infrastructure, storing critical business information and employee data. SAP systems have traditionally been a major focus area for auditors. It is therefore essential that all existing SAP systems are covered by an effective solution for managing risks, including managing access controls and SoD controls, and implementing adequate Access Governance.




Secure Key

A year of critical change: Checking in on cyber trends mid-way through 2020


As we approach the final quarter of a truly historic year, it’s no surprise the last eight months have seen cyber trends emerge that will have lasting effects on the world’s technology ecosystem – but what if these aggressive changes were needed all along? From artificial intelligence (AI) and machine learning (ML) to decentralized digital […]

The post A year of critical change: Checking in on cyber trends mid-way through 2020 appeared first on SecureKey Technologies Inc..


One World Identity

Yoti + Synectics Solutions Launch Project Endeavor

Yoti Commercial Director Gareth Narinesingh and Synectics Solutions Head of Presales Chris Lewis join State of Identity to discuss the launch of Project Endeavor, a pilot program to revolutionize electronic digital onboarding, identity verification and risk assessment for the financial services sector in the United Kingdom.



MyKey

The market capitalization of stablecoins reached $19.86 billion

The market capitalization of stablecoins reached $19.86 billion; the latest developments of several decentralized stablecoin protocols

Original link: https://bihu.com/article/1485006994

Original publish time: September 22, 2020

Original author: HaiBo Jiang, researcher of MYKEY Lab

We release the MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay updated. The MYKEY Crypto Stablecoin Report is published every week; we look forward to maintaining communication with the industry and exploring the development prospects of stablecoins together.

Quick Preview

- The market capitalization of major stablecoins has increased by $1.334 billion to $19.86 billion.
- Last week, 1 billion USDT migrated from Tron to Ethereum. Besides, Tether additionally issued 500 million USDT on Tron twice and 150 million USDT on Ethereum.
- From September 13th to 20th, the supply of DAI rose from $475 million to $800 million, an increase of 71%.
- Pickle developed a feature that uses Ethereum and stablecoin exchange pairs as LP Tokens to aggregate mining in Uniswap.
- On September 20, the one-day transaction volume in Curve set a record of $519 million.
- After the SWRV inflation rate was lowered, the total lock-up volume in Swerve fell below $20 million.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins in the past week(September 12, 2020 ~ September 18, 2020, same below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market capitalization of major stablecoins has increased by $1.334 billion to $19.86 billion.

Source: MYKEY, Coin Metrics

In the past week, the circulation of USDT has increased by 650 million. 1 billion USDT migrated from Tron to Ethereum. Besides, Tether additionally issued 500 million USDT on Tron twice and 150 million USDT on Ethereum. The circulation of USDC increased by 260 million, and the circulation of DAI increased by 230 million. The circulation of PAX, BUSD, TUSD, and GUSD increased by 23.59 million, 93.97 million, 77.86 million, and 1.35 million. The circulation of HUSD decreased by 1.39 million.

The Number of Holding Addresses

Source: MYKEY, DeBank

Last week, the number of main stablecoin holding addresses on Ethereum increased by a total of 104,702.

Source: MYKEY, DeBank

The number of holding addresses of USDT, USDC, TUSD, and DAI increased by 90,882, 7,319, 570, and 5,947. The number of holding addresses of PAX decreased by 16.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

The number of active addresses of stablecoins last week decreased by an average of 2.43% compared to the previous week.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous week, the number of daily transactions of major stablecoins decreased by an average of 1.18%.

The Number of 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins last week decreased by an average of 0.71% from the previous week.

2. The latest development of the decentralized stablecoin protocol

The development of the blockchain industry changes with each passing day, and so do stablecoin projects. Last week, several decentralized stablecoin protocols made notable progress. This report introduces these projects.

MakerDAO

MakerDAO is a representative decentralized autonomous blockchain project. Its stablecoin DAI is the largest on-chain collateralized stablecoin, and its governance token MKR unifies power, responsibility, and benefit. With the increase in Dapps on Ethereum, the supply of DAI has appeared insufficient: even though the supply of DAI has increased by 545% in the last three months, DAI still trades at a premium when demand is high. Last week, MakerDAO passed a series of important votes that should solve the DAI premium problem to a large extent.

- The clearing lines of USDC-A and PAX-A went through two rounds of voting, dropping from 110% and 120% to 101%.
- The debt ceiling of USDC-A was raised from $100 million to $400 million after two votes.
- TUSD was activated as collateral, with a debt ceiling of 50 million and a clearing line of 101%.
- The global debt ceiling of DAI was raised to 1.2 billion.

After this series of expansionary votes, the supply of DAI rose to $800 million, a 70% increase from a week ago, and the market capitalization of DAI now exceeds that of its governance token MKR. The price of DAI will move closer to the centrally issued USDC, PAX, and TUSD. When DAI trades at a premium of more than 1%, arbitrageurs can mint DAI against collateral right at the clearing line, sell it on the secondary market, and largely ignore liquidation risk.
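
A rough sketch of that arbitrage, with illustrative numbers rather than on-chain data:

# A minimal sketch of the arbitrage the 101% clearing line enables.
# All numbers are illustrative assumptions, not on-chain data.
collateral_usdc = 101_000          # USDC locked in a USDC-A vault
clearing_ratio = 1.01              # 101% clearing line after the vote
dai_minted = collateral_usdc / clearing_ratio   # max DAI mintable ~ 100,000
dai_price = 1.02                   # secondary-market price at a 2% premium
proceeds = dai_minted * dai_price  # sell the minted DAI for ~102,000 USD
# Once DAI returns to $1, buy it back and unlock the collateral:
cost_to_repay = dai_minted * 1.00
profit = proceeds - cost_to_repay
print(round(dai_minted), round(profit))  # ~100000, ~2000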

This also brings centralization risk to DAI. Previously, because DAI was the only large-scale decentralized stablecoin issued on Ethereum, on-chain adoption of DAI was generally better than that of centralized stablecoins. Going forward, the boundary between centralization and decentralization may become blurred.

Pickle

When the demand for a certain stablecoin increases, a premium of more than 3% may appear in the short term. This has become frequent in recent yield farming, where users need to switch between different stablecoins. When the mining hype passes, the premium disappears, and users may suffer a 3% loss of principal in a short period. As the profitability of yield farming decreases, that loss becomes unacceptable for many people.

Pickle Finance wants to use incentives to bring stablecoins such as DAI and sUSD back to their anchored value ($1). In the beginning, PICKLE tokens will be distributed to stakers of the Uniswap LP tokens for the ETH pairs with DAI, USDC, USDT, and sUSD. The LP corresponding to the lower-priced stablecoin will be rewarded with more PICKLE tokens, which encourages liquidity providers to sell stablecoins with higher prices and buy stablecoins with lower prices.

The Pickle Swap feature helps users convert LP tokens with one click, including conversions between platforms (currently only SushiSwap to Uniswap) and conversions between the platform's different stablecoin/ETH LP tokens.

pJar is the vault in Pickle. Users can stake Uniswap LP tokens, and pJar automatically harvests and sells UNI to acquire more of the underlying assets, providing aggregated mining for the ETH/stablecoin LP tokens on Uniswap. After users deposit Uniswap LP tokens into pJar, they receive a pToken; for example, staking the UNIV2 DAI/ETH LP yields pUNIDAI. Over time, 1 pUNIDAI corresponds to more and more UNIV2 DAI/ETH LP. Staking pUNIDAI also earns additional PICKLE rewards.
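
The pToken mechanics resemble standard yield-vault share accounting. Here is a minimal sketch of that accounting model, an assumption-level illustration rather than Pickle's actual contract code:

# A minimal sketch of pToken-style share accounting: harvested yield
# raises the amount of LP each share redeems for. Illustrative only.
class PJar:
    def __init__(self):
        self.total_lp = 0.0       # underlying UNIV2 DAI/ETH LP held
        self.total_shares = 0.0   # pUNIDAI shares outstanding

    def deposit(self, lp_amount):
        # Shares are minted pro rata to the current LP-per-share rate.
        rate = self.total_lp / self.total_shares if self.total_shares else 1.0
        shares = lp_amount / rate
        self.total_lp += lp_amount
        self.total_shares += shares
        return shares

    def harvest(self, lp_from_sold_uni):
        # Selling farmed UNI for more LP raises the value of every share.
        self.total_lp += lp_from_sold_uni

jar = PJar()
mine = jar.deposit(100)   # 100 LP -> 100 pUNIDAI
jar.harvest(10)           # yield accrues to the jar
print(mine * jar.total_lp / jar.total_shares)  # ~110 LP now redeemable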

Although Pickle Finance may not achieve the goal of returning stablecoins to their anchored value, the direction is worth the effort. Thanks to Uniswap's token incentives, more and more stablecoin/ETH exchange pairs will be used for mining in Uniswap, and pJar's aggregated mining will become more useful.

Curve

Curve was originally a stablecoin exchange protocol. With the development of cross-chain assets, there are now many Bitcoin anchor tokens on the platform. When trading in Curve, the platform only charges a 0.04% commission. In August, after the release of Curve’s platform token CRV, the funds in Curve rose from less than $300 million to more than $1 billion. Currently, Curve has a total of $1.6 billion in assets. On September 20, the one-day transaction volume in Curve set a record of $519 million.

The price of CRV has dropped from $30 at the initial issuance to $1.4 now, and the annualized rate of return in the y pool has dropped to 14.39%, but there are still more than $600 million of stablecoins in the y pool. The decrease in mining yield did not reduce the amount of funds and transactions in Curve, which shows that Curve's token incentives are effective: even after yields declined, users did not leave in large numbers.

Swerve

Two weeks ago, Swerve forked from Curve. Its governance token SWRV has no team allocation and no pre-mining, and Swerve has also optimized gas costs. Because of the high incentives at the early stage, the total lock-up volume in Swerve reached as high as $900 million. However, Swerve's economic model has a fatal problem: it allocates 9 million SWRV in the first two weeks and only 9 million SWRV over the following year. At the end of last week, SWRV completed its first emission cut, and the total lock-up volume in Swerve is now less than $20 million. A lower lock-up volume increases transaction slippage, which hurts the Swerve trading experience.
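
The scale of that emission cut is easy to quantify from the figures above:

# A minimal sketch of the SWRV emission cliff, using the 9M-in-2-weeks
# and 9M-in-the-following-year figures from the report.
launch_emission = 9_000_000 / 14        # SWRV per day in the first two weeks
steady_emission = 9_000_000 / 365       # SWRV per day in the following year
drop = 1 - steady_emission / launch_emission
print(round(launch_emission), round(steady_emission), f"{drop:.0%}")
# ~642,857/day falls to ~24,658/day, a roughly 96% cut, which is why
# yield-chasing liquidity left as soon as the first cut took effect.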

Tips

To better communicate with industry insiders, we have decided to add two sections: readers' questions and guests' opinions. If you have questions about stablecoins, please contact us; we will pick meaningful questions to answer in the next issue. We also welcome guests from the industry to share their views on stablecoins. Contact: jianghb@mykey.org.

That's all for this issue of the MYKEY Crypto Stablecoin Report. Stay tuned for follow-up reports, in which we will provide more interpretations of the development status of stablecoins and analysis of their development trends.

PS: MYKEY Lab has the final right to interpret the content of the article, please indicate the source for the quotation. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past review

Crypto Stablecoin Report 14: The increase of Ethereum Gas Fee makes the transfers of stablecoin transactions on the blockchain

Crypto Stablecoin Report 15: The market capitalization of stablecoins increased to $15.961 billion, On-chain usage of stablecoins

Crypto Stablecoin Report 16: The connection between stablecoins and real assets

Crypto Stablecoin Report 17: The market capitalization of stablecoins increased to $17.544 billion, Decentralized payment protocol Celo

The market capitalization of stablecoins increased to $18.53 billion, The rise of CBDC

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

The market capitalization of stablecoins reached $19.86 billion was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Secure Hybrid Access for Azure AD B2C


Should you fall short and get the customer experience (CX) wrong, the penalties are dramatic. All it takes is one bad customer experience for 32% of consumers to completely abandon a brand. But the benefits of getting it right are equally impactful. It's estimated that multi-billion dollar companies that make CX improvements recognize an average of $258M in increased revenue.

 

Wednesday, 23. September 2020

KuppingerCole

IAM Essentials: Identity Data Integration




Seamless Connectivity: Why You Need It and How to Get It Right


Businesses are increasingly embracing new technologies to enhance existing processes and enable new business models and revenue streams through Digital Transformation. Service-based business IT, however, is not without challenges, particularly around access governance and control. Digital Transformation is all about enabling everyone and everything to connect seamlessly to new digital services, to facilitate value exchange while still ensuring regulatory compliance, faster product innovation, secure remote working capabilities, and multi-channel consumer access despite ever-tightening budgets.




Cloud Access Security Brokers


by Mike Small

The KuppingerCole Market Compass provides an overview of the product or service offerings in a certain market segment.  This Market Compass covers CASB (Cloud Access Security Broker) solutions that help to secure the organizational use of cloud services.


Nyheder fra WAYF

Join the WAYF experience-exchange meeting on Zoom!


WAYF is holding an experience-exchange meeting on Thursday, 29 October 2020, from 10:30 to 12:30, possibly continuing from 13:00. Because of the corona crisis, it will take place on Zoom for the first time, as an exciting experiment. A link to the meeting room will be published here on the site ahead of the meeting.


Authenteq

Mastering the abbreviations, keywords, and phrases surrounding digital identity and identity verification


We know that the universe of knowledge surrounding identity verification and security can feel vast. We’re here to help! Consider this glossary your one stop shop, a way to help equip you with the ABCs of KYC. Share it with your friends or colleagues, or let it be your secret weapon as you become the resident expert—we won’t tell. 

The Basics

Biometric is the “measurement of the human body”: the science of analyzing physical or behavioral characteristics specific to each individual.

KYC means Know Your Customer. It is the process of a business verifying the identity of its customer and assessing their suitability.

Identity Verification is the process of confirming a person’s identity. Easy!

Selfie Authentication uses facial recognition and biometric data to allow users to verify and match themselves to the piece of identity used in the verification process.

The Tech

AI stands for artificial intelligence and is the simulation of human intelligence processes, such as learning, reasoning, and self-correction, by machines.

Facial Recognition is a technology capable of identifying or verifying a person from a digital image or video. In identity authentication or verification, this is usually used to compare a selfie with an image on file to see if the faces match.

Machine Learning is a category of algorithms that allows software applications to become more accurate in predicting outcomes without being explicitly programmed.

For the Experts

AML stands for Anti-Money Laundering and refers to a set of laws, regulations, and procedures intended to prevent criminals from disguising illegally obtained funds as legitimate income.

GDPR means General Data Protection Regulation and is the EU law on data protection and privacy for all individuals within the EU and the EEA.

Hashing is meant to verify that a file or data is authentic and has not been altered. It uses an algorithm to map data of any size to a fixed length called a hash value/hash code and is one-way.
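
A minimal sketch of those two properties, fixed length and sensitivity to any change, using Python's standard hashlib:

# A minimal sketch of the one-way, fixed-length property of hashing.
import hashlib

doc = b"contract v1"
print(hashlib.sha256(doc).hexdigest())         # 64 hex chars, always
print(hashlib.sha256(doc + b".").hexdigest())  # any change -> new value
# Comparing stored and recomputed hashes verifies a file is unaltered;
# recovering the input from the hash is computationally infeasible.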

Privacy by design means privacy is incorporated into technology and systems by default when designing a product.

Private key is a type of lock used in asymmetric encryption that is used to decrypt the message by converting the received message back to the original message and is kept secret.

Public key is a type of lock used in asymmetric encryption that is used to encrypt the message by converting it to an unreadable form and is widely distributed.

Spoofing is a fraudulent attempt to obtain someone’s personal information by pretending to be a source known to the receiver. For tips to keep yourself and your company safe, head here.

We will be putting out a comprehensive downloadable version of this glossary with these definitions and many, many more. Follow us on LinkedIn to make sure you get first access!

The post Mastering the abbreviations, keywords, and phrases surrounding digital identity and identity verification appeared first on Identity Verification & KYC | Authenteq.


Ontology

Ontology Weekly Report (September 16–22)


This week Ontology made impressive strides in DeFi as Wing, the cross-chain DeFi platform, initiated its first Flash Pool vote on whether to change the WING reward distribution rate during the WING Mining Celebration — resulting in increased rewards for users who participated in the Flash Pool Mining Celebration. Further, in a move aimed at further increasing WING’s liquidity, the WING/EWING swap was launched in the Wing Flash Pool.

Back-end
- Ontology v2.1.1-Alpha version released
- Rust Wasm contract development hub released ontio-std v0.4

Product Development
ONTO
- ONTO v3.4.0 released
- ONTO v3.5.0 under development, expected to be released next week
- New ONTO users increased 400% from last week

dApp
- 84 dApps now live on Ontology
- 6,117,733 dApp-related transactions since genesis block
- 29,825 dApp-related transactions in the past week

Bounty Program
- 1 new application for the Technical Documentation Translation
- 1 new application for SDKs

Community Growth
- 594 new members onboarded across Ontology’s Sinhala, Hindi and Bengali communities.

Newly Released
- On September 17, Wing DAO, the Wing community, initiated its first vote. Community members were invited to vote on a change to the WING reward distribution rate during the WING Mining Celebration. Over 1/3 of all WING currently in circulation participated in the vote. As a result, the distribution rate changed to two times, compared with the previous ten times, effective from 00:05, September 18th.
- From September 18th to 29th, through the Flash Pool Mining Celebration, users will be rewarded extra incentives. Users who deposit their WING in the Wing DAO Supply Pool during this period will receive a ten-fold increase in rewards between September 18th and 24th, and a five-fold increase in rewards between September 25th and 29th.
- On September 20th (UTC), the WING/EWING swap was launched in Wing's Flash Pool, with the aim of increasing the project's liquidity. Recently, project teams such as OpenOcean.finance also announced that WING-related swaps would soon launch online, opening new channels for WING's circulation and increasing the scenarios where WING can be used.

Global Events
- On September 18th, Jun LI (founder of Ontology) took part in a panel discussion themed “Prospects of Decentralized Finance” at 2020 International Fintech Innovation Conference (IFIC) in Shanghai. During the panel discussion, Li stated that should Ontology wish to stand out in the DeFi industry while avoiding stepping into the transaction-fee issue as faced by Ethereum, it needs to work harder on its strategy to support different charges in scenarios of different technical complexities. He said, “Ontology is determined to integrate all types of data, including real-life identity data, credit data and asset data of various groups of people, on the blockchain to build a ‘Super Oracle’ that provides the whole package of data and identity information in various DeFi services.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 16–22) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Self Key

Yield Farming: The Crowd-Puller for DeFi


Yield farming has been a crowd-puller for DeFi. In this article, we explain what yield farming is and its role in DeFi.

The post Yield Farming: The Crowd-Puller for DeFi appeared first on SelfKey.


MyKey

Announcement: MYKEY Lab issues 100 million KEY on TRON in a migrated way

MYKEY Lab officially announced on September 23 that we have transferred 100 million KEY tokens on Ethereum to the locked address:

0xc4947bf8c74033c7079f6780460e72e82a8df33c

The 100 million KEY tokens locked up will be used to guarantee the issuance of an equal amount of KEY assets on TRON. The total amount of KEY remains unchanged at 100 billion. The official multi-chain exchange is under development. During this period, only bulk exchanges of more than 10 million KEY are available. If needed, please contact the MYKEY official team.
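
Because the lock-up lives on a public chain, anyone can independently audit it. As a rough illustration (not official MYKEY tooling), the sketch below reads the locked balance with ethers.js (v5). The KEY ERC-20 contract address is not given in this announcement, so that constant is a placeholder, and the 18-decimals assumption should be checked against the token contract.

```typescript
import { ethers } from "ethers";

const LOCK_ADDRESS = "0xc4947bf8c74033c7079f6780460e72e82a8df33c";
const KEY_TOKEN_ADDRESS = "0x..."; // placeholder: substitute the real KEY ERC-20 contract

// Minimal ERC-20 ABI fragment; only balanceOf is needed for the audit.
const erc20Abi = ["function balanceOf(address owner) view returns (uint256)"];

async function lockedKeyBalance(rpcUrl: string): Promise<string> {
  const provider = new ethers.providers.JsonRpcProvider(rpcUrl);
  const key = new ethers.Contract(KEY_TOKEN_ADDRESS, erc20Abi, provider);
  const balance = await key.balanceOf(LOCK_ADDRESS);
  // Assuming KEY uses 18 decimals, this should print "100000000.0"
  return ethers.utils.formatUnits(balance, 18);
}
```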

TRC20-KEY contract address:

https://tronscan.io/#/token20/TLzA9pHSLbjbbmbA2p7gyEnvXy4mHbrZPv

Contract code:

https://tronscan.io/#/token20/TLzA9pHSLbjbbmbA2p7gyEnvXy4mHbrZPv/code

MYKEY Lab

2020.9.23

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

Announcement: MYKEY Lab issues 100 million KEY on TRON in a migrated way was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

Thinking of Modernizing Your CA SiteMinder Deployment? Now May Be the Perfect Time

New CA SiteMinder Plug-Ins Enable Coexistence and Just-In-Time User Migration Between SiteMinder and ForgeRock

The Agonizing Decision to Modernize or Stay Put

Whether it’s adapting to the realities of a digital transformation program, addressing problems with scaling to provide access to new apps and services, managing the proliferation of the Internet of Things (IoT), or handling the challenges associated with a growing number of CIAM (customer IAM) and workforce users due to COVID-19, many organizations are currently exploring options to update or “modernize” their existing legacy identity and access management (IAM) systems.

What we now refer to as “legacy” IAM systems are, in fact, platforms developed 10 to 15 years ago, back when most applications were on-premises and built on a client-server model. Fast forward to today, and these systems are struggling to keep up with cloud-first enterprise application needs. Support options are dwindling because vendors and developers are not keeping pace to support the latest feature sets. 

Legacy IAM systems are functionally at their end-of-life. While they may continue to crank along and seem to perform the old workhorse identity functions, they are unable to meet new business needs. For example, updates on these legacy systems are expensive and time-consuming, and stability challenges can arise as more identities and attributes are added. Integrating new apps is cumbersome. As many of these legacy solutions live on premises, the timeframe for new apps to “go live” is often measured in months, if not years. This is not exactly what you would call “agile IT.”

Modern identity platforms, in contrast, are built to truly enable digital transformation, operate at IoT scale, provide continuous security, support cloud and hybrid deployments, seamlessly integrate new applications, and support security models like Zero Trust and the Gartner CARTA (Continuous Adaptive Risk and Trust Assessment) model. They are continuously adding new features and staying on the cutting edge of the market to keep organizations at a competitive advantage.

Not a “Rip and Replace” Decision

But why might now be a good time to consider making this move? Well, for starters, everything is going digital, and your users and customers are expecting an optimized, online digital channel experience. Access problems, slow app performance due to identity-related issues, and poor authentication experiences will send them looking for workarounds, which your competitors are more than happy to provide.

But just as importantly, the decision to migrate isn’t a binary one. Since “rip and replace” is an option for only a few, what’s needed is a more seamless, step-by-step approach that lets you go at your own pace, migrate the apps you need to migrate, and achieve a smooth transition to a modern IAM solution with a committed innovator in this space.

ForgeRock and our partners are here to help. We have assisted countless organizations on the journey from legacy to modern. We stand ready to do the same for your organization, no matter how difficult the challenge or how complex your IAM system may be. We have built a robust set of tools, guides, and documentation to help you make the transition to modern IAM. And it’s all available to you for free. 

Making It Easy: New CA SiteMinder Plug-Ins from ForgeRock

ForgeRock is excited to announce the latest of these tools – a new set of open source CA SiteMinder Plug-Ins joining the existing Oracle Plug-Ins as part of its portfolio of Modernize IAM Accelerators. The CA SiteMinder Plug-Ins enable coexistence and just-in-time user migration between SiteMinder and ForgeRock, so you can migrate at your own pace. For instance, you can choose to migrate 10 apps per week, one app per month, or whatever your organization may call for. ForgeRock Accelerators enable this migration to occur without any disruption to your customers or employees. It’s all done in a totally transparent manner. One of the benefits of this approach is that you can make immediate use of the new capabilities of the ForgeRock platform – like Intelligent Access, self-service trees, and passwordless authentication.

Specifically, the new SiteMinder Plug-Ins for the Modernize IAM Accelerators can help in the following areas:

- Authenticate in SiteMinder and single sign-on (SSO) to ForgeRock
- Authenticate in ForgeRock and SSO to SiteMinder
- SSO to legacy apps integrated via CA agents
- Migration of user profiles
- Secure migration of user passwords
- Just-in-time (JIT) provisioning
- Modular and extensible for easier integration into current environments
- Open source, relying on industry-standard protocols and libraries

In the end, the SiteMinder Plug-Ins are designed to make migration seamless and invisible to the user, while significantly reducing the time to value of designing and building coexistence and user migration strategies.
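
To make the coexistence idea concrete, here is a minimal sketch of the just-in-time migration pattern the plug-ins enable. The interfaces and function below are hypothetical and illustrate only the control flow; they are not ForgeRock’s actual plug-in API.

```typescript
// Minimal sketch of just-in-time (JIT) user migration: validate credentials
// against the legacy store on first login, then transparently provision the
// user (profile and password) into the modern platform, with no password reset.

interface LegacyIdp {
  authenticate(username: string, password: string): Promise<{ profile: Record<string, string> } | null>;
}
interface ModernIdp {
  userExists(username: string): Promise<boolean>;
  createUser(username: string, password: string, profile: Record<string, string>): Promise<void>;
  authenticate(username: string, password: string): Promise<boolean>;
}

async function jitLogin(
  siteminder: LegacyIdp,
  forgerock: ModernIdp,
  username: string,
  password: string
): Promise<boolean> {
  if (await forgerock.userExists(username)) {
    return forgerock.authenticate(username, password); // already migrated
  }
  // First login since cutover: check the credentials against SiteMinder...
  const legacy = await siteminder.authenticate(username, password);
  if (!legacy) return false;
  // ...and migrate profile + password transparently, invisible to the user.
  await forgerock.createUser(username, password, legacy.profile);
  return true;
}
```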


Learn more about modernizing legacy systems here, or contact your ForgeRock sales rep or partner today.

Tuesday, 22. September 2020

KuppingerCole

Information Protection in Cloud Services

Today’s economy is clearly driven by data. The most successful companies are those that can use this data to create useful information that enables them to get closer to their customers, create new products, and be more efficient. Cloud services are a key enabler in this: they allow the capture, storage, and exploitation of vast amounts of data without the need for capital expenditure. They enable the rapid development and deployment of new applications as well as the modernization of existing ones.




Global ID

The GiD Report#127 — Amid TikTok deal, Ray Dalio explains China’s differences

The GiD Report#127 — Amid TikTok deal, Ray Dalio explains China’s differences

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

ICYMI:

In case you missed it, Greg’s latest Linqto panel appearance is on YouTube — his chat alongside Tim Draper focused on the crypto landscape.
Greg’s piece about Apto’s upcoming developer-first, instant issuance program — Partner Spotlight: Introducing Developer First, Instant Card Issuance from Apto

What we have for you this week:

TikTok/WeChat and the fundamental differences between China and the U.S. according to Ray Dalio
“Common Code: An Alliance Framework for Democratic Technology Policy”
We need to rethink privacy and security for pretty much everything post-pandemic
Stuff happens

1. With President Trump giving his blessing to the TikTok/Oracle deal and his WeChat ban running into some legal challenges, Ray Dalio has published a timely post on China, his latest in his Big Cycles series: Chapter 6: The Big Cycle of China and Its Currency

Photo: Web Summit

We won’t dive too deep into this one, but he provides a neat overview on how the Chinese historically think and operate by drilling down on its core philosophies:

The Chinese culture developed as an extension of the experiences the Chinese had and the lessons they learned over the millennia. They were set out in philosophies about how things work and what ways work best in dealing with these realities. These philosophies made clear how people should be with each other, how political decision making should be done, and how the economic system should work. In the Western world the dominant philosophies are Judeo-Christian, democratic, and capitalist/socialist. Each person pretty much chooses from these to come up with the mix that suits them. In China, the main ones were Confucian, Taoist, and Legalist until the early 20th century when Marxism and capitalism entered the mix. The most desired mix to follow has historically been the emperor’s most desired mix. Emperors typically study Chinese history to see how these have worked and come up with their own preferences, put them into practice, learn, and adapt. If the mix works, the dynasty survives and prospers (in their parlance it has the “Mandate of Heaven”). If it doesn’t, it fails and is replaced by another dynasty. This process has gone on from before history was recorded and will go on as long as there are people who have to decide how to collectively do things.
While I can’t do these philosophies justice in a couple of sentences each without digressing too deeply (though I will go into them more deeply in Part 2), here is the best I can do:
Confucianism seeks to bring about harmony by having people know their roles in the hierarchy and know how to play them well starting from within the family (between the husband and the wife, the father and the son, the older sibling and the younger sibling, etc.) and extending up to the ruler and their subjects, with them bound together by benevolence and obedience. Each person respects and obeys those above them, who are both benevolent and impose standards of behavior on them. All people are expected to be kind, honest, and fair. Confucianism values harmony, broad-based education, and meritocracy. Legalism favors conquest and unification of “all under heaven” as soon as possible by an autocratic leader. It believes that the world is a “kill or be killed” jungle in which the strength of the emperor’s central government and strict obedience to it must exist without much benevolence given to the people by the emperor/government. The Western equivalent is fascism. Taoism teaches that the laws of nature and living in harmony with them are of paramount importance. Taoists believe that all of nature is composed of opposites and that harmony comes from balancing them well — yin and yang. This plays an important role in how the Chinese seek the balance of opposites.

And how China’s culture fundamentally differs from the U.S.:

All of these Chinese systems from the beginning of recorded history were hierarchical and non-egalitarian. I was told by one of the most senior Chinese leaders, who is also a highly informed historian and an extremely practical top policy maker, that the core difference between Americans and the Chinese is that Americans put the individual above all else and the Chinese put the family and the collective above all else. He explained that Chinese leaders seek to run the country the way they think parents should run the family — from the top down, maintaining high standards of behavior, putting the collective interest ahead of any individual interest, with each person knowing their place and having filial respect for those in the hierarchy so that the system works in an orderly way. He explained that the word “country” consists of two characters, “state” and “family,” which represents how the leaders view their roles in looking after their state/family — like strict parents. So one might say that the Chinese government is run from the top down (like a family) and optimizes for the collective while the American approach is run from the bottom up (e.g., democracy) and optimizes for the individual. (These differences of approach can lead to policies that those on the opposite side find objectionable, which I will explore in more detail in the next chapter.)

These are fundamental differences that are hard to reconcile. Extrapolating this framework to the ongoing conflicts around the internet and platforms makes you think that existing divisions will only deepen without some sort of new approach.

2. And so you have this:

Axios:

A group of researchers from Europe, the U.S. and Japan are proposing a “tech alliance” of democratic countries in response to the Chinese government’s use of technology standards and its tech sector as instruments of state power abroad, according to a version of the proposal viewed by Axios.
The proposal, called “Common Code: An Alliance Framework for Democratic Technology Policy” to be published next month, recommends that founding members of the new tech alliance include Australia, Canada, France, Germany, Italy, Japan, South Korea, Netherlands, United Kingdom and the U.S., as well as the European Union.
What they’re saying: “The status quo of uncoordinated and reactive technology policymaking for the major democratic technology powers in Asia, Europe and North America means growing risk of ceding their technological leadership,” the authors write.
3. All of this of course coincides with the ongoing shift to digital as a result of the pandemic.

And there’s plenty to figure out on that front as well — Axios:

The shift to online schooling is running roughshod over children’s privacy rules and rights and creating new inequalities, experts tell Ashley Gold.
The big picture: Minors are the only group that enjoys federal online privacy protections in the U.S., but that’s not enough to protect their privacy rights as school districts and teachers scramble to move all classwork to the internet amid the pandemic.
Why it matters: These rules aren’t just technicalities. If companies don’t maintain rigorous data security and privacy practices, children’s personal information, photos and video could end up in the wrong hands.
What’s happening: “We’re trying to take brick-and-mortar school and shove it onto the internet, and the two things just aren’t compatible,” Karen Richardson, executive director for the Virginia Society for Technology in Education, told Axios.
4. Stuff happens:

Cory Doctorow railing on GDPR
Facebook introduces a co-viewing experience in Messenger, ‘Watch Together’ — TechCrunch
Via /jvs: US Postal Service Files A Patent For Voting System Combining Mail And A Blockchain
Via /jvs: Online Payroll Services for Small Businesses | Square
Talking Tech and Holograms with Mark Zuckerberg!
Apple’s Reckoning Has Come.
Kraken Becomes First Crypto Exchange to Charter a US Bank — CoinDesk
Scoop: Facebook is building feature to combine business messaging
Why International Identity Day is so important for us to support and celebrate
Bitcoin CEO: MicroStrategy’s Michael Saylor Explains His $425M Bet on BTC — CoinDesk
Leaked EU Draft Proposes All-Encompassing Laws for Crypto Assets — CoinDesk
WSJ News Exclusive | FTC Preparing Possible Antitrust Suit Against Facebook
Chinese firm harvests social media posts, data of prominent Americans and military

The GiD Report#127 — Amid TikTok deal, Ray Dalio explains China’s differences was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Civic

ICYMI: Health Key by Civic at the UNWTO Forum

Skip to 1:25:36 to see Civic’s presentation.

Recently, our CEO, Vinny Lingham, joined a forum hosted by the United Nations World Tourism Organization (UNWTO) to present an overview of Health Key by Civic. The forum, called “Tourism Tech Adventures on Aviation Technologies,” showcased the best ways to jumpstart the tourism industry as the world evaluates safe reopenings. The UNWTO was founded to cultivate modern tourism, which is closely linked to socio-economic progress around the world. 

Health Key by Civic offers one way to provide safe passage abroad. The solution can help travelers keep their identity and health information private, while allowing organizations to create safe spaces on airplanes, in hotels, and at large gatherings. It is built to comply with GDPR, CCPA, and other international data privacy regulations.

For travelers, providing health status verification to gatekeepers is as easy as getting a vaccine from a qualified partner and then setting up Health Key by Civic in our Civic Wallet app. When a user signs up, they are authenticated as a real person, using both AI and blockchain-based technology. Civic does not take custody of personal data or medical records. 

For businesses, Health Key by Civic streamlines the management of entry points. This is critical because there are complicated logistics that the industry will need to navigate. First, there are many vaccines currently undergoing testing with varying efficacy rates. Second, there are 172 economies collaborating on distribution. This means that countries and organizations will need to confirm that an individual has received a vaccine, whether the vaccine received is effective, and which local entry point regulations must be followed around the world. It’s a problem that may be solved with thoughtful technology.

While we’ve been working hard to provide the right kind of technology solutions, we also know that collaboration on health status verification is essential. Civic partnered with Circle Medical to supply testing for interested San Francisco Bay Area companies. Additional regional testing partners will be coming online to support customers in other locales. Together, we seek to give individuals control over their own credentials and facilitate economic rebuilding.

And, along with collaborating in the medical and international communities, Civic has been making progress on other fronts. Recently, we’ve been working with California lawmakers on a bill that will provide privacy-oriented guidelines for proof-of-health technology.

To learn more about proof-of-health and the travel industry, we invite you to watch the UNWTO forum and check out Health Key by Civic.

The post ICYMI: Health Key by Civic at the UNWTO Forum appeared first on Civic Technologies, Inc.


KuppingerCole

5 Key Benefits of Marrying IGA and ITSM

For today’s companies, IT service management is more than IT support. ITSM is about working behind the scenes to help employees to do the work that drives your business – providing a one-stop shop for service needs, upgrades, improvements, and asset management.

If IGA doesn’t play a critical role in your ITSM strategy, it should. We frequently hear from customers who are looking to better align IGA and ITSM, and our conversations with the analyst firm KuppingerCole often focus on this topic as well.

Simply put, it just makes sense to marry IGA and ITSM. There’s a relationship between every service that an employee can access and the role of that employee within your organization. Sometimes it’s a simple matter of convenience, such as automatically granting new hires in Sales access to the sales software. Sometimes it’s a matter of security and risk management, so employees with privileged accounts can safely view mission-critical data when they aren’t on the physical network.

On Oct. 1, I’ll be delivering a keynote at the KuppingerCole IGA Solutions for ServiceNow Infrastructures virtual event focused on the alignment of ITSM and IGA. At Clear Skye, we’re well positioned to talk about this, since we’re an IGA solution built on top of the ServiceNow platform – running natively without the need for custom integrations. Here’s a look at five key ways your company performs better when IGA and ITSM are able to work hand in hand.

Read Martin Kuppinger's Executive View | Clear Skye IGA: IGA on the ServiceNow NOW Platform

Self-Service Requests. It’s one thing to automate service requests such as account or group access. Why not take that to the next level and apply logic to the request based on a user’s role? Say a VP of Operations in the Dallas office requests permission to join one Active Directory group. There are likely other groups for Operations personnel, VPs, and the Dallas office that she should also join – as well as legacy Active Directory groups that she should be steered away from. The same is often true for many other types of requests. If an employee wants access to the travel booking software, for example, he probably wants access to the expense reporting tool, too. Identifying and taking action on similar requests simultaneously will save time for users and ITSM alike. As an added bonus, your end-users will be using an interface that they already feel comfortable with: ServiceNow. That means increased productivity and faster time to value.

Separation of Duties. If a software engineer based in Asia is logging onto the network in the middle of the night to access a dev server, that’s not unusual. If a VP of Accounting is logging on from New York at 3 a.m. to access financial software, well, that’s a problem. Restricting access to certain systems – by requiring higher-up approval or denying access outright – can prevent activity both nefarious and accidental (such as an HR rep accidentally attaching the salary spreadsheet to a company-wide email). You could also take this level of access a step further – and protect valuable assets to boot – by restricting access to certain systems unless a user has completed security training, installed required patches, or updated potentially vulnerable systems, all by matching their privileges to the systems they have access to and the level of security those systems require. The ServiceNow Common Services Data Model (CSDM) gives you visibility and alignment in data value, utilization, hygiene, compliance, governance, risk and security.
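
As a rough illustration of this kind of contextual check, a central decision function might combine role, system sensitivity, and local time. The types and rules below are hypothetical, not Clear Skye or ServiceNow APIs.

```typescript
// Illustrative sketch of time/role restrictions on sensitive systems.
interface AccessRequest {
  role: string;          // e.g. "vp-accounting", "software-engineer"
  system: string;        // e.g. "financial-erp", "dev-server"
  hourLocal: number;     // 0-23 at the user's location
  securityTrainingDone: boolean;
}

type Decision = "allow" | "deny" | "escalate";

function decide(req: AccessRequest): Decision {
  const sensitive = ["financial-erp", "hr-payroll"].includes(req.system);
  if (sensitive && !req.securityTrainingDone) return "deny"; // protect valuable assets
  if (req.system === "hr-payroll" && req.role !== "hr-rep") return "deny"; // simple SoD rule
  const offHours = req.hourLocal < 6 || req.hourLocal > 20;
  if (sensitive && offHours) return "escalate"; // require higher-up approval
  return "allow";
}

// A 3 a.m. request from accounting to the financial system gets escalated:
console.log(decide({ role: "vp-accounting", system: "financial-erp", hourLocal: 3, securityTrainingDone: true }));
```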

Incident Management. Responding to open tickets is often a matter of hunting for valuable information such as what permissions a user has, what software versions a user is running, or whether that security training was completed. When you bring together ITSM and IGA, the analyst working on a ticket can pull up this information right away and solve the problem – along with identifying other issues that a user may need to address, such as expiring credentials. For organizations with thousands of employees, this improved efficiency can quickly save a full FTE on the Help Desk.

Change Management. Knowing the roles and privileges of all end users who will be impacted by a software upgrade or hardware replacement helps organizations manage risk, plan backup support, schedule outages, and identify vulnerabilities. This has the clear benefit of minimizing the disruption to critical business services. It also provides predictive and retrospective insight into which changes can be treated as low-risk repeatable actions and which changes require more extensive preparation, as well as knowledge of which roles are most likely to request which changes. Instead of being a hassle, change management can become an efficient and differentiating process for your organization.

Mobility. Employees who respond to incidents or manage deliveries on a large campus rarely have a physical office, so mobile apps are critical for their day-to-day role. They’ve also become a lifeline for remote employees who are juggling school and family care and need the flexibility of mobile solutions. Marrying IGA and ITSM on the Now Platform is more than just being able to offer SSO for multiple mobile solutions – it’s giving employees a single point of entry for those solutions, one that’s maintained by a trusted partner in ServiceNow.

Looking to the Future

The ultimate goal of marrying IGA and ITSM is taking a more holistic approach to enabling your people and teams. Instead of the stimulus-and-response process of opening and closing tickets to fix a single issue, it’s being able to draw on a wealth of data that’s no longer sitting in silos to automate when possible and evaluate risk when necessary – whether it’s risk from outdated security settings, risk from the combination of a user’s privileges, or a bit of both. This enables an organization to manage the present and plan the future, whether it’s updating critical systems, culling obsolete Active Directory groups, or updating admin privileges.

Few organizations have truly reached this point, however. Instead of breaking down silos and leveraging single control planes and common interfaces, the market has primarily focused on integrating and synchronizing data and workflows across two or more disparate systems. This only adds complexity – which is the last thing ITSM needs.

At Clear Skye, we believe there’s a better way. Register to join our keynote discussion on Oct. 1 and learn more about our vision to bring IGA and ITSM together natively on the Now Platform.


Ontology

Citadel.one Joins Ontology’s Global Community Contributor Node Network

We are delighted to announce that Citadel.one has officially joined the Ontology Global Community Contributor (GCC) network, a program geared towards accelerating the process of applying blockchain technology to tangible business use-cases.

Citadel.one is an all-in-one interface for decentralized finance (DeFi) that allows users to participate in DPoS consensus, an implementation where all stakeholders receive rewards depending on their degree of contribution. The platform also allows users to create native public addresses for all supported networks with one seed phrase. On top of that, by allowing users to link hardware devices like Trezor or Ledger, Citadel.one is able to support users with a built-in dashboard including features such as portfolio overview, chart analytics, rewards history, transaction history and status tracking.

As a result of this partnership, the Ontology Foundation will provide an allocation of ONT tokens to the Citadel.one node so that Citadel.one can start its node in the GCC program.

The Citadel team is committed to contributing to the building of a decentralized infrastructure for our world. This is exemplified by their smart-voting system, which allows users to participate in the life of the community by creating polls and tracking data. Citadel.one also allows users to track their assets across multiple networks, incorporating proper reward calculation and useful analytics dashboards along the way. In addition, received funds are automatically staked, which results in less time wasted and higher efficiency — a priority shared by both Ontology and Citadel.one.

With regard to the announcement, Ontology Co-founder Andy Ji said, “It is with great excitement that we announce the addition of Citadel.one to our network. Citadel.one is a remarkable Proof-of-Stake platform with whom we are delighted to partner in pursuit of our shared ambition of building the decentralized infrastructure of the world. Their innovative platform brings forward a new way of efficiently storing wealth, among much more, and we are delighted to be collaborating with their talented team on this highly worthwhile effort.”

If you have any questions or feedback, please contact us via contact@ont.io. If you’re interested in applying to become an Ontology GCC, please email gcc@ont.io with a short description of your platform’s service offering and GCC aspirations.

About Citadel.one

Citadel.one is a non-custodial Proof-of-Stake platform for the management and storage of crypto assets. Users can create public addresses for all supported networks with one seed phrase, connect their Ledger or Trezor device, or import an address generated by another wallet. The analytical dashboard provides relevant information on wallets’ balances and networks’ main metrics. One of the main functions of the Citadel.one platform is participation in the PoS consensus — users can stake and delegate their assets, claim rewards, and follow the latest network proposals in the voting tab.
Citadel.one offers its users instant cryptocurrency exchange services that allow fast and secure crypto asset swaps; it is also possible to buy and sell crypto with a credit or debit card. Among PoS platforms, Citadel.one supports Cosmos ($ATOM), ICON ($ICX), IOST, Orbs and Tezos. For users’ convenience, the platform also supports Ethereum, Bitcoin and Tether ($USDT). Mobile and desktop versions and new networks, including Polkadot and Ontology, are scheduled for an upcoming update. Citadel.one has a growing community across all major social networks including Telegram, Twitter, Medium, LinkedIn, and more.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Citadel.one Joins Ontology’s Global Community Contributor Node Network was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 21. September 2020

Bloom

Roadmap to Unsecured DeFi Lending and Credit Scoring

We believe the decentralized identity layer is finally mature enough to enable the next phase of development in DeFi. In an upcoming series of posts we will dive into the components that need to be built to enable unsecured DeFi lending, and lay out our vision for how to get there.

What has Bloom been up to?

For the last two and a half years, Bloom has been working towards the vision of a financial services ecosystem that is open and accessible to all individuals, regardless of their level of inclusion in the traditional credit system.

In our original whitepaper, we outlined a multi-phase approach to achieve a decentralized credit system: First, we needed an identity layer that allowed individuals to receive, store, and share verified attributes about themselves. Next, we built a competitive marketplace to efficiently seek prices for identity verifications. Finally, with the first two pieces in place, we could build a system to aggregate data and build open risk assessment systems.

In April 2018, we built a discovery marketplace based on the BLT token, using a combination of smart contracts and Ethereum Whisper, which enabled vendors and individuals to find each other, in order to facilitate payment for identity verification related work. Shortly thereafter, we engaged in negotiations with data vendors to get the right data into the Bloom ecosystem and built connectors that enabled users to pull in data about their phone numbers, email addresses, watch-list screens, social media, identity documents, income & assets.

First Key Lesson: Ecosystem Data

Attaining vendor participation in Bloom’s marketplace presented challenges. On the one hand, most vendors require custom enterprise contracts with monthly or yearly minimums. On the other hand, the scalable approach to these marketplaces requires transaction-based pricing. The vendors simply didn't understand the potential in a decentralized data marketplace. So, we got to work on building the demand for that data, with the goal of re-introducing the marketplace at the right time.

Second Key Lesson: Built-In User Privacy

By the EthSanFrancisco Hackathon in Fall 2018, Bloom had a fully functional end-to-end flow: users could sign up, generate a unique identifier, receive credentials, and share them with any third party that integrated Bloom’s free-to-use Share Kit. The overwhelming number of project submissions (Lending Party, SendPut, BlocuSign) at that first hackathon led the Bloom team to expand Bloom’s vision of the identity layer.

Users wanted the ability to selectively disclose identity traits, share data without being tracked by an Ethereum address, and pull in traits we hadn't considered before. Our team responded to that need by building the Selective Disclosure Merkle tree spec.
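To illustrate the general idea behind selective disclosure with a Merkle tree, here is a toy sketch, not Bloom’s actual spec: salted claims are hashed into leaves, the issuer signs the root, and a holder can reveal one claim plus sibling hashes without exposing the rest.

```typescript
import { createHash, randomBytes } from "crypto";

const sha256 = (data: Buffer): Buffer => createHash("sha256").update(data).digest();

// Build leaves from salted claims (salts prevent guessing the hidden claims).
const claims = ["name=Alice", "dob=1990-01-01", "email=a@example.com", "country=NZ"];
const salts = claims.map(() => randomBytes(16));
const leaves = claims.map((c, i) => sha256(Buffer.concat([salts[i], Buffer.from(c)])));

// Two-level tree for 4 leaves: root = H(H(l0|l1) | H(l2|l3)).
const left = sha256(Buffer.concat([leaves[0], leaves[1]]));
const right = sha256(Buffer.concat([leaves[2], leaves[3]]));
const root = sha256(Buffer.concat([left, right])); // the issuer signs this root

// Reveal only claims[1]: disclose the claim, its salt, leaves[0], and `right`.
const recomputed = sha256(Buffer.concat([
  sha256(Buffer.concat([leaves[0], sha256(Buffer.concat([salts[1], Buffer.from(claims[1])]))])),
  right,
]));
console.log(recomputed.equals(root)); // true: claim verified, other claims stay hidden
```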

Third Key Lesson: No Point in Doing This Alone

In February 2019, our team went to EthDenver to share the latest Share Kit improvements and review the next round of feedback and submissions from developers (check out the workshop video here). When we arrived, we learned the conference was using uPort decentralized identity wallets to register attendees for events. We each downloaded uPort, claimed a credential that verified our email addresses, and shared them by scanning a QR code. We realized that interoperability would have enabled us to register for the events using the Bloom app. Clearly, we needed to expand BloomID to be part of the wider decentralized identity community! This realization compelled the Bloom team to join the Decentralized Identity Foundation and start collaborating with the 100+ other companies working to make decentralized identity a reality.

How has Bloom decentralized the identity layer?

Since Bloom began, we have migrated the protocol to adopt the most open, interoperable, and extensible proposals from the community.

- BloomID -> DID (Decentralized Identifier)
- Bloom Attestations -> Verifiable Credentials
- Bloom Vault -> Encrypted Data Vault & Secure Data Store community
- Bloom Selective Disclosure Merkle Tree -> Mattr BBS+ signatures (among others)
- Bloom Attestation Smart Contracts -> W3C RevocationList2020 spec (among others)
- Bloom Share Kit -> Presentation Exchange Specification & Credential Handler API (CHAPI)
- Bloom Share Kit Payload -> Verifiable Presentations

More than 60 DID methods have been invented, none by Bloom. Bloom products have adopted the Element specification which is supported by multiple companies and is based on a protocol called Sidetree, maintained by Transmute and Microsoft, among others.

Why do we think the identity layer is ready?

The decentralized identity space is rapidly maturing, and developer tooling is becoming more and more accessible. The following examples illustrate an industry shift to make decentralized identity significantly more approachable for companies:

- Tutorial from Microsoft on generating DIDs & Verifiable Credentials using Azure
- Mattr tutorial for issuing credentials

What is the role of the Bloom App?

Think of the Bloom App as your DeFi credit utility. We aim to provide users with the best on-ramp experience to DeFi with vetted service providers, and access to all the data you need to get started. We want to open up the world of DeFi savings & lending products to traditional users. That's why in the app today, you'll find an experience that is familiar and close to many other free credit monitoring applications, with a live integration with TransUnion for credit scoring, and breach monitoring powered by Have I Been Pwned.

Why isn't there already an affiliate marketplace in the app?

Some users may remember we briefly had a BloomID based marketplace in the app, but restrictions from Apple forced us to move this online. We will be revamping our approach to this marketplace with instant onboarding opportunities for users, in a way that also allows us to stay within Apple's guidelines.

What do we need to make it happen?

We need to call on all of our partnerships from the early days of crypto, as well as newer companies, like Teller, that are working towards the vision of accessible DeFi products for consumers.

At the same time, we have prepared ourselves for the next step.

We have built the first specification and reference implementation for a Verifiable Credential Aggregator (BloomIQ) that solves the issue of user-controlled aggregation without self-censoring negative signals. In an upcoming series of DeFi-related posts, we will expand on BloomIQ and how we plan to roll it out as part of our journey to enabling unsecured DeFi loans and credit-based crypto on-ramps.

Sunday, 20. September 2020

KuppingerCole

KuppingerCole Analyst Chat: Business Resilience Management Part II

Warwick Ashford and Matthias Reinwarth talk about business resilience again, focusing on cyber supply chain risk management.





KCLive Award: Best IAM for Mid-Market Project



Saturday, 19. September 2020

Self Key

Governance Tokens: DeFi Giving Control Back to the Users

SelfKey Weekly Newsletter

Date – 16th September, 2020

In this edition, read about Governance tokens and how it is democratizing crypto.

The post Governance Tokens: DeFi Giving Control Back to the Users appeared first on SelfKey.

Friday, 18. September 2020

procivis

Online Session: Enhancing resilience in the Digital State Meeting of Thursday, 3 September 2020

Presentation by Member of the European Parliament Eva A. Kaili, Chair, Panel for the Future of Science and Technology, European Parliament, Belgium

Presentation by Dr Gianluca Misuraca, Senior Scientist, European Commission, Joint Research Centre, Institute for Prospective Technological Studies, Spain

The Procivis Think Tank meeting of 3 September 2020 took place against the backdrop of a corona flare-up and new travel restrictions in Europe. Accordingly, at short notice we switched to our first-ever webinar format.

Our first speaker, MEP Eva Kaili, shared her views against the background of the announcement of the EU’s €750 billion “Next Generation” pandemic recovery plan in addition to its new long-term budget of €1,100 billion. She highlighted the EU’s priorities – “green and digital” and security in North Africa/Middle East.

How did the European parliament operate during the lockdown?

Eva Kaili explained that ballots were submitted via email. This illustrates how one always makes do with the technology that one has during an emergency. Projects were under way to make the system more resilient. During the Q&A with Eva Kaili it was also helpful to be reminded again that the EU only has the competences that were conferred to it by member states in treaties.

Shaping Digital Europe 2040

Gianluca Misuraca presented his research on “Shaping Digital Europe 2040: Artificial intelligence & public sector innovation in a data-driven society”. It was interesting to see that the EU is funding such long-term research. Looking ahead to 2040, he presented scenarios based on a sliding scale of digital participation by citizens vs. the degree to which this participation was regulated.  The digital future might not be knowable, but we can imagine its contours, challenges and opportunities.

The post Online Session: Enhancing resilience in the Digital State Meeting of Thursday, 3 September 2020 appeared first on Procivis.


Meeco

Meeco announces KidTech partnership with Heder to co-develop safe media platform for kids

Meeco has some very happy news to share about a new partnership with Heder VZW and a new KidTech product planned to launch December 2020. But first, we would like to share a little about why we believe this is so important. Back in April 2019, we presented our thoughts at the World Government Summit on the digital future for Kids.
“Every child born in today’s digital world will live a dual physical and digital life. How we enable privacy and security for their digital twin is as important as the rights afforded to them in the physical world. If we don’t get this right, we risk a generation born into digital slavery”

– Katryna Dow, Meeco
World Government Summit, 2019
Sadly, a year on from the Summit, our concerns are evidenced all too often. This week brought the news that the parents of 5 million British children under the age of 13 are mounting a $3 billion legal battle with the tech giant Google. At the heart of their claims is the desire to protect the digital rights of their children. The lawsuit alleges that YouTube’s method of targeting underage audiences constitutes “major breaches” of UK and European privacy and data rules designed to protect citizens, especially kids.

This month NETFLIX released The Social Dilemma, a must-watch documentary on the social media addiction that is both alluring and alienating for each of us, especially our children. The documentary includes alarming statistics on the increase in childhood depression, self-harm and suicide. Most alarming is that the age group impacted is getting younger and younger, and this is directly linked to their early access to social media. These platforms make it all too easy for underage children to access and be subjected to unfiltered feedback that contributes to feelings of inadequacy, dissatisfaction and isolation.
“There has to be a better way for us to enable children to participate in the digital world, along with making their first digital experiences safe.”
– Katryna Dow, Meeco
Enter Heder, a not-for-profit organisation based in Antwerp, Belgium. Heder provides care and guidance to improve quality of life for children with different abilities, along with their families and extended networks.

Heder’s philosophy is to always start from present strengths: finding small things, or looking for bigger solutions, that enable people to focus on their unique strengths; fostering pride and allowing people to experience joy from everyday things. Cultivating this from early childhood, together with support from family and community, really makes a difference. Heder’s multidisciplinary experts work together to give support in different ways, including creative play, adjusted sports, training, and physical and psychological therapy. Heder offers services in the home, at day-care, at school or on campus, and these are just some of the ways Heder contributes to a more inclusive society.

And now, there’s another way. Drawing on Heder’s philosophy of building on strengths, the idea for mIKs-it was born. After a number of years of research, the inspiring vision of Gaby Pereira Martins (Masters in Educational Sciences) and Kim Struyf (Director of Early Development), supported by Heder’s Senior General Manager, Erik Van Acker, will now become a reality.
“Together with Meeco we can give mIKs-it a strong foundation”
– Kim Struyf, Heder
Meeco is honoured to announce our partnership with Heder to co-develop a media platform for kids, focussing on the developmental stages of 0-7 years. The platform is called mIKs-it. It has been designed to foster joy and connection to everyday life for kids, with privacy, security and control as the foundation.

Early childhood is the most vulnerable time in a child’s development, as children are totally dependent on family and society for nurturing and protection. It is also likely the period when they will first come into contact with technology. Whether it is playing with their parents’ phone, watching YouTube or listening to content, all these media choices present new challenges to navigate in order to keep children safe.

In designing mIKs-it we had the privilege to work with Trendwolves, who validated the design, usability and kids’ experience through a series of research projects. The research was led by Maarten Leyts, the founder of Trendwolves. Maarten brought unique insight to the project, given his expertise in KidTech. Many of the issues we wanted to understand are addressed by Maarten in his upcoming book “Generation Alpha in Beta”, due for release later this year.
“Today’s and tomorrow’s kids swipe before they draw. Say hi to Generation Alpha. But we’re facing a challenge here. Fully immersed in technology during their formative years and in a fast-changing world, differentiates Gen Alpha from previous generations.”
– Maarten Leyts, Trendwolves
Working with Trendwolves was the obvious choice, as they specialise in trend research focusing on families and global youth culture. Their insights help shape and validate product design, always focussing on how we can enhance meaningful human relationships. Following a successful proof-of-concept completed earlier in the year, the decision to develop the platform was clear. The results of the research, and especially observing kids with a wide range of abilities interact with personal content, touched our hearts and motivated our action, especially as the study included kids with a diverse range of physical and developmental abilities and focussed on media formats for kids with reduced motor skills or who are visually or hearing impaired.
“Meeco’s guiding principle in the digital world is the same as Heder’s principle for the physical world: children are entitled to a safe and positive climate in which to flourish.”
– Gaby Pereira Martins, Heder
mIKs-it has been designed to provide a safe digital place for children to interact with multimedia: photos, video and audio files. The platform is developed using Meeco’s globally awarded technology, with content secured and encrypted using the same approach Meeco developed to protect bank applications, like the digital vault for KBC Bank in Belgium.

mIKs-it is a family platform and not a social network. There are two companion apps: a media app for kids, supported by an app for their trusted grown-ups to manage connections, consent and content. The features of mIKs-it are all the things that it doesn’t do, such as:

- no ads
- no tracking
- no manipulation
- no unauthorised access to your media
- no contacts without your approval
- no content without your consent
- no data mining

Most importantly, the control is always with parents and guardians, and personal data is never sold!
“Heder and Meeco is a purpose driven partnership, our shared goal is to enable a more empowering digital experience for all children.”
– Erik Van Acker, Heder
If you would like to know more about our progress over the coming months and the launch you can register your interest at the mIKs-it website.
We realise there’s still a lot to do to help our kids develop healthy and empowering digital habits, but our hope is that mIKs-it is a small but meaningful step in the right direction. Now we invite you to join us on this most wonderful adventure, into the land of mIKs-it: a playground where children and their grown-ups can connect and safely share snippets of everyday life and build lasting memories.

The safe multi-media app for kids

Made with love by Heder & Meeco

The post Meeco announces KidTech partnership with Heder to co-develop safe media platform for kids appeared first on The Meeco Blog.

Thursday, 17. September 2020

MATTR

Using privacy-preserving ZKP credentials on the MATTR Platform

MATTR is proud to announce we’ve added support for privacy-preserving verifiable credentials on our platform using BBS+ signatures. Using a technique to implement selective disclosure, we’ve added the ability to generate credentials that support zero knowledge proofs without revealing any unnecessary information about the end-user, or placing any added burden on issuers, in the process. Since we first introduced and open-sourced JSON-LD BBS+ Signatures at IIW30 in April of this year, we’ve received lots of engagement, feedback and contributions from the broader technical community to further develop the implementations and specifications we presented. You can read more about our approach to privacy-preserving verifiable credentials on our introductory blog post.

One of the benefits of using the BBS+ cryptographic scheme to sign credentials is the ability to derive a zero knowledge proof from the signature, where the party generating the proof can choose to partially disclose statements from the original message. When enabled, this feature allows issuers to create a credential that effectively enforces minimal data disclosure using the MATTR Platform and a compliant digital wallet.

Issuers can create ZKP-enabled credentials that allow the user to selectively disclose data

To support this functionality, we generate the keys required to support these signatures and create a Decentralized Identifier (DID) with the keys referenced in the DID Document. BBS+ signatures require what’s called a pairing-friendly curve; we use BLS12–381. This DID can be referenced in credentials to establish the issuer of the data, a common practice that allows a verifier or relying party to trace the root of trust in a credential.

To issue a ZKP-enabled credential, simply use our API endpoint to create a new DID Key with type set to BLS 12–381. Then, create a Verifiable Credential (VC) using your new DID Key as the issuer DID. Our platform will automatically detect this capability is available in your DID and create a ZKP-enabled BBS+ credential for you. You can use the platform this way to create a privacy-enabled credential, or you can create a regular credential by providing a DID with a different key type — you have the option.
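
As a sketch of that two-step flow: the endpoint paths and payload shapes below are assumptions inferred from the description, not verbatim from MATTR’s API reference, so consult MATTR Learn for the exact contract.

```typescript
const TENANT = "https://your-tenant.platform.mattr.global"; // hypothetical tenant URL

async function issueZkpCredential(token: string) {
  // Step 1: create a did:key whose key type supports BBS+ (BLS 12-381).
  const didRes = await fetch(`${TENANT}/v1/dids`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ method: "key", options: { keyType: "bls12381g2" } }),
  });
  const { did } = await didRes.json(); // response shape is an assumption

  // Step 2: issue a credential with that DID as issuer; the platform detects
  // the BLS key and produces a ZKP-enabled BBS+ credential automatically.
  const vcRes = await fetch(`${TENANT}/v1/credentials`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      issuer: { id: did, name: "Example Issuer" },
      type: ["VerifiableCredential"],
      claims: { givenName: "Alice", dateOfBirth: "1990-01-01" },
    }),
  });
  return (await vcRes.json()).credential;
}
```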

On the user side, you can hold ZKP-enabled credentials in your wallet alongside all of your other credentials. We’ve designed this process in a way that minimizes friction to the user. In future updates, our Mobile Wallet App will be able to detect if BBS+ signatures are being used in a credential. When you get a request to verify some information contained in one of these privacy-enabled credentials, it will derive a new presentation that selectively discloses the required info using a zero-knowledge proof. The platform will then allow verification of the proof using the same interface as any other type of presentation.

Our integrated approach treats zero-knowledge proofs as an extension of VCs, rather than an entirely new framework with a separate set of dependencies. We have built BBS+ Signatures and privacy-enabled credentials into our platform for anybody to experiment with, in what we think is a significant milestone for standards-based credential solutions on the market today.

As a technology, BBS+ digital signatures can be used to sign more than just verifiable credentials. Combining these technologies is quite effective, though they can also be treated as modular or separate components. We’ve open-sourced software for creating and verifying BBS+ signatures in browser environments as well as node.js, and we’ve also published a library for generating BLS 12–381 keypairs for signing and verifying BBS+ Signatures.
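
For a flavor of those open-source libraries, here is a sign / selectively-disclose / verify sketch. Function names follow the @mattrglobal/node-bbs-signatures README at the time of writing; treat the exact signatures and message encodings as assumptions and check the repository.

```typescript
import {
  generateBls12381G2KeyPair,
  blsSign,
  blsVerify,
  blsCreateProof,
  blsVerifyProof,
} from "@mattrglobal/node-bbs-signatures";

async function demo() {
  const keyPair = await generateBls12381G2KeyPair();
  const messages = [
    Uint8Array.from(Buffer.from("name: Alice", "utf8")),
    Uint8Array.from(Buffer.from("dob: 1990-01-01", "utf8")),
  ];

  // Sign both messages with a single BBS+ signature.
  const signature = await blsSign({ keyPair, messages });
  const { verified } = await blsVerify({ publicKey: keyPair.publicKey, messages, signature });

  // Derive a proof that reveals only message 0; the verifier never sees the dob.
  const nonce = Uint8Array.from(Buffer.from("verifier-supplied-nonce", "utf8"));
  const proof = await blsCreateProof({ signature, publicKey: keyPair.publicKey, messages, nonce, revealed: [0] });
  const result = await blsVerifyProof({ proof, publicKey: keyPair.publicKey, messages: [messages[0]], nonce });
  console.log(verified, result.verified);
}
```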

By leveraging pairing-friendly elliptic-curve cryptography in the context of Linked Data Proofs, our approach provides an unprecedented way to perform zero-knowledge proofs using the semantics of JSON-LD. This allows credential issuers to tap into vast data vocabularies that exist on the web today, such as schema.org and Google Knowledge Graph, making user data more context-rich without sacrificing security and privacy of the user in the process. Not only is this approach more interoperable with existing implementations of the VC data model and semantic web technologies, it also doesn’t rely on any external dependencies to operate (like a distributed ledger), meaning it’s far more efficient than other approaches based on CL-signatures and zk-SNARKs. We’ve open-sourced our LD-Proofs suite for VCs including performance benchmarks so you can check it out yourself.

We’re excited to finally make these powerful privacy features easily accessible for everyone, and we can’t wait to see what you build with it. To get started, sign up now on our website and follow our tutorials on MATTR Learn to start creating ZKP-enabled verifiable credentials on the MATTR Platform.

Additional Links

Open-source:

- Node JS BBS+ Signatures — BBS+ signatures implementation for node.js environments
- WASM JS BBS+ Signatures — BBS+ signatures implementation for browser & node.js environments
- BLS 12–381 Key Pair JS — crypto keys for signing/verifying BBS+ signatures
- BBS+ JSON-LD Signatures JS — uses BBS+ signatures & BLS 12–381 keypairs in a Linked Data Proofs suite (for use in VC implementations)

Specifications:

- BBS+ JSON-LD Signatures Spec — specifies a linked data suite for BBS+ signatures
- BBS+ Signatures Spec — definition of the BBS+ signature scheme

The article Using privacy-preserving ZKP credentials on the MATTR Platform appeared first on MATTR.


KuppingerCole

Matthias Reinwarth: Beyond Static Access - Leveraging Access Policies To Deal With The Increasing Complexity Of Access Governance




Olivier Schraner: Adapting IGA to Your Digital Agenda

As more products become digitally presented and delivered, process agility increases and the requirements against IGA solutions change significantly. Established patterns need to be shed, and new approaches to governing your human and robotic workforce become essential. This talk looks at the evolution of IGA requirements in the face of rapid business transformation, and explores different approaches to solving new challenges while keeping the enterprise safe and compliant.




Interview with James Taylor




Mans Hakansson: Modernizing IAM - Implementing Policy Based Access Management & Governance

In this session PlainID will discuss how organizations can rethink, redesign and modernize their Identity and Access Management (IAM) architecture by implementing PBAC (Policy Based Access Control). This should be a central service supporting not just one specific set of applications, but rather acting as a focal point (or a “brain”, if you like) for different IAM technologies. This new architecture pattern has evolved to better support more applications and more advanced use cases.




Nick Groh: Evolving Data-Driven Decision Making Beyond Identity Management

As organizations become increasingly digital, they must continue to evolve their IAM strategy to solve business challenges, support new initiatives, and incorporate data-driven decisions. In this session, Nick Groh will introduce the concept of data-driven decision making, including how artificial intelligence can help reduce the costs of decision-making. The session will also cover mobile trends and other sources of data to leverage, with a focus on applications to identity management. It will look at how IGA has mature use cases but needs to be applied more broadly. Finally, there will be a discussion of how these applications extend beyond identity management into other areas of security, and how the business can incorporate identity data.




Alpha Barry: The Value of Identity Governance and Administration in Non-Regulated Companies

While properly defined and tool-supported identity and access governance (IGA) is prevalent in regulated industries to ensure compliance, it is still fairly uncommon in mid-sized or even larger companies in non-regulated industry sectors. This has not been a problem in the past, when classical, data-center based IT infrastructure was dominant. Mr. Barry will point out why a lack of IGA can become a major issue when introducing hybrid or cloud-based IT infrastructure, and will explain why tool-based IGA can even add long term value in automating the administration of a hybrid infrastructure environment.




Darran Rolls: Standing on the Beach, Looking at the Sea: Identity Governance & Administration, Today, Tomorrow and Sometime Later

In this session, Mr. Darran Rolls will provide a unique perspective on the emergence, growth and future advancement of IGA technology. He provides an assessment of where we stand today with existing solutions and deployment approaches, and highlights where the industry needs to focus regarding program oversight, cross-system orchestration and integration with cloud and DevOps processes.



David Black: The Use of Real World Identities in Support of Identity and Access Management




In an Age of Digital Transformation Managing Vendor and Partner Identity Is Critical

Organizations have been managing the identity and access of employees for many years to protect data and the overall security of the enterprise. However, the onset of digital transformation has driven a need for faster, cost-effective innovation and with it the increased utilization of third-party resources. Consequently, organizations have a greater need to manage third-party access to data, systems, and facilities. This includes contractors and vendors, but also partners, affiliates, volunteers, and even service accounts and bots. Modern organizations are much more collaborative and open structures than those of even just a few years ago, and they continue to change.




Global ID

Partner Spotlight: Introducing Developer First, Instant Card Issuance from Apto

Greg Kidd is the co-founder and CEO of GlobaliD and the founding partner of Hard Yaka, an investment firm focused on fintech ecosystems. His commitment to innovation and democratization in banking and payments has led to his involvement in other projects such as Apto, a Y-Combinator alum focused on developer-first card issuance he co-founded with CEO Meg Nakamura.

In 2014, after several years working for a regulatory and compliance advisory company, I co-founded Apto with my colleague and friend Meg Nakamura. It was apparent from my experience that the financial services landscape needed a significant upgrade: the industry was dependent on outdated legacy infrastructure and entangled in a web of opaque regulatory requirements, making it difficult for companies to build and launch innovative, bespoke card programs designed for their users.

The system was set up to serve only the most highly-resourced, established players in the market.

Since those foundational experiences in finance, we’ve been passionate about creating a payments ecosystem that is more fair and equitable, allowing the most aspirational companies to enter the market quickly and responsibly. To this end, as we built Apto, we prioritized working with partners who are equally enthusiastic about creating user-first experiences in the financial services sector. Since its Y Combinator days, when Apto was known as Shift Payments, we have grown to support several of the fastest-growing fintech companies in the U.S., and we expanded our business to Europe in 2019. After building cutting-edge card programs for fintech innovators like Venmo and Coinbase, we’re expanding our purview to make card programs truly accessible for all.

I’m thrilled to announce that in the coming weeks, Apto will be taking an exciting and important step forward in democratizing fintech by launching an instant card issuance portal for developers at companies of any size.

Launching a card program has historically been a long, painful process. Our goal in designing this program is to remove barriers to entry and shield you from the technical complexities, making it fast and easy to design and launch card programs in minutes.

Read Greg’s full post over at Apto
Sign up for Apto’s waitlist

You might also like:

Meet the Team — Erik Westra, head of GlobaliD Labs
GlobaliD App: Introducing SEPA and crypto transfers to your Wallet
Why “developer-first” matters

Partner Spotlight: Introducing Developer First, Instant Card Issuance from Apto was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Mythics Blog

Oracle Expands Government Cloud with National Security Regions for IC and DoD

Oracle has announced the expansion of the Oracle Government Cloud with National Security Regions for US Intelligence…



KuppingerCole

Fudo PAM by Fudo Security

by Paul Fisher

Fudo Security’s PAM solution is the company’s primary product in the expanding PAM market. In the last few years PAM has evolved into a set of targeted technologies that addresses some of the most urgent areas of business security in a period of rapid technological change. Digital transformation, Cloud, and Hybrid IT environments are creating new demands and innovative PAM solutions are emerging to meet these challenges.


PingTalk

Keep Me Safe, Make Me Happy Pt 3

In Part 1 of this series, we started our journey of understanding why customers want both security and great customer experiences by examining how customer expectations have changed in the past few years. Customers want more than just great digital experiences: they also expect companies to protect their privacy and security and insist on true security for their digital identities.

 

Today we’ll conclude this series by explaining how frictionless and enjoyable customer experiences can only be achieved by addressing security first.

 


MATTR

Introducing the MATTR Platform

Here at MATTR, we have been hard at work building a suite of products to serve the next generation of digital trust. We’ve designed our products based on a few key principles: extensible data formats, secure authentication protocols, a rigorous semantic data model, industry-standard cryptography, and the use of drivers and extensions to allow modular and configurable use of the platform over time. By combining our core capabilities with extensions and drivers, our platform offers developers convenience without compromising flexibility or choice.

The MATTR Platform delivers digital trust in a scalable manner. Our belief is that a modular security architecture is one which can work across many different contexts. When it comes to trust, context is everything, and we know our users each have their own unique requirements and expectations when it comes to their digital interactions.

We provide flexible and configurable building blocks for trust on the web in order to create a digital ecosystem that can support global scale.

The platform consists of three main components:

Platform Core
Platform Extensions
Platform Drivers

Our platform provides the capabilities needed for digital trust through a set of modular and flexible building blocks known as our Platform Core. This includes the ability to establish and use DIDs, sign and encrypt messages, manage the verifiable credentials lifecycle, and share privacy-preserving verifiable presentations. Platform Core is designed as a set of simple APIs that are available for all of our users, with operational tools and documentation.
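
For illustration, a single Platform Core call can be as small as the sketch below; the endpoint path and response shape are assumptions for this example rather than documented API details.

// Hypothetical: resolve a DID through a Platform Core endpoint.
// TENANT and TOKEN as configured for your own tenant.
async function resolveDid(did) {
  const res = await fetch(`${TENANT}/v1/dids/${encodeURIComponent(did)}`, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  const { didDocument } = await res.json();
  return didDocument;
}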

We’ve designed the platform to have cryptographic agility and flexibility built in at a fundamental level. Platform Drivers are pre-configured integrations that allow our capabilities to be pluggable and extensible over time, preventing vendor lock-in and enabling user choice. They identify key areas where flexibility, choice, and optionality are desirable and surface them to the user to promote more resilient security architectures for the future. They are typically surfaced to the user as pluggable parameters in our Platform Core.

Extensibility is a key component of our platform architecture. Platform Extensions are higher level capabilities that plug in to our platform, providing convenient and easy-to-access application logic, such as service orchestration and workflow. They are built on top of our Platform Core, allowing users to smoothly onboard and extend our platform as well as enabling MATTR’s digital trust infrastructure to integrate with digital services and protocols that exist outside of our products. They are modular components in terms of logic and configuration, operating independently of Platform Core as an extensible set of APIs.

Finally, we offer a growing number of Developer Tools to simplify the user experience by providing additional interfaces and ways to interact with our platform. These tools are free and mostly optional to use, though they do simplify setting up the infrastructure needed to get started experimenting with the platform. Some tools, such as certain features exposed by MATTR’s Mobile Wallet, may be required in order to use particular platform features. Our Developer Tools are designed to work natively with Platform Core as well as our Platform Extensions.

Over the past 6 months, we have been working in close collaboration with a number of preview customers to create a great developer experience and identify features that are important for a wide variety of use cases. We’ve been working with partners from industry and government to make sure we’ve built a solution for the problems that matter to you.

Check out MATTR Learn to find out more about our platform, view our API documentation, and follow our tutorials to start using the platform today.

The article Introducing the MATTR Platform appeared first on MATTR.

Wednesday, 16. September 2020

One World Identity

Socure: Fighting the Uptick in Identity Fraud

Socure Senior Counsel & Privacy Lead Annie Bai joins State of Identity to discuss why an uptick in identity fraud usually follows economic downturns, the new types of fraud proliferating in the COVID-19 era, and how financial institutions and fintechs can protect themselves.



KuppingerCole

Protecting Access to Sensitive Data – with Data Access Governance and Identity Governance

To meet the aforementioned regulations in time, it is necessary to identify and classify sensitive data, regardless of where it resides. Before a cloud migration, you must understand the criticality of your data and define which information can be moved to the cloud, which cannot, and how such information must be protected. Security can be increased, among other things, by proactively monitoring unauthorized and potentially malicious access. The results of this work should be presented in integrated form in an identity management (IDM) system.

In this webinar you will learn:

How to find out who has direct and indirect access to business-critical applications and data
How this information gets into the IDM system and how you can use it
Which methods you can use to track down sensitive and unstructured data in your company
Which information from the IDM system helps you do this
How to comply with legal data security requirements at the same time

KuppingerCole Principal Analyst Martin Kuppinger will explain why Data Access Governance and Identity Governance are excellent tools for protecting sensitive data, and why they are also well suited to implementing compliance requirements.

In the second part, Klaus Hild, Senior System Sales Engineer at Sailpoint, will show in a live demo how sensitive d




Nyheder fra WAYF

RMC is a new user organisation in WAYF

Today Rytmisk Musikkonservatorium (RMC) joined WAYF as a user organisation. Its students and staff can now identify themselves as RMC users to the many web services in WAYF and eduGAIN that are relevant to research and education.

IDnow

IDnow welcomes agreement on a transition period for online gambling law in Germany

Munich / September 16th, 2020 – IDnow, a leading identity verification specialist, welcomes the agreement by Germany’s 16 states on a transition period incorporating the new and upcoming legal framework by October 15th. This transition clarifies existing and new mandates, including, importantly, age verification requirements for online gambling. The new regime officially enters into effect in July of next year.

IDnow, whose mission is to make the connected world a safer place, is pleased about this decision among the German states. The company looks forward to supporting operators in order to provide greater safety for users. For IDnow, this ensures better protection for customers under German law as well as clearer rules regarding responsible gambling.

“This is a big step for Germany’s online gambling industry, and again, it shows the importance of eKYC methods and their need to evolve. In a world that becomes more and more digital by the day, we need to stay vigilant and constantly adapt our security requirements,” says Rayissa Armata, Head of Regulatory Affairs at IDnow. “IDnow strives to contribute to responsible corporate citizenship, ensuring that the social responsibility for this industry can be achieved effectively through innovative methods. This is the strong desire of the federal and state governments within Germany, and IDnow has been – and will continue to be – an active supporter of those efforts,” she adds.

“We have developed our products together with our clients – the biggest players in the market – to perfectly meet their needs. IDnow offers AML-compliant video verification, but also an approved automated verification solution. Gambling operators can choose, depending on their security needs, which of those multiple solutions they want to use. All of them fulfill mandatory age verification requirements,” says Oliver Obitayo, CSO at IDnow. “To us, it is more than important that our clients can offer safe service platforms to customers so underage use can be prevented,” he adds.

After a period of uncertainty, each of Germany’s 16 states has agreed to a transition period for online gambling in Germany before the official start date of July 1, 2021. The transition period rests on a shared agreement not to punish gambling companies that conform to the new law before it officially goes into effect next year.

In January 2020, the German states agreed on an amendment to the State Treaty on Gambling (Glücksspielstaatsvertrag) that also comes into effect on July 1, 2021. This date was partly put into jeopardy, as was the process to issue sports betting licenses, in the first half of this year when matters came to a halt due to a court injunction sought by an Austrian gambling operator.

In a welcome development last week, Germany’s 16 states agreed to a transition period for gambling activities. This will allow operators to offer casino-style gaming and poker as long as such activities are fully compliant with the draft for the new Glücksspielstaatsvertrag. Online gambling operators must meet all licensing requirements by an October 15 deadline. This will include an Age Verification (AV) solution for online operators.

Under the new and upcoming regulatory regime, the following changes will be implemented:

It will be possible for German players to play online casino games and online poker under strict regulations
A wagering limit of €1 will be placed per spin for slot games
The number of licenses for table games will be limited to the number of physical locations for casinos in each state
A €1,000 deposit limit will now be implemented across all online gambling verticals in Germany.

With the recent acquisition of Wirecard Communication Services into the IDnow Group, the Munich-based company has created additional capacity and possibilities for more flexibility, in order to be able to adequately support its customers in every situation.


IDnow announces Bettina Pauck as new COO

Identity Verification Provider from Munich welcomes Bettina Pauck as Manager of the Operations Division.

 

Munich – September 4th, 2020 – IDnow, a leading provider of Identity Verification-as-a-Service solutions, welcomes Bettina Pauck as Chief Operations Officer to its management team. She will head the Operations division at the Munich site as well as at the Leipzig site, which becomes part of the IDnow Group as of September. IDnow took over this new subsidiary as part of the acquisition of Wirecard Communication Services.

Following the successful takeover of Wirecard Communication Services GmbH at the beginning of this week, IDnow announces the appointment of Bettina Pauck as Chief Operations Officer to the management team of the Munich-based identity verification provider. Within the scope of the acquisition, she has already assisted in the valuation of the company. Her main task in the coming months will be to integrate the former Wirecard Communication Services GmbH – now IDnow Services GmbH – into the existing processes and to optimally position the business unit for the foreseeable growth.

For the last 12 years, Bettina Pauck has worked through her own company as a consultant for companies like N26, reBuy or Axel Springer, optimizing their customer operations. She has worked in customer service since 2004 and now brings to IDnow her many years of experience in strategic, tactical and operational customer management, as well as in the design and management of service structures.

“Identity verification is at the heart of many industries, but most of all it ensures the security and sense of security of many customers. Since the customers are the focus of all my activities, I am particularly pleased to be able to make a real difference for the customers – and for IDnow – in this central function,” says Bettina Pauck. “IDnow is a company with a strong vision and my goal is to take the Operations division to a new level together with the outstanding team,” she adds.

“I am pleased that we were able to win Bettina Pauck as COO for IDnow. She will play a central role in the scaling of our operations area, also and especially in the context of the recent strong increase in demand. With the acquisition of Wirecard Communication Services, we have created additional capacity and infrastructure in order to further improve our range and service quality,” says Andreas Bodczek, CEO of IDnow.


KuppingerCole

Oct 29, 2020: What’s Really Going on in Your Microsoft Active Directory and Azure AD Infrastructure

Most small and mid-sized businesses rely on Microsoft technology in their IT infrastructure. For the vast majority of larger organizations, solutions such as Microsoft Active Directory also form a vital part of their IT infrastructure. Understanding what is going on in these infrastructures is thus essential. Only then will organizations be able to react quickly and in a focused manner.

Ontology

Ontology Weekly Report (September 8–15)

This week we have exciting news: Wing, the credit-based cross-chain DeFi platform built on the Ontology blockchain, has opened its genesis pool, allowing ONT and other digital assets to be deposited in its Flash Pool. Generous incentives in WING tokens will soon be released for mining in the genesis pool. Users can use either a Cyano Wallet or ONTO Wallet to participate in Flash Pool, which has a collateral rate significantly lower than that of similar platforms.

Back-end

- Completed 50% of Ontology GraphQL interface development

- The Rust Wasm contract development hub released ontio-std v0.3

Product Development

ONTO

- ONTO v3.3.0 released

- Supported logging into Wing from ONTO and depositing ONT on Wing; ONT deposited from ONTO accounts for 55% of the total amount

- ONTO new users and daily active users increased over 500% from last month

dApp

- 80 dApps now live on Ontology

- 6,087,908 dApp-related transactions since genesis block

- 22,221 dApp-related transactions in the past week

Bounty Program

- 1 new application for the Technical Documentation Translation

Community Growth

- We onboarded 230 new members across Ontology’s Spanish, Korean, and German communities.

Newly Released

- Starting from September 8, ONT can be deposited in Wing Flash Pool. Flash Pool will start releasing WING tokens as incentives for mining in the genesis pool. Users can participate in Flash Pool by using Cyano Wallet or ONTO Wallet.

- By September 11, an amount of ONT worth USD 15,000,000 had been deposited in Wing, the first credit-based, cross-chain DeFi platform built on the Ontology blockchain. Wing also has a collateral rate significantly lower than that of similar platforms. During the first phase (September 15–19) of the Wing Mining Celebration, 10 times the incentives will be released exclusively for depositing ONT.

Global Events

On September 9, Nick ZHANG, initiator of Wing, showed up at an online AMA co-organized by RealSatoshi, Winkrypto and Wing, during which he briefed blockchain enthusiasts and Wing users on their vision, mechanisms and core values. Nick believed that in the DeFi course, Wing has its defining value in its credit lending function, enabled by integrating OScore, which is an on-chain credit evaluation system. He also noted that Wing’s infrastructure demands are met by the credit-based identity and data framework exclusively provided by Ontology. He added, “The power of blockchain lies in consensus. It could be found in something as trivial as the record of a single transaction, or as big as a decentralized autonomous organization (DAO). It’s what enables us to envision a future where blockchain technology facilitates the process of globalization, and is on the way to breaking the barriers between races, national borders, assets and social classes. The debated catchphrases in the blockchain arena, such as decentralization, immutability and equitability, all check the boxes of derivative values born out of consensus built on blockchain.”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 8–15) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 15. September 2020

Global ID

The GiD Report #126 — Forget Zoom, the video revolution begins now

The GiD Report #126 — Forget Zoom, the video revolution begins now

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

The GiD Report is back after a brief hiatus. Today’s newsletter will be a bit of a quickie as I’m still getting up to speed.

What we have for you this week:

The beginning of a video revolution
Book Club preview: Marc Andreessen
The need for positive tech narratives
The Economist on digital identity
What’s going on with Epic v. Apple
The NYTimes on WeChat
Stuff happens

1. The pandemic has changed the way we work, play, and connect, accelerating brewing trends. What was once a ten-year horizon has transpired overnight, the domain of nerds and early adopters suddenly mainstream. And Zoom is one of the hottest stocks around because it’s all about video. The thing is, this is just the beginning.

Here’s a great overview from the Telegraph via /gregkidd (pdf attached):

If Wu and other start-ups get their way, Zoom, Google Meet and Microsoft Teams will one day seem like the Model T of video calling services: functional and pioneering, but primitive. As the stubborn coronavirus continues, and the likelihood of increasing reliance on screens even after it passes, investors and entrepreneurs are now going through an explosion of dedicated video apps intended to replace today’s one-size-fits-all services.

In a sense, it’s not so different from how Ayo thinks about fintech (re: vertical neobanks). One-size-fits-all is a reasonable way to start. But it’s only a matter of time before fine-tuned UX for specific target groups will rule the roost.

Photo: cottonbro
Cannon says Zoom fulfils the specific and valuable task it was designed for: virtual workplace meetings. But it was not designed for classrooms, doctors’ appointments, live concerts or hen parties, all of which it has been co-opted for in the last six months.
The most recent batch of companies at Y Combinator, the leading Silicon Valley start-up school that is seen as a barometer of new tech trends, had at least a dozen video-related services. They ranged from Together, a video chat app for grandparents and grandchildren that includes games and bedtime stories, to Zuddl, for holding and streaming large business conferences online.
Another, Rally, aims to let people hold parties online without everyone shouting over one another: individuals can “take the stage”, for example when delivering a speech, or groups can split off into “tables”, where the muffled noise of the rest of the party can still be heard in the background.
Former yoga teacher Rachel Lea Fishman founded Sutra, which lets fitness instructors host classes online, earlier this year. The service was originally designed to let instructors rent out space and host classes, but was forced to shift to video as coronavirus closed gyms.

Related:

Facebook introduces a co-viewing experience in Messenger, ‘Watch Together’ — TechCrunch
KNOW Identity Digital Forum Recap: Identity’s Role in Re-Opening the Economy — One World Identity
Via /mg — The Social Dilemma | Netflix Official Site
Zoom is killin’ it

2. Book Club preview: Marc Andreessen On Productivity, Scheduling, Reading Habits, Work, and More

It’s an interview with a16z founder Marc Andreessen (via /gregkidd). We’ll dive into this later in the week with the Book Club.

3. The last decade of tech has naturally pushed us towards a more dystopian outlook on technology, an outlook further emphasized by popular culture in TV shows like Black Mirror and books like The Circle. Given the outcomes, challenges, and unintended consequences of this last cycle of tech, the reaction feels natural. But maybe it’s time for more positive narratives to take hold and balance all the negativity.

Here’s the NYTimes tech newsletter:

Sriram Krishnan, a technology executive whom I respect, tweeted a few days ago asking for more optimistic descriptions in movies and television of people building technology. He didn’t put it quite this way, but I imagined he wanted less fiction like “The Circle,” about a surveillance-state corporate cult, and more like “Iron Man,” in which a tech nerd cobbles together a suit that saves his life and gives him superhero powers.
I get what Krishnan is saying, and there’s a bigger meaning behind it. Right now, there’s a lot of pessimism about the harm of social media, the creepiness of digital surveillance of our smartphones and our faces and the nefarious power of tech giants.
Those downers sometimes drown out the ways that we know technology has made many of our lives immeasurably better. Both “The Circle” and “Iron Man” encompass some form of reality, but it’s easy to see technology as either one or the other.

Which I believe is very apt for the work we’re trying to do here at GlobaliD. Technology shouldn’t force you to undermine your values or concede your individuality. There’s a hugely positive story to tell here and we’re writing it every day.

Speaking of which — why telling that story in the right way from day one is so important:

Zuckerberg says he regrets not explaining Facebook’s free-speech ideals from the start.
“I just wish that I’d spent more time earlier on communicating about what our principles are and what we stand for — you know, things like free expression and voice and that we’re going to defend those.”
“Now a lot of people look at us and they see this as a successful company. With a lot of money. And it’s a little hard now, I think, for us to go back and talk about our principles and have people see the principles for anything but, you know, some talking points.”
4. Along with video, the pandemic has also put a spotlight on digital identity.

Here’s The Economist:

In countries without a system of secure digital identities, the closure of bricks-and-mortar government offices and the shift of public services online have caused havoc (see article). Divorces and adoptions have run into a virtual brick wall. Italy’s system for doling out emergency payments crashed and then demanded paperwork that applicants could not obtain because government offices were shut. In America, Washington state paid $650m in unemployment insurance to fraudsters who made applications using stolen identities.
No such havoc occurred in Estonia, a tiny Baltic state where every citizen has an electronic identity. More than just an identity card, it links every Estonian’s records together. So when the government created a furlough system for workers affected by the pandemic, it already knew where they worked and how to pay them. Nobody in Estonia had to join a queue on a pavement to claim benefits, as people in other places did.

See also: Apple Pay Was Not Disruptive But Apple ID Will Be (via Luka)

Related:

U.K. digital ID tool passes big test — SecureIDNews
Phil Windley — Authentic Digital Relationships
LG CNS to Take Lead in Developing Next-generation Digital Identification Technology

5. What’s going on with Epic v. Apple:

The Information:

Apple on Tuesday countersued Epic Games, the developer of “Fortnite,” asking a judge for monetary damages in the escalating legal battle between the two companies.
Apple made its claims in a filing with a federal court in the Northern District of California on Tuesday, in which the company also rejected the legal claims Epic made in an antitrust lawsuit it filed against Apple last month.
While Apple’s counter-suit was anticipated and didn’t contain any big surprises, the company tucked a noteworthy detail into the filing, saying that Epic has earned over $600 million in revenue through iOS apps in the past. That’s a substantial figure that likely represents the success of “Fortnite,” Epic’s hit battle game.
Apple justified its demand for financial damages in part because it said Epic has deprived it of App Store commissions by allowing “Fortnite” users to bypass an Apple payment system. The company also said Epic has hurt Apple’s image through an “extensive smear campaign” and requested punitive damages for its conduct.

See also: Tim Sweeney on open platforms

6. The NYTimes talks WeChat.

Because WeChat is a really, really, really big deal:

Still, to be free she would have to delete WeChat, and she can’t do that. As the coronavirus crisis struck China, her family used it to coordinate food orders during lockdowns. She also needs a local government health code featured on the app to use public transport or enter stores.
“I want to switch to other chat apps, but there’s no way,” she said.
“If there were a real alternative I would change, but WeChat is terrible because there is no alternative. It’s too closely tied to life. For shopping, paying, for work, you have to use it,” she said. “If you jump to another app, then you are alone.”
7. Stuff happens:

Via /jvs — Digital Bank Neon Pagamentos Raises $300M Series C
India Bans 118 Chinese Apps as Indian Soldier Is Killed on Disputed Border
Inside China’s unexpected quest to protect data privacy
Cash App’s Surge During Covid-19 Pandemic Fuels Square Stock
ING and Albert Heijn to pilot online payments service that tokenizes customers’ bank account details • NFCW
PayPal has a fraud problem
Alejandro Machado: Venezuelans Look to Crypto-Dollars — CoinDesk
Top regulator pushes ahead with plan to reshape banking, sparking clash with states
Chinese Bank Disables Digital Yuan Wallet After Soft Launch Draws Wide Attention
Via /laura — Particl.io • Privacy-focused Decentralized Applications
Justice Dept. Plans to File Antitrust Charges Against Google in Coming Weeks
Kenosha militia group, not Facebook, took down its event page
Apple to Delay iOS Change Roiling Mobile Ad Market
Tech platforms hold dry runs to game out election night chaos scenarios
Briefing: Oracle Chosen as ByteDance’s Partner for TikTok
Briefing: TikTok Submits Oracle Deal Proposal to U.S. Treasury
Via /mg — Why Amazon Has A Fake Review Problem
Square Forms Group to Stop Patent Hoarding From Stifling Crypto Innovation — CoinDesk
Pandemic Will Speed Bitcoin Adoption, Says DBS Bank Economist — CoinDesk
Druckenmiller is worried about inflation, says it could hit 10% in coming years
CBDCs: Geopolitical Ramifications of a Major Digital Currency
Facebook’s Sandberg Says TikTok Ban ‘Very Unusual’ in Business History
Facebook just invented… Facebook