Last Update 11:45 PM October 09, 2024 (UTC)

Web3 | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!!

Wednesday, 09. October 2024

Sequoia

Generative AI’s Act o1

By Sonya Huang, Pat Grady, and o1
Published October 9, 2024

The Agentic Reasoning Era Begins

Two years into the Generative AI revolution, research is progressing the field from “thinking fast”—rapid-fire pre-trained responses—to “thinking slow”— reasoning at inference time. This evolution is unlocking a new cohort of agentic applications. 

On the second anniversary of our essay “Generative AI: A Creative New World,” the AI ecosystem looks very different, and we have some predictions for what’s on the horizon.

The foundation layer of the Generative AI market is stabilizing in an equilibrium with a key set of scaled players and alliances, including Microsoft/OpenAI, AWS/Anthropic, Meta and Google/DeepMind. Only scaled players with economic engines and access to vast sums of capital remain in play. While the fight is far from over (and keeps escalating in a game-theoretic fashion), the market structure itself is solidifying, and it’s clear that we will have increasingly cheap and plentiful next-token predictions.

As the LLM market structure stabilizes, the next frontier is now emerging. The focus is shifting to the development and scaling of the reasoning layer, where “System 2” thinking takes precedence. Inspired by models like AlphaGo, this layer aims to endow AI systems with deliberate reasoning, problem-solving and cognitive operations at inference time that go beyond rapid pattern matching. And new cognitive architectures and user interfaces are shaping how these reasoning capabilities are delivered to and interact with users. 

What does all of this mean for founders in the AI market? What does this mean for incumbent software companies? And where do we, as investors, see the most promising layer for returns in the Generative AI stack?

In our latest essay on the state of the Generative AI market, we’ll explore how the consolidation of the foundational LLM layer has set the stage for the race to scale these higher-order reasoning and agentic capabilities, and discuss a new generation of “killer apps” with novel cognitive architectures and user interfaces.

Strawberry Fields Forever

The most important model update of 2024 goes to OpenAI with o1, formerly known as Q* and also known as Strawberry. This is not just a reassertion of OpenAI’s rightful place atop the model quality leaderboards, but also a notable improvement on the status quo architecture. More specifically, this is the first example of a model with true general reasoning capabilities, which they’ve achieved with inference-time compute. 

What does that mean? Pre-trained models are doing next token prediction on an enormous amount of data. They rely on “training-time compute.” An emergent property of scale is basic reasoning, but this reasoning is very limited. What if you could teach a model to reason more directly? This is essentially what’s happening with Strawberry. When we say “inference-time compute” what we mean is asking the model to stop and think before giving you a response, which requires more compute at inference time (hence “inference-time compute”). The “stop and think” part is reasoning. 
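To make the "stop and think" idea concrete, here is a minimal, purely illustrative Python sketch of the difference between a fast answer and one that spends extra inference-time compute on an explicit chain of thought. The `generate` function is a hypothetical stand-in for any LLM completion call; this is not OpenAI's API or o1's actual mechanism.

```python
# Illustrative sketch only: `generate(prompt, max_tokens)` is a hypothetical LLM call.

def answer_fast(question, generate):
    # "Thinking fast": a single pass of next-token prediction, straight to an answer.
    return generate(f"Question: {question}\nAnswer:", max_tokens=64)

def answer_slow(question, generate, thinking_budget=2048):
    # "Thinking slow": spend extra tokens (inference-time compute) reasoning
    # out loud before committing to a final answer.
    scratchpad = generate(
        f"Question: {question}\nThink step by step before answering:",
        max_tokens=thinking_budget,  # the "stop and think" budget
    )
    return generate(
        f"Question: {question}\nReasoning: {scratchpad}\nFinal answer:",
        max_tokens=64,
    )
```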

AlphaGo x LLMs

So what is the model doing when it stops and thinks?

Let’s first take a quick detour to March 2016 in Seoul. One of the most seminal moments in deep learning history took place here: AlphaGo’s match against legendary Go master Lee Sedol. This wasn’t just any AI-vs-human match—it was the moment the world saw AI do more than just mimic patterns. It was thinking.

What made AlphaGo different from previous gameplay AI systems, like Deep Blue? Like LLMs, AlphaGo was first pre-trained to mimic human experts from a database of roughly 30 million moves from previous games and more from self-play. But rather than provide a knee jerk response that comes out of the pre-trained model, AlphaGo takes the time to stop and think. At inference time, the model runs a search or simulation across a wide range of potential future scenarios, scores those scenarios, and then responds with the scenario (or answer) that has the highest expected value. The more time AlphaGo is given, the better it performs. With zero inference-time compute, the model can’t beat the best human players. But as the inference time scales, AlphaGo gets better and better—until it surpasses the very best humans.
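The loop AlphaGo runs at inference time can be sketched in a few lines of Python. This is a deliberately simplified rollout-and-score search, not DeepMind's actual Monte Carlo tree search; `legal_moves`, `simulate_to_end` and `pretrained_policy` are hypothetical helpers.

```python
def choose_move(state, legal_moves, simulate_to_end, pretrained_policy, n_rollouts=100):
    """Pick the move whose simulated outcomes have the highest expected value.

    More rollouts (more inference-time compute) yield better value estimates.
    """
    best_move, best_value = None, float("-inf")
    for move in legal_moves(state):
        # Score each candidate by simulating many plausible continuations;
        # the pre-trained policy proposes moves for both players during rollouts.
        outcomes = [simulate_to_end(state, move, policy=pretrained_policy)
                    for _ in range(n_rollouts)]  # each outcome: +1 win, -1 loss
        expected_value = sum(outcomes) / len(outcomes)
        if expected_value > best_value:
            best_move, best_value = move, expected_value
    return best_move
```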

Let’s bring it back to the LLM world. What’s hard about replicating AlphaGo here is constructing the value function, or the function by which the responses are scored. If you’re playing Go, it’s more straightforward: you can simulate the game all the way to the end, see who wins, and then calculate an expected value of the next move. If you’re coding, it’s somewhat straightforward: you can test the code and see if it works. But how do you score the first draft of an essay? Or a travel itinerary? Or a summary of key terms in a long document? This is what makes reasoning hard with current methods, and it’s why Strawberry is comparatively strong on domains proximate to logic (e.g. coding, math, the sciences) and not as strong in domains that are more open-ended and unstructured (e.g. writing). 

While the actual implementation of Strawberry is a closely guarded secret, the key ideas involve reinforcement learning around the chains of thought generated by the model. Auditing the model’s chains of thought suggests that something fundamental and exciting is happening that actually resembles how humans think and reason. For example, o1 is showing the ability to backtrack when it gets stuck as an emergent property of scaling inference time. It is also showing the ability to think about problems the way a human would (e.g. visualize the points on a sphere to solve a geometry problem) and to think about problems in new ways (e.g. solving problems in programming competitions in a way that humans would not). 

And there is no shortage of new ideas to push forward inference-time compute (e.g. new ways of calculating the reward function, new ways of closing the generator/verifier gap) that research teams are working on as they try to improve the model’s reasoning capabilities. In other words, deep reinforcement learning is cool again, and it’s enabling an entire new reasoning layer.

System 1 vs System 2 Thinking

This leap from pre-trained instinctual responses (”System 1”) to deeper, deliberate reasoning (“System 2”) is the next frontier for AI. It’s not enough for models to simply know things—they need to pause, evaluate and reason through decisions in real time.

Think of pre-training as the System 1 layer. Whether a model is pre-trained on millions of moves in Go (AlphaGo) or petabytes of internet-scale text (LLMs), its job is to mimic patterns—whether that’s human gameplay or language. But mimicry, as powerful as it is, isn’t true reasoning. It can’t properly think its way through complex novel situations, especially those out of sample.

This is where System 2 thinking comes in, and it’s the focus of the latest wave of AI research. When a model “stops to think,” it isn’t just generating learned patterns or spitting out predictions based on past data. It’s generating a range of possibilities, considering potential outcomes and making a decision based on reasoning. 

For many tasks, System 1 is more than enough. As Noam Brown pointed out on our latest episode of Training Data, thinking for longer about what the capital of Bhutan is doesn’t help—you either know it or you don’t. Quick, pattern-based recall works perfectly here.

But when we look at more complex problems—like breakthroughs in mathematics or biology—quick, instinctive responses don’t cut it. These advances required deep thinking, creative problem-solving and—most importantly—time. The same is true for AI. To tackle the most challenging, meaningful problems, AI will need to evolve beyond quick in-sample responses and take its time to come up with the kind of thoughtful reasoning that defines human progress.

A New Scaling Law: The Inference Race is On

The most important insight from the o1 paper is that there’s a new scaling law in town.

Pre-training LLMs follows a well understood scaling law: the more compute and data you spend on pre-training the model, the better it performs. 

The o1 paper has opened up an entire new plane for scaling compute: the more inference-time (or “test-time”)  compute you give the model, the better it reasons.

Source: OpenAI o1 technical report
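One common way to trade test-time compute for quality, and a rough intuition for this new scaling axis, is best-of-n sampling: draw more candidate answers and keep the one a verifier scores highest. The sketch below assumes hypothetical `generate` and `score` functions and illustrates the general idea, not how o1 itself is implemented.

```python
def best_of_n(question, generate, score, n=16):
    # Scaling knob: larger n means more test-time compute and, empirically,
    # better answers -- as long as the verifier `score` is reliable for the domain.
    candidates = [generate(question) for _ in range(n)]
    return max(candidates, key=lambda answer: score(question, answer))
```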

What happens when the model can think for hours? Days? Decades? Will we solve the Riemann Hypothesis? Will we answer Asimov’s last question?

This shift will move us from a world of massive pre-training clusters toward inference clouds—environments that can scale compute dynamically based on the complexity of the task.

One Model to Rule Them All?

What happens as OpenAI, Anthropic, Google and Meta scale their reasoning layers and develop more and more powerful reasoning machines? Will we have one model to rule them all?

One hypothesis at the outset of the Generative AI market was that a single model company would become so powerful and all-encompassing that it would subsume all other applications. This prediction has been wrong so far in two ways.

First, there is plenty of competition at the model layer, with constant leapfrogging for SOTA capabilities. It’s possible that someone figures out continuous self-improvement with broad domain self play and achieves takeoff, but at the moment we have seen no evidence of this. Quite to the contrary, the model layer is a knife-fight, with price per token for GPT-4 coming down 98% since the last dev day.

Second, the models have largely failed to make it into the application layer as breakout products, with the notable exception of ChatGPT. The real world is messy. Great researchers don’t have the desire to understand the nitty gritty end-to-end workflows of every possible function in every possible vertical. It is both appealing and economically rational for them to stop at the API, and let the developer universe worry about the messiness of the real world. Good news for the application layer. 

The Messy Real World: Custom Cognitive Architectures

The way you plan and prosecute actions to reach your goals as a scientist is vastly different from how you would work as a software engineer. Moreover, it’s even different as a software engineer at different companies.

As the research labs further push the boundaries on horizontal general-purpose reasoning, we still need application or domain-specific reasoning to deliver useful AI agents. The messy real world requires significant domain and application-specific reasoning that cannot efficiently be encoded in a general model.

Enter cognitive architectures, or how your system thinks: the flow of code and model interactions that takes user input and performs actions or generates a response.

For example, in the case of Factory, each of their “droid” products has a custom cognitive architecture that mimics the way that a human thinks to solve a specific task, like reviewing pull requests or writing and executing a migration plan to update a service from one backend to another. The Factory droid will break down all of the dependencies, propose the relevant code changes, add unit tests and pull in a human to review. Then after approval, run the changes across all of the files in a dev environment and merge the code if all the tests pass. Just like how a human might do it—in a set of discrete tasks rather than one generalized, black box answer.
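As a rough illustration of what "a set of discrete tasks rather than one generalized, black box answer" can look like in code, here is a hypothetical pipeline in the spirit of the description above. The step functions and objects are placeholders, not Factory's actual implementation.

```python
def run_migration_droid(service, target_backend, llm, repo, request_human_review):
    # 1. Break down the dependencies relevant to the migration.
    deps = llm.plan(f"List dependencies of {service} affected by moving to {target_backend}")

    # 2. Propose the relevant code changes and unit tests as discrete diffs.
    diffs = [llm.propose_change(dep, target_backend) for dep in deps]
    tests = [llm.propose_tests(diff) for diff in diffs]

    # 3. Pull in a human to review before anything touches the codebase.
    if not request_human_review(diffs + tests):
        return "rejected by reviewer"

    # 4. Run the changes in a dev environment; merge only if all tests pass.
    repo.apply(diffs + tests, environment="dev")
    if repo.run_tests(environment="dev"):
        repo.merge()
        return "merged"
    return "tests failed; changes not merged"
```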

What’s Happening with Apps?

Imagine you want to start a business in AI. What layer of the stack do you target? Do you want to compete on infra? Good luck beating NVIDIA and the hyperscalers. Do you want to compete on the model? Good luck beating OpenAI and Mark Zuckerberg. Do you want to compete on apps? Good luck beating corporate IT and global systems integrators. Oh. Wait. That actually sounds pretty doable!

Foundation models are magic, but they’re also messy. Mainstream enterprises can’t deal with black boxes, hallucinations and clumsy workflows. Consumers stare at a blank prompt and don’t know what to ask. These are opportunities in the application layer. 

Two years ago, many application layer companies were derided as “just a wrapper on top of GPT-3.” Today those wrappers turn out to be one of the only sound methods to build enduring value. What began as “wrappers” have evolved into “cognitive architectures.”

Application layer AI companies are not just UIs on top of a foundation model. Far from it. They have sophisticated cognitive architectures that typically include multiple foundation models with some sort of routing mechanism on top, vector and/or graph databases for RAG, guardrails to ensure compliance, and application logic that mimics the way a human might think about reasoning through a workflow.
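A minimal sketch of that kind of cognitive architecture is below, assuming hypothetical `router`, `vector_db`, `guardrails` and model-client objects; real application-layer systems are considerably more involved, but the moving parts are the ones named in the paragraph above.

```python
def handle_request(user_input, models, router, vector_db, guardrails):
    # Route the request to the most suitable foundation model
    # (e.g. a cheap, fast model for simple queries, a reasoning-heavy one otherwise).
    model = models[router.choose(user_input)]

    # Retrieval-augmented generation: ground the model in domain documents.
    context_docs = vector_db.search(user_input, top_k=5)

    draft = model.generate(prompt=user_input, context=context_docs)

    # Guardrails: regenerate under explicit constraints if the draft violates compliance rules.
    if not guardrails.check(draft):
        draft = model.generate(prompt=user_input, context=context_docs,
                               constraints=guardrails.rules)
    return draft
```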

Service-as-a-Software

The cloud transition was software-as-a-service. Software companies became cloud service providers. This was a $350B opportunity.
Thanks to agentic reasoning, the AI transition is service-as-a-software. Software companies turn labor into software. That means the addressable market is not the software market, but the services market measured in the trillions of dollars.

What does it mean to sell work? Sierra is a good example. B2C companies put Sierra on their website to talk with customers. The job-to-be-done is to resolve a customer issue. Sierra gets paid per resolution. There is no such thing as “a seat”. You have a job to be done. Sierra does it. They get paid accordingly. 

This is the true north for many AI companies. Sierra benefits from having a graceful failure mode (escalation to a human agent). Not all companies are so lucky. An emerging pattern is to deploy as a copilot first (human-in-the-loop) and use those reps to earn the opportunity to deploy as an autopilot (no human in the loop). GitHub Copilot is a good example of this. 

A New Cohort of Agentic Applications

With Generative AI’s budding reasoning capabilities, a new class of agentic applications is starting to emerge.

What shape do these application layer companies take? Interestingly, these companies look different than their cloud predecessors: 

Cloud companies targeted the software profit pool. AI companies target the services profit pool.
Cloud companies sold software ($ / seat). AI companies sell work ($ / outcome).
Cloud companies liked to go bottoms-up, with frictionless distribution. AI companies are increasingly going top-down, with high-touch, high-trust delivery models.

We are seeing a new cohort of these agentic applications emerge across all sectors of the knowledge economy. Here are some examples.

Harvey: AI lawyer
Glean: AI work assistant
Factory: AI software engineer
Abridge: AI medical scribe
XBOW: AI pentester
Sierra: AI customer support agent

By bringing the marginal cost of delivering these services down—in line with the plummeting cost of inference—these agentic applications are expanding and creating new markets.

Take XBOW, for example. XBOW is building an AI “pentester.” A “pentest” or penetration test is a simulated cyberattack on a computer system that companies perform in order to evaluate their own security systems. Before Generative AI, companies hired pentesters only in limited circumstances (e.g. when required for compliance), because human pentesting is expensive: it’s a manual task performed by a highly skilled human. However, XBOW is now demonstrating automated pentests built on the latest reasoning LLMs that match the performance of the most highly skilled human pentesters. This multiplies the pentesting market and opens up the possibility of continuous pentesting for companies of all shapes and sizes. 

What does this mean for the SaaS universe?

Earlier this year we met with our Limited Partners. Their top question was “will the AI transition destroy your existing cloud companies?”  

We began with a strong default of “no.” The classic battle between startups and incumbents is a horse race between startups building distribution and incumbents building product. Can the young companies with cool products get to a bunch of customers before the incumbents who own the customers come up with cool products? Given that so much of the magic in AI is coming from the foundation models, our default assumption has been no—the incumbents will do just fine, because those foundation models are just as accessible to them as they are to the startup universe, and they have the preexisting advantages of data and distribution. The primary opportunity for startups is not to replace incumbent software companies—it’s to go after automatable pools of work. 

That being said, we are no longer so sure. See above re: cognitive architectures. There’s an enormous amount of engineering required to turn the raw capabilities of a model into a compelling, reliable, end-to-end business solution. What if we’re just dramatically underestimating what it means to be “AI native”?

Twenty years ago the on-prem software companies scoffed at the idea of SaaS. “What’s the big deal? We can run our own servers and deliver this stuff over the internet too!” Sure, conceptually it was simple. But what followed was a wholesale reinvention of the business. EPD went from waterfalls and PRDs to agile development and AB testing. GTM went from top-down enterprise sales and steak dinners to bottoms-up PLG and product analytics. Business models went from high ASPs and maintenance streams to high NDRs and usage-based pricing. Very few on-prem companies made the transition. 

What if AI is an analogous shift? Could the opportunity for AI be both selling work and replacing software?

With Day.ai, we have seen a glimpse of the future. Day is an AI native CRM. Systems integrators make billions of dollars configuring Salesforce to meet your needs. With nothing but access to your email and calendar and answers to a one-page questionnaire, Day automatically generates a CRM that is perfectly tailored to your business. It doesn’t have all the bells and whistles (yet), but the magic of an auto-generated CRM that remains fresh with zero human input is already causing people to switch.

The Investment Universe

Where are we spending our cycles as investors? Where is funding being deployed? Here’s our quick take. 

Infrastructure

This is the domain of hyperscalers. It’s being driven by game theoretic behavior, not microeconomics. Terrible place for venture capitalists to be. 

Models

This is the domain of hyperscalers and financial investors. Hyperscalers are trading balance sheets for income statements, investing money that’s just going to round-trip back to their cloud businesses in the form of compute revenue. Financial investors are skewed by the “wowed by science” bias. These models are super cool and these teams are incredibly impressive. Microeconomics be damned!

Developer tools and infrastructure software 

Less interesting for strategics and more interesting for venture capitalists. ~15 companies with $1Bn+ of revenue were created at this layer during the cloud transition, and we suspect the same could be true with AI. 

Apps

The most interesting layer for venture capital. ~20 application layer companies with $1Bn+ in revenue were created during the cloud transition, another ~20 were created during the mobile transition, and we suspect the same will be true here. 

Closing Thoughts

In Generative AI’s next act, we expect to see the impact of reasoning R&D ripple into the application layer. These ripples are fast and deep. Most of the cognitive architectures to date incorporate clever “unhobbling” techniques; now that these capabilities are becoming baked deeper into the models themselves, we expect that agentic applications will become much more sophisticated and robust, quickly. 

Back in the research lab, reasoning and inference-time compute will continue to be a strong theme for the foreseeable future. Now that we have a new scaling law, the next race is on. But for any given domain, it is still hard to gather real-world data and encode domain and application-specific cognitive architectures. This is again where last-mile app providers may have the upper hand in solving the diverse set of problems in the messy real world.

Thinking ahead, multi-agent systems, like Factory’s droids, may begin to proliferate as ways of modeling reasoning and social learning processes. Once we can do work, we can have teams of workers accomplishing so much more.

What we’re all eagerly awaiting is Generative AI’s Move 37, that moment when – like in AlphaGo’s second game against Lee Sedol – a general AI system surprises us with something superhuman, something that feels like independent thought. This does not mean that the AI “wakes up” (AlphaGo did not) but that we have simulated processes of perception, reasoning and action that the AI can explore in truly novel and useful ways. This may in fact be AGI, and if so it will not be a singular occurrence, it will merely be the next phase of technology.


The post Generative AI’s Act o1 appeared first on Sequoia Capital.


PIVX

Exciting News from PIVXLabs — MyPIVXWallet V2.0


Enhancements Galore, MyPIVXWallet V2.0 Unveils Exciting Updates!

PIVXLabs has been getting a lot of attention recently, especially because of their latest updates, notably the launch of MyPIVXWallet V2.0. This new version heralds a fresh era for PIVX enthusiasts, boasting a complete redesign, significant stability improvements, and numerous user experience enhancements. With v2.0, PIVX is setting a higher standard for user-friendly cryptocurrency wallets.

Some of the Updates:

Hindi Translation:
To cater to a more diverse user base, the wallet now offers a Hindi translation, making it more accessible to a wider audience and fostering inclusivity within the PIVX community.

UI Redesign:
A sleek and modern user interface redesign has been implemented, enhancing the overall aesthetic appeal and usability of the wallet. Navigating through MyPIVXWallet is now more intuitive and visually engaging.

PIVX SHIELD Improvements:
Significant enhancements have been made to the $PIVX SHIELD, bolstering security measures and ensuring that users’ assets are protected with the latest security protocols.

Better Password Privacy:
Users can now enjoy improved password privacy features, adding an extra layer of security to their accounts and further safeguarding their funds.

Removed Identicon:
The removal of Identicon adds a fresh look to user profiles, offering a cleaner and more streamlined visual experience within the wallet.

PIVX Oracle Implementation:
In a groundbreaking move, the #PIVX Oracle has been integrated, replacing #CoinGecko stats. This implementation brings real-time data and insights directly to users, empowering them with up-to-date information to make informed decisions.

With these latest updates, MyPIVXWallet V2.0 not only enhances functionality but also prioritizes user security, accessibility, and innovation. The inclusion of Hindi translation, improved UI, advanced security features, and real-time data through the #PIVX Oracle underscores PIVXLabs’ commitment to delivering a cutting-edge and user-centric cryptocurrency wallet experience.

Stay tuned for more developments as PIVXLabs continues to push boundaries and redefine the standards of cryptocurrency wallets with MyPIVXWallet V2.0! 🚀

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

Exciting News from PIVXLabs — MyPIVXWallet V2.0 was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 08. October 2024

a16z Podcast

A Big Week in Tech: NotebookLM, OpenAI’s Speech API, & Custom Audio


Last week was another big week in technology. 

Google’s NotebookLM introduced its Audio Overview feature, enabling users to create customizable podcasts in over 35 languages. OpenAI followed with their real-time speech-to-speech API, making voice integration easier for developers, while Pika’s 1.5 model made waves in the AI world.

In this episode, we chat with the a16z Consumer team—Anish Acharya, Olivia Moore, and Bryan Kim—about the rise of voice technology, the latest AI breakthroughs, and what it takes to capture attention in 2024. Anish shares why he believes this could finally be the year of voice tech.

 

Resources: 

Find Olivia on Twitter: https://x.com/omooretweets

Find Anish on Twitter: https://x.com/illscience

Find Bryan on Twitter: https://x.com/kirbyman01

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Sequoia

Partnering with Cove: The Next Gen of Assistants


Stephen, Andy, Mike and their team are building a collaborative AI thought partner.

By Jess Lee Published October 8, 2024

The rise of LLMs has undoubtedly changed the world and the way we work. With the superpowers of knowledge, summarization and synthesis at our fingertips, it’s easy to imagine a future where every person has a digital assistant. 

However, the user interface for that future has a long way to go. Today, using an AI chatbot can feel less like having an assistant, and more like visiting an oracle—someone who knows a lot about the world, but less about you. An executive assistant or chief of staff, on the other hand, can easily access your inbox, files and docs; can take detailed notes; can retain important context about your projects; and can not just tell you exactly how to do a task, but actually do it, too. LLMs are brilliant in many ways, but they still leave a lot to be desired in terms of productivity—especially for more complex tasks.

AI assistance today: Visiting an oracle and praying that your prompt works
What AI assistance should be: Let me help you with that—let’s work together

Images courtesy of VisualElectric.com

So what would a better AI assistant look like? Ideally, it would be easy to summon to the space where you’re already working, such as your inbox or browser. It would see what you see, and incorporate information from that shared context. It would take notes, write documents, populate spreadsheets. It would do the work alongside you, collaborating and iterating as your project evolves. 

That’s exactly what Cove co-founders Stephen Chau, Andy Szybalski and Mike Chu are building. Cove is an AI thought partner that helps you think brilliantly, breaking out of chat threads and into a shared visual workspace. It can follow you across your desktop, your email and your web browser, via a Chrome extension. It can summarize long articles and clip notes, ingest documents from your desktop, and find new data online. It can fill in the blanks, working alongside you to do web searches and fill in empty cells of tables. It is truly collaborative, not only making smart suggestions to complete your tasks, but actually helping you complete them.

What excites me most about Cove is the team. I had the privilege of working with Stephen, Andy and Mike as a product manager in the early days of Google Maps, and I saw firsthand their ability to create beautiful consumer products—at the cutting edge of both what’s technically possible and what’s easily usable. Stephen and Andy went on to be founding members of Uber Eats, which they helped scale to a global business, while Mike did the same with Stripe Identity. Their track record shows: If AI has a UI problem, they are a great team to fix it.

Cove co-founders Andy Szybalski, Stephen Chau and Mike Chu.

We at Sequoia are proud to partner with them and lead Cove’s seed round. With today’s announcement of general availability, the product, which is currently in beta, will no doubt evolve rapidly alongside AI capabilities. We can’t wait to see what Stephen, Andy, Mike and their team do next—and what you do and Cove do together.


The post Partnering with Cove: The Next Gen of Assistants appeared first on Sequoia Capital.


Greylock Partners

Orb Helps Companies Ship Pricing as Fast as Product

The post Orb Helps Companies Ship Pricing as Fast as Product appeared first on Greylock.

Circle Blog

Migration Guide: Bridged to Native USDC on Sui

This guide explains the differences between bridged USDC and native USDC, and shares the best practices for migrating from bridged USDC to native USDC.



Now Available: Native USDC on Sui

Native USDC on Sui is now live! We’re excited to announce that native USDC is now available on Sui Mainnet and accessible to developers and users – no bridging required. Circle Mint and Circle APIs now fully support USDC on Sui, making it easy to access USDC liquidity and benefit from Sui’s fast and secure network.


Monday, 07. October 2024

Zcash

New Release 6.0.0

zcashd 6.0.0 has been released, which will deploy Network Upgrade 6 (NU6) at the next Zcash halvening, at an activation height of 2726400. This will occur on or around November […]

Source


Epicenter Podcast

Konstantin Lomashuk: '#bitcoin is like a #memecoin' - cyber·Fund


Outside of BTC’s cypherpunk movement, the early days of crypto were more or less barren in terms of innovation. However, this presented a huge opportunity for visionaries and angel investors to either launch or back bold projects, recognizing the potential of crypto stretching far beyond than just payments. One of them was Konstantin Lomashuk, co-founder of cyber•Fund which was a key actor in bootstrapping the cybernetic economy. Apart from early investments in Ethereum, Polkadot, Cosmos, Solana and many others, Konstantin recognized the potential threats to the core ethos of decentralisation and also co-founded Lido DAO in order to stave off Ethereum’s centralization risks.

Topics covered in this episode:

Konstantin’s background
cyber·Fund & cybernetic economy
Entrepreneurship vs. angel investing
Delegating tasks and scaling businesses
Lido’s origin story
Lido DAO proposals and improvements
Solana vs. Ethereum
The evolution of Ethereum’s economy
Konstantin’s other areas of interest
AI’s risk of centralization
The impact of AI on crypto
How CyberFund invests in new narratives

Episode links:

Konstantin Lomashuk on Twitter
cyber·Fund on Twitter
P2P org

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Brian Fabian Crain.

Sunday, 06. October 2024

Scala Project

New Website & Brand Identity

IMPORTANT: We will be using our official website to share content from now on. This Medium space will not be updated anymore.

We’re beyond excited to finally share something we’ve been working on for a while — our redesigned website and brand identity! This update marks a huge milestone for us as we continue to evolve and improve our platform, ensuring that it meets the needs of our growing community.

A New Look, A New Experience

Our revamped website isn’t just about fresh visuals, though we do love how sleek it looks now. More importantly, it reflects our commitment to making Scala a user-friendly, engaging, and intuitive space for everyone. The new design not only aligns better with our vision but also makes it easier for users to find what they’re looking for.

We’ve put a lot of thought into simplifying navigation, so whether you’re a crypto miner, a developer, or simply curious about what we do, you’ll find the information you need faster and with less hassle.

Plus, we didn’t forget about SEO! Our new structure is optimized to boost our online presence, making sure Scala stands out in search results. Because let’s be honest, if you’re not visible online, are you really there?

A New Domain to Match Our Growth

To top it all off, we’re thrilled to announce our new domain: scala.network. This isn’t just a vanity update — it’s a reflection of how serious we are about building a strong, professional presence for the future. A clean and memorable domain is key for any modern project, and we believe this shift positions us exactly where we need to be.

What’s Changed?

Alongside the main site, we’ve successfully migrated all satellite sites under the Scala umbrella. We’re actively collaborating with our third-party partners to ensure their links and resources point to the new domain, keeping everything smooth for everyone.

And this is just the start. In the coming months, you can expect a lot more updates! Our new Blog section will feature fresh content regularly, focusing on crypto mobile mining, tech trends, and community updates. We’ll also sprinkle in some fun animations and goodies throughout the site to keep things lively. Stay tuned for all the cool stuff we have in store!

Shoutouts 🙌

Of course, this project wouldn’t have come to life without some serious teamwork. A huge thank you to @Le Yaube, @Boiss, and @hayzam — you guys truly crushed it! The effort and dedication that went into this project were off the charts, and we’re beyond proud of what we’ve achieved together.

We’re so excited for you all to explore the new site, and we hope you love the experience as much as we loved building it!

Cheers to the future! 🚀

New Website & Brand Identity was originally published in Scala on Medium, where people are continuing the conversation by highlighting and responding to this story.


Nym - Medium

NymVPN supercharged: WireGuard is live and speedy!


Decentralized VPN routing meets state-of-the-art encryption

All the updates for the NymVPN app in September and the integration of the WireGuard protocol for the Fast Mode.

WireGuard on NymVPN

A huge milestone has been reached for NymVPN: the WireGuard protocol is now live on the Fast Mode for most apps and operating systems, with Windows to be launched shortly.

Why is this so important? Decentralized and independent routing is absolutely crucial for privacy, but this inevitably creates latencies with most encrypted routing protocols. Every stop your traffic makes slows things down, requiring independent servers to decrypt before forwarding your data to its intended destination. Making this encrypted routing process as efficient and secure as possible is crucial, and WireGuard comes in for the save here.

WireGuard utilizes a streamlined codebase and state-of-the-art encryption protocols. While not designed specifically for decentralized routing, the Nym dev team has succeeded in adapting WireGuard to a tunnel-in-tunnel protocol for enhanced privacy on NymVPN’s Fast Mode, a 2-server network to complement the app’s 5-server Anonymous Mode.
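Conceptually, a tunnel-in-tunnel design wraps one encrypted tunnel inside another so that neither hop sees both who you are and where your traffic is going. The Python sketch below is purely conceptual layered encryption with hypothetical `encrypt` and hop objects; it is not Nym's actual WireGuard integration, nor a real WireGuard implementation.

```python
def send_tunnel_in_tunnel(packet, entry_hop, exit_hop, encrypt):
    # Inner layer: only the exit hop can decrypt this and learn the destination.
    inner = encrypt(packet, key=exit_hop.public_key)

    # Outer layer: only the entry hop can decrypt this; it learns your IP address
    # but sees only an encrypted blob addressed to the exit hop.
    outer = encrypt({"next_hop": exit_hop.address, "payload": inner},
                    key=entry_hop.public_key)

    entry_hop.send(outer)
```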

And it’s fast!

So when you need the confidence of enhanced privacy but with speeds that rival other centralized and essentially unprivate VPNs, Nym has you covered.

NymVPN in September

There are also other important updates to the NymVPN app in September:

New language localizations in Arabic and Persian
New nodes worldwide
The release of the Nym Trust Center so you know how NymVPN fundamentally differs from what other VPNs’ no-logs policies claim to provide

Read about all the new updates here.

And there’s a lot more coming very soon:

Release of the WireGuard Fast Mode on Windows
Reduced leaks (IPv6, DNS, etc.) while connected to the app
Launch of zk-nym-enabled accounts
Launch of localized Apple apps and website
Improved censorship-resistant gateways
Free NymVPN Beta testing continues

NymVPN is being built on many principles: everyone needs enhanced online privacy not provided by other VPNs on the market, but also speed for things like casual browsing without compromising on privacy.

NymVPN thus allows users to customize their level of online privacy with two modes:

The Anonymous Mode, which provides the most technologically sophisticated online privacy protection through the Nym mixnet
And the Fast Mode, which is now powered by WireGuard for less sensitive traffic

But the job isn’t done yet: Nym still needs all you dedicated Beta testers to flood the Nym network with your traffic so that the Nym team can identify all the bugs and make it work seamlessly for users. So if you haven’t tried it yet, sign up to get your free instant credential today!

And if you’re curious about how the Nym mixnet handles your traffic, check out our step-by-step guide to what happens behind the screen in the network.

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

NymVPN supercharged: WireGuard is live and speedy! was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 04. October 2024

Greylock Partners

49 of the Most Promising Fintech Startups

The post 49 of the Most Promising Fintech Startups appeared first on Greylock.

Thursday, 03. October 2024

a16z Podcast

From Swipe to Scale: How Tinder Became #1


In 1995, just 2% of couples met online. Today, that number has surged to over 50%, making online dating the top way couples connect.

In this episode, a16z General Partner Andrew Chen chats with Tinder founder Sean Rad about how he built an app that changed culture. Sean shares why seamless experiences matter, how startups often get marketing wrong, and how Tinder became the catalyst for online dating's explosive growth.

This conversation was recorded live in LA at a16z’s Games third Speedrun program. Learn more about Speedrun: https://a16z.com/games/speedrun/

Find Sean on Twitter: https://x.com/seanrad

Find Andrew on Twitter: https://x.com/andrewchen

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Wednesday, 02. October 2024

Circle Press

Circle Appointed to Board Position on U.S. Government-led Partnership to Combat Illicit Finance

The new Illicit Virtual Asset Notification (IVAN) global public-private partnership  identifies and counters criminal threats and activity in real-time


Tuesday, 01. October 2024

Zcash

ECC Transparency Report for Q1 2024

Why release a transparency report? ECC is committed to openness and transparency — as we help evolve and support the Zcash digital currency, and in support of our mission to […]

Source


Greylock Partners

Greylock-backed Resolve AI raises $35 million in seed funding to help engineers

The post Greylock-backed Resolve AI raises $35 million in seed funding to help engineers appeared first on Greylock.

Sequoia

A Startup Founder To Scaleup CEO’s Journey from $0 to $25 billion (Halliganisms)

The post A Startup Founder To Scaleup CEO’s Journey from $0 to $25 billion (Halliganisms) appeared first on Sequoia Capital.
A Startup Founder To Scaleup CEO’s Journey from $0 to $25 billion (Halliganisms)

By Brian Halligan Published October 1, 2024

In my work as an investor with Sequoia and Propeller, many CEOs pick my brain about the journey from startup founder to scaleup CEO.  It was indeed a long strange trip, but one on which I learned a lot.  Here’s a list of Halliganisms I picked up along the way that I hope will help some of you on that same journey.

Leading

Perspiration vs Inspiration: In startup mode, the CEO role is 90% perspiration and 10% inspiration. In scaleup mode, the CEO role is 10% perspiration and 90% inspiration. What worked in startup mode won’t necessarily work in scaleup mode—you’ll need to evolve.

1 back, 2 forward, 1 back, 2 forward: From the outside, HubSpot might seem like a smooth up-and-to-the-right story. It belies what actually goes on inside. The best way I’d describe HubSpot is two steps forward followed by one step back followed by two steps forward followed by one step back over and over. We never found that “silver bullet.” There was never one hire, one customer, one partner or one investor that moved us forward more than two steps. It’s a grind.

A perpetual state of constructive dissatisfaction: This is how one of my board members, Lorrie Norrington, described me once. I kind of liked it.

Manage your trust battery carefully: As a CEO, you generally start out with a fully charged trust battery. Over time you make calls—you get some right and get some wrong. If you get too many wrong in a row, you lose a lot of the charge in the trust battery with the organization and along with it some of your moral authority. My rule of thumb is that you get a few “CEO cards” to play that may be very bold bets or ones that your team disagrees with. You don’t have 52. 

Keep your head in the sky and your feet on the ground: You need to paint a compelling vision of a terrific future for your company while at the very same time dealing head on with the very real problems you have today. Toggling between those two is a delicate dance that takes some feel to get it right.  …Btw, if you give your investor the line that you “focused on the terrific future and not to worry too much about this quarter’s numbers,” prepare for them to vomit on your sneakers. [h/t Teddy Roosevelt]

I was a wartime CEO: There were times in HubSpot that were relatively peaceful and times that were more like war. I felt much more comfortable during wartime. During peacetime, decisions bubbled up from the bottom and were made more deliberately. During wartime, decisions came from the top down and were made more quickly.  Knowing whether you are in wartime or peacetime is useful. Knowing which mode you thrive in is also useful. …Imho, the h/t goes here to Ben Horowitz who coined this term, but I think Paul Graham kind of picked up the thread with “founder mode.”

Culture

EV>TV>MeV: It is critical that everyone in the organization solve for the “Enterprise Value” over their “Team’s Value” and over “Their Own Value (MeV)” when they are making decisions. I found that when I wasn’t watching closely, leaders would solve for their “team’s value” over the “enterprise’s value” and often when they solved for their team’s value, that sub-optimized another leader’s team’s value. Like many other things, this is a tax that creeps into the organization as it grows (#ScalesTax) and needs to be fought.

Your culture is your second product: At HubSpot, we have two products: one we sell to our customers (HubSpot’s CRM), and one we sell to our employees (HubSpot’s culture). Like your product, you need your culture to be unique relative to the competition (for talent) and you want your culture to be very valuable (for talent). Like your product, when your culture is unique and valuable, your company turns into a magnet that attracts and retains terrific talent. And also like your product, it’s never done—it needs continuous iteration.

Customer OR Employee OR Investor: I think every company is either customer centric (Amazon), employee centric (Nvidia), or investor centric (Berkshire Hathaway). This usually comes from the CEO. If you want to change it, you need to work hard to do that. In HubSpot’s case, our first 8 years we “talked the talk” about being customer centric, but if you looked at what we spent our time on, we were actually “employee centric.” This had its benefits, btw, and isn’t a bad way to go ( see Nvidia), but we felt we needed to walk the walk and be customer centric. Around year 8-ish we made a conscious choice to make that shift and it worked pretty well. We largely did this through more carefully managing the agenda in our management meetings, changing the incentives for our bonus (i.e. added NPS), moving money around the P&L from S&M to R&D, etc.

Keep an eye on the mercenary to missionary ratio: When you are in the early stages of your startup, almost everyone you hire is a missionary. As you scale, more and more of your employees will be mercenaries, but not all, of course.  I think you can do things to delay the shift, but it is pretty inevitable and you should live with it.  There are plenty of companies that scaled just fine before yours with a mix of missionaries and mercenaries.  #ScalesTax 

Your org will naturally become more risk averse as you scale: When you are in the early stages of your startup, almost by definition, everyone you hire is very risk seeking and is willing to be very bold to create a lot of upside.  After all, there’s very little downside to protect!  As you scale and have something to lose, the organization will protect that.  This is a natural thing that happens to existing employees whose net worth is improving – I don’t blame them.  This is a natural thing that happens to new hires when they look at your brand on their LinkedIn profile as part of the decision to join you – again, I don’t blame them.  In our case, my cofounder is preternaturally risk seeking and was always a very good counterbalance to that trend toward risk aversion. 

Cultures typically “break” around 150 employees (Dunbar’s number):  Our culture broke around this size and almost every other CEO’s company I talk to did too.  I think it is a combination of the missionary ratio, some risk aversion, the fact the founder didn’t interview everyone, and that there is a new layer of management.  Getting through that in one piece is kind of where you move from a startup to a scaleup, imho.

I tried to kill titles and org charts:  At one point, I drew on a whiteboard near my desk at HubSpot what I thought was the “influence chart” as a counter to the “organization chart.”  It looked much different and actually showed a guy named Brad Coffey as the most influential person inside HubSpot, not me.  Around the same time I tried to eliminate titles.  There was a huge amount of pushback and I decided these weren’t the hills I wanted to die on.  I suspect you’ve thought of the same thing.  If so, may the force be with you.

Team 

Times change, teams change: Our original management team at HubSpot was brilliant and I just assumed it would be the same crew we’d be in the trenches with for the whole run. That turned out not to be true.  If you step back, you’d see that we are probably on our third or fourth management team – all were good in their respective stage.  It turns out that people kind of self-select for stage. Those early leaders we had were super risk seeking and really well suited to that phase and have all gone on to do awesome stuff in that early stage. I’m super proud of them.…My advice is not to sweat it too much if you need to replace leaders on your team as you grow. The company and the leader are likely both better off.

Recruit from companies just a few years ahead of you: I’ve found this applies to board members and executives. When we hire folks from companies that are several orders of magnitude larger than HubSpot (i.e. Microsoft, Google), there is an impedance mismatch. They are dealing with different issues at a different scale. We’ve had good luck with hiring folks who are at companies we admire that are just a few steps ahead of us. 

A truly independent board member is worth her weight in gold: We have had several along the way from companies that were ahead of us in size, but still growing fast. Along the way, they helped us avoid untold landmines as they had “already seen the movie.” The other benefit of a truly independent board member is she is typically a good counterbalance with your venture capitalists. I found it helpful to have sourced the independent board member myself as opposed to having it come from the VC’s tight circle. 

Home grown talent is underrated:  I’ve noticed that the vc playbook when a new round is done is to recommend “upleveling” some of the home grown talent.  In some cases this might be right, but I think folks over index on it.  If you look at the executive teams of some of the best companies (i.e. Apple, Nvidia, Amazon, etc) they are full of people who have been there a long time.  HubSpot’s current management team has a lot of “home grown” talent.

Build your team like the 2004 Red Sox: The Red Sox went 86 years without winning the World Series. In 2004, they finally won it with a ton of home grown talent and a few seasoned veterans that put them over the top, like Pedro Martinez, Curt Schilling and David Ortiz. HubSpot often had a lot of home grown talent on its senior team and usually had a couple of well placed veterans that put them over the top.  

You can’t teach someone how to be smart: Tommy Heinsohn was a former Boston Celtic great who was an announcer for the team. I remember him talking once about a backup center for the team who was 7 feet tall and had some girth. Tommy said, “you can’t teach 7 feet.” He had a good point. That player wasn’t highly skilled, but he was huge and that was in and of itself quite valuable. I feel the same way about brain power. I used to look around the table at management team meetings and think about whether each person was empirically smarter than I was. I always wanted to be the dumbest guy in the room and largely succeeded at it.

Avoid compliments. Find complements: If I put 100 calories into getting better at something I was already kinda good at and enjoyed, I would get 1000 out. If I put 100 calories into getting better at something I was kinda bad at and didn’t enjoy, I would get 101 out. I stopped obsessing about fixing my weaknesses and started hiring folks that could plug them. Don’t hire in your own image! Hire folks who are complementary.

It was mostly product and sales: In scaleup mode, I spent most of my time on product and sales and didn’t give much attention to things like legal and finance. This was “unpopular” with folks in legal and finance and probably led to some unfortunate turnover of some good people.

Five compliments for every criticism is bull-shiitake: Management gurus will tell you that you need to give five pieces of positive feedback for every correction. If this is true, I had it all wrong and so did everyone I ever worked for.…Related to this, the feedback sandwich, the one where you surround your negative feedback with positive feedback before and after it is also bull-shiitake. Everyone knows this game—it doesn’t work anymore.…I’ll be pilloried for this, but this was my lived experience.

Jensen Huang is mostly right about management, imho: In the early days of HubSpot, I liked big, relatively infrequent meetings (monthly, not weekly) where everyone could hear what was on my mind, avoided the weekly 1:1, and I would criticize and compliment in public.  As time went on, I started doing closed weekly meetings, having 1:1s, and I stopped criticizing in public.  Listening to how Jensen runs Nvidia leads me to believe I wasn’t completely crazy to run it the way I did back then.

Analytical skills are overrated. Taste is underrated. Almost everyone these days has pretty good analytical skills. Vanishingly few people have taste. Figure out who has it and give them power.

Exec Hiring Success Rates Are Lower Than You Think:  If I look at all of our executive hires over time, I’d guess that 18 months after that senior hire happens, around 60% of them have “stuck” and we end up churning about 40%.  This is similar across most of the CEOs of companies I coach. Senior level hire candidates are very good at interviewing.  Interviewers of senior level hires, including myself, overestimate their skill in interviewing.  My advice would be to reduce your interview panel and hire the folks who have great strengths (4/4s) and maybe some weaknesses (2/4s) while avoiding the candidates who are all “good” (3/4s).  In other words, hire for strengths, rather than for a lack of weakness.  …If you whiff on one or two big hires, don’t sweat it too much.  

You need glue people to scale:  This is a term used in sports to describe players on team sports who aren’t the stars, but are the glue that makes the teams really get over the top and win.  I think it applies even better to scaleups.  As we grew, the leverage really shifted from star players to glue players who really knew how the machine worked and were “ops” wizards.  For example, in startup mode, the leverage is with your top sales reps and leaders, but the lever shifts to that quiet ops person who can turn the right knobs and dials and not break the machine.  [h/t Ravi Gupta from Sequoia]

Strategy

Watch the competition, but never follow it: I got this line from Arnoldo Hax, my strategy professor at Sloan, and repeated it so many times that it is ingrained in HubSpot’s DNA. It is relatively obvious at this point that HubSpot competes with Salesforce.com (a formidable competitor). We very carefully watched them, but tried not to “follow” them—see next lesson.

When everyone is zigging, you should zag: Regardless of what you think of Peter Thiel’s politics, he wrote a really good book on startups called Zero To One. In it, he talks about how you need to be right about something that everyone thinks you are wrong about for a long time. This type of “zagging” worked for HubSpot three times. First, we decided to focus on SMB (more M than S, btw) and stuck with it when everyone and their brother thought we should move to the enterprise. Second, we decided we would move from a marketing application company to a CRM platform company, competing with Salesforce, when everyone and their sister told us we were crazy to try because they were too hard to compete with. Third, we decided we would “build” (craft!) our CRM in-house as opposed to acquiring our way there when everyone and their cousin told us that we needed to follow ye olde CRM M&A franken-playbook.

Don’t trash talk: I recently watched the U.S. Open tennis finals. In the remarks after the matches, I always appreciate how respectful the players are toward their opponents and how they express it. I feel the same way about Salesforce; they are a very good company that is hard to compete with, and no good comes in “poking the bear.” [h/t to my co-founder Dharmesh for coming up with the “poke the bear” analogy and many other brilliant things]

Creating a category is harder than it looks: HubSpot created the “inbound marketing” category. Pulling that off involved writing about a zillion blog articles, giving a jillion speeches, writing a book, running a conference, etc. We invested way more energy in creating the inbound marketing category in the early years than we did in marketing the HubSpot product. …So, when we wanted to go into the sales category, we thought we could just re-run the same playbook for “inbound sales.” Failed. When we went into CRM, we thought we’d create a new category called CMR, “customer managed relationship” software. Failed. When we released our CMS, we thought we’d create a new category called COS, “content optimization system.” Failed. In retrospect, we caught lightning in a bottle with “inbound marketing.”

Either you are eaten by a platform or become a platform:  In the early days of HubSpot, we used to pitch the company as “Salesforce.com is to sales as HubSpot is to marketing.”  Under our breath, we’d always say “until Salesforce.com wants to become the Salesforce.com of marketing.”  Well, one day they did.  They picked up Exact Target, Pardot, Radian6 and Buddy Media all within a few months and built themselves a very large Marketing Cloud business.  We decided at that point that we ought to pivot from being a marketing app to a CRM platform ourselves, lest we be eaten.  This turned out to be a very good call in hindsight.  [h/t Steve Fradette, co-founder of Toast]

Decision Making

Compromise is the enemy of greatness: As a CEO, you are often in the middle of roiling debates about any number of things where there are really good arguments on both sides. The temptation as a first-time CEO surrounded by senior and smart folks is to make people happy on both sides of those good arguments and craft an “uninspired compromise.” This is the kiss of death. You want to actually break the tie. You want the argument at hand to have “winners and losers.”  [h/t Brad Coffey]

Wearing NO shirt!: As a new CEO with lots of money in the company’s bank account, I wanted to do all the things. I found myself saying “yes” a lot. This caused a lack of focus which led to shoddy execution and budget problems. I got some pointed feedback about this and wanted to change, so I made a literal “No” shirt and wore it to management meetings. People got the point that I wasn’t just going to rubber stamp every good idea.

If you want to kill a plant, have two people water it: If you ask two people to look after your plant while you’re away for a month, it will likely struggle due to over- or under-watering. If you ask one person to look after your plant while you’re away for a month, it will likely flourish. Whenever we assigned multiple people to own a project, it almost always went sideways. The DRI concept certainly wasn’t novel to us, but when we ignored it, we suffered.

Sometimes doing nothing is the right thing to do: I think a lot of CEOs have a “bias to action.” I think this is mostly good, but when you are short on staff and have a ton going on, it is often best to sit on your hands and let folks keep grinding on their projects and not jerk the steering wheel. I jerked the steering wheel too much.

Avoid the “Tyranny of Or”: We used to do an annual field trip with the management team to the west coast and meet with execs we respected to just learn. One year we visited George Hu. He was the former COO at Salesforce and was then at Twilio (I think). He encouraged us to challenge our teams to avoid the “Tyranny of Or,” like you can have it fast or you can have it good or you can have it cheap. I started pushing back on the “or” framing in meetings and was kind of surprised that it actually worked sometimes. Thanks George.

Nap on it: If too much food comes into my mouth, my stomach gets full. If too much information comes into my head, my brain gets full. I’m definitely someone who gets major duomo decision fatigue! To combat that, during a day where there was a lot of information flying around, I’d sneak out to our nap room (yup) and get a quick 20 minutes in. When I’d lie down, I’d be thinking about the decision at hand and when I’d wake up, often I’d have some clarity on it. I think of a nap as a time when my brain cleans up—sweeps the debris, organizes the files, etc. When things are cleaned up, I’m better able to make a call.

“Get to the coal face”: This was an expression one of my VCs, David Skok, used one day that I pretended to understand, but didn’t. I googled it later on: “The place where the actual work of an activity is done. For example, ‘Those at the coal face of the business may lose patience with theories and abstractions.’” It turns out that the bigger the organization gets, the further the CEO gets from the front line employee and the customer. It used to drive my managers crazy, but I’d often jump several levels down in the organization and try to get the coal face truth from the sales reps or support reps who were talking to customers every day. I also used to have customer panels at all of our management meetings and some of our board meetings to keep everyone grounded in what the customers were actually saying. I did a lot of stupid things at HubSpot; this was smart.

Crisis management

Never waste a good crisis: There were tons of problems and crises along the way. One of the things that served HubSpot particularly well is that we recognized the crises when they were happening and were explicit about trying to take advantage of them to learn and get better. During a crisis, we’d literally say out loud over and over, “let’s not waste this crisis.” Ironically, one of the best “crises” that happened to us was a long outage on the last day of the first quarter in 2019. It led us to make comprehensive changes to the way we built and delivered products, and helped enable us to move the culture to becoming even more customer centric. As awful as Covid was for humanity, it was that crisis that gave us cover to make some wholesale, healthy changes to our business model. Crisis = Opportunity.

When you have to eat a shit sandwich, don’t nibble: This is an idea I heard from Ruth Porat, the CFO at Google, that rang true to me. Where I got in trouble was when I spun things to employees, customers, partners, etc. People are smart and they are paying very close attention to what the CEO says. If you spin them, you’ll regret it. It’s better to take a giant bite out of that shit sandwich now than have the whole thing shoved down your throat later.

Listen to Coach K: The winningest coach in men’s college basketball was Duke’s Mike Krzyzewski. The thing that drove him crazy was when someone would miss a shot on offense and then compound it by making a dumb play right after on defense by taking a stupid risk to make up for the missed shot. You could hear him on the sideline yelling (imploring) after every missed shot “Next play… Next play.” I used this method several times following unforced errors, including my own, to try to get the company to put the past in the past. I think it mostly worked. Thanks, Coach K.

Improving Your CEO Craft

Feedback is the breakfast of champions: Once a year we had my co-founder, Dharmesh, do a 360 review for me that was like pure gold. His method is highly replicable. He gave an NPS survey to about 25 folks up and down the org asking two questions: (a) “Your likelihood to recommend Brian as the CEO of HubSpot” and (b) “Why?” He took the answers to those questions and put together a 20 page document for me. He found the themes in the feedback (people wrote novels!) and grouped them together with example quotes to back up the theme. For example, “Brian is good at setting and selling the vision for HubSpot” would be the theme and then he’d pick out 6 or 7 direct quotes that backed this up. …Now, not all the themes were positive like that one. The first 10 pages were my “feature” themes and the back 10 pages were my “bug” themes. I was convinced I was the world’s best CEO after page 10 and the world’s worst CEO after page 20! I shared the document with the company and board along with my own “performance plan.” I’m not sure, but I “think” it might have inspired HubSpotters to take their own improvement more seriously.

Your greatest strength turns into your greatest weakness: Starting up is doing as many jobs as possible so your company can survive. Scaling is shedding as many jobs as possible so your company can survive. [h/t Aaron Levy] Even founder mode types need to delegate, particularly on things that aren’t super core.

I benefited from CEO groups: When we were an early stage startup, I was in a CEO group. One of the CEOs was Colin Angle from iRobot – he had a whopper influence on a very young and raw version of myself. After we scaled, I joined a CEO group with a bunch of terrific public company CEOs from companies like Slack, Atlassian, Shopify, etc. Half of the value of these CEO groups was a collective “misery loves company” dynamic where we all shared our problems and all our problems rhymed. The other half of the value was getting best practices on how to solve those problems.

I benefited from a CEO coach:  At one point my board suggested I hire a coach.  Ahem.  Strongly suggested I hire a CEO coach.  This turned out to be a good idea.  He was half coach and half psychologist.  I need both!

Communicating

Get the truth telling vs cheerleading thing right-ish: I worked with folks who were always cheerleading and didn’t spend enough time grinding on the details. It drove me crazy. I tended to over-index the other way. It drove many of my people crazy. My advice is to check yourself on this from time to time and not over index one way or the other.

Transparency builds trust: We were always very transparent with employees, customers, partners and investors. When new execs would come in they would always be surprised at our level of transparency and a little uncomfortable at first.…I think that transparency built trust with all our constituencies. Filling that trust battery helped in innumerable ways.

It’s not 10,000 hours, it’s 10,000 times: In the early days of HubSpot, I remember sitting in the audience at Dreamforce listening to Marc Benioff tell the “cloud story.” I remember wondering to myself how many times Marc had told that story. It must have been at least 10,000 times. Whenever I was getting bored telling the “inbound marketing” story, I’d take comfort in the fact that it took 10,000 times to sink in. This was also true for the story of HubSpot moving from a marketing app to a CRM platform—at least 10,000 and still counting!

I lived presentation to presentation: In the early days as CEO of HubSpot, I felt like I lived from one presentation to the next. I was always working on my next presentation—an updated customer pitch, a company meeting presentation, a board meeting presentation, an Inbound conference talk, etc. I was always, always, always working on a deck.

You’re being watched 10x more closely than you think: I often hear from HubSpotters about something I said 10 years ago in a hallway conversation that I had long forgotten. It may not seem like your team pays any attention to you (it seemed that way to me), but they are very closely observing what you say and your body language. This can be a superpower or kryptonite for you.  

You need to absorb complexity and pass down clarity:  Inside the walls of HubSpot at any given time, there was anywhere from a few light wisps of fog to a full on San Francisco style fog bank socking us in.  The job of the CEO is to do a good job of “clearing that fog” and I found myself frequently in meetings where the topic was “clearing the fog on ____.”  My colleague JD Sherman used to walk into those meetings and say “the job of a leader is to absorb complexity and pass down clarity” and I think he was right about that.

Managing Yourself

Work and life never balanced for me. I’ll probably be pilloried for this, but I never had work-life balance and never really had a “real” vacation. Being a CEO was a full contact sport and I chose it over balance more often than not. The truth is, I don’t regret it. I have come up short on some of my personal objectives, but have far, far exceeded my professional objectives. My life’s been really good so far.

Work can be a lot of fun: I see a lot of CEOs these days with their teams and sit in a lot of board meetings. Honestly, there aren’t a lot of laughs! This is interesting to me because HubSpot’s management meetings and board meetings were always pretty serious, but there was also almost always a ton of levity, even and especially when things were going sideways. Some of the funniest moments of my life were work-related.  My former HubSpot colleague, JD Sherman, might be the funniest person I know and my cofounder isn’t far behind him.

Be yourself; everyone else is taken: I worked for three different CEOs prior to doing it myself. They couldn’t have been more different. I joined a CEO group with 8 other CEOs. They were all pretty different. I tried to be like other CEOs for many years. Over time as I got a bit more confident, I just tried to be myself, quirks and all. Folks didn’t seem to mind me much most of the time. [h/t Oscar Wilde]

Imposter syndrome didn’t go away: Even today as I’m typing this article, I feel major imposter syndrome. If you’ve got it, you’re not alone. If you don’t, I’m jealous.

Make a large pizza and take a slice out along the way: One thing Sequoia did in our Series D round was allow us to sell some of our common shares to them as part of the round. This turned out to be a great idea for me (and even better for Sequoia!). It “stiffened” my backbone when it came to acquisition interest and kept us focused on building a company our grandkids would be proud of. This had the added benefit of aligning our interests very well with our investors.…In retrospect, it was likely one of the worst financial decisions I’ve ever made, but I don’t regret it. The liquidity then was great (I’m typing this from the home on Cape Cod it bought me back then) and the pie was plenty big.

Exits

We didn’t get any acquisition offers: I always thought that every scaleup had multiple offers multiple times for acquisition.  Maybe they do.  We didn’t!  …The likelihood you are going to get acquired for a good price by your dream acquirer is really low and is even lower these days with the regulatory environment. Build something that you think your grandkids will be proud of decades from now.

The IPO is the starting line: The day of the IPO was one of the best days in HubSpot’s history—lots of laughter and a lot of tears as well. In the years leading up to the IPO, I repeated the line “the IPO is the starting line” hundreds of times and I can’t say for sure, but I think it kinda worked. I wanted folks focused on building something special that ultimately our grandkids would be proud of.…The other thing I did is I never talked about the stock price and would ask folks around me to stop talking about it whenever I heard those conversations. Focusing on the price will drive one bonkers—it has a lot to do with our performance, but it also has a lot to do with many things out of our control.

Going public is underrated:  Now that founders are able to get liquidity prior to going public, there is less allure in the IPO.  I get it.  What I think people underestimate is the pure joy the actual IPO gives you and your team.  The day of the IPO and the party we had the day after were among the happiest and most gratifying moments of my life.

Planning

Aligning vectors is actually magical: This is something I “borrowed” from an Elon Musk talk at a Sequoia event a few years back. He describes each employee as a vector with a strength and a direction. In most companies, these vectors are pointing all over the place. At the top of the list of jobs of the CEO of a scaleup is to hire folks with strong vectors and point them all in the same direction. Here’s how I think about it:

MSPOT–the planning doc to rule them all: We visited lots of companies to ask about planning and looked at all kinds of methodologies. We came up with our own, which we called MSPOT, and it enabled us to get our vectors aligned: Mission, Strategy, Projects, Omissions and Tracking. The thing I liked about it was it got everyone to argue about and agree on a few key priorities and how to measure them, and it put everyone’s pet rocks on the shelf until the next planning period. HubSpot’s first MSPOT from 2012 was something like this:

Use 6 Month Planning Seasons: When we were in peacetime, we benefited greatly from planning “seasons.” 

April to June: Long range strategy planning and navel gazing
July to Sept: Turn long range plan into next year’s plan
Oct to Dec: Budgets
Jan to Mar: Heads down on Q1 plan

These seasons avoided constant churning around different strategic directions and gave everyone a place to bring up their pet rocks and a place to put those pet rocks to bed. In retrospect, I think I would have done these biannually because we’d often edit the plan halfway through the year. It turns out that the 365 day earth-sun cycle doesn’t match the earth-technology cycle.

Complexity kills: Complexity creeps into the organization as it scales. It’s like gravity. My advice would be fight it tooth and nail in HR policy, compensation plans, pricing policies, etc.  #ScalesTax

The more people you have, the less you get done: This is another one of those gravity things. I used to look at our priority list every year and it got shorter as we got bigger. I learned to be okay with it—we got most of the big things right most of the time.

Stepping Away

Know when to hand over the keys: One day in 2021 I had a very bad snowmobile accident that nearly took my life. Lying in the snow badly injured, I had some time for introspection while waiting a long time for 911 to arrive. I basically decided while lying there that I didn’t want to be CEO of HubSpot anymore. I wasn’t getting the joy out of it I once did and I didn’t think the next phase of bringing HubSpot from $2 billion in revenue to $10 billion in revenue suited my skills very well. I stewed on that for a while and then when I recovered about 6 months later, I handed over the keys and became chairperson of HubSpot.

Pick your successor carefully: I ended up handing those keys to Yamini Rangan who has done a terrific job. There are a few reasons she was a good fit. First, we worked together for a while and I got to know her—we promoted her from within. Second, prior to HubSpot, she worked at Dropbox and Workday, two companies that, at the time, were “a few steps ahead of us.” Third, while I was out on medical leave, she did a great job of running the company on my behalf….I hope none of you CEOs have to go through a near death experience to decide to step down, but I might encourage some introspection on when that right time might be for you. Most wait too long, imho.

Chairpeople Don’t Drive: Once I took on the chairperson role, I talked to several other founders who have gone through similar CEO transitions and taken on chairperson roles. The one thing I heard from everyone was to “let go of the steering wheel.” Chairpersons and board members are the grandparents and the CEO is the parent. The failure case is that the chairperson has to go back in and be CEO again à la Howard Schultz at Starbucks.

Being CEO is overrated: A lot of people want to become a CEO, including me 18 years ago. I’d just tell you that it is an overrated job. You work for everyone: your customers, partners, employees and investors. It’s not the other way around. You are on call to them at all times. Be careful what you wish for!

Hope these help you on your journey.


The post A Startup Founder To Scaleup CEO’s Journey from $0 to $25billion (Halliganisms) appeared first on Sequoia Capital.


Circle Blog

Circle Responds to Abu Dhabi’s Proposed Regulatory Framework for Fiat-Referenced Tokens


Circle recently expressed strong support for the initiative by Abu Dhabi’s financial regulator to introduce a regulatory regime for the issuance of fiat-referenced tokens (FRTs). Circle commended how the Financial Services Regulatory Authority (FSRA) of the Abu Dhabi Global Market (ADGM) is engaging with the private sector through an open consultation process before finalizing the regulatory framework.


Web3 Services: September 2024 Updates


We’re excited to share two important launches this month: support for Arbitrum across our Web3 Services product line, and the launch of Compliance Engine for Programmable Wallets. Learn more below.


Greylock Partners

Introducing Resolve: An AI Production Engineer

The post Introducing Resolve: An AI Production Engineer appeared first on Greylock.

Empiria

Empe Issuer Demo Tutorial


Experience the power of verifiable data with our interactive Empeiria Issuer Demo. This guide will show you how to effortlessly claim, store, and verify Verifiable Credentials (VCs) using the Empeiria Verifiable Data Wallet.

Ready to elevate your digital experience? Let’s dive in!

What is Empe Issuer Demo?

The Empe Issuer Demo is an online platform designed for issuing Verifiable Credentials. It allows users to quickly issue example credentials like Proof of Attendance (POAP) and Proof of Purchase (POP). Users can then instantly claim these credentials through the Empe Wallet app.

Empe Issuer Demo is fully interactive and leverages the power of Empe Blockchain to showcase a variety of real-life use cases for verifiable data.

The best way to experience its full potential is in tandem with the new Empe Verifiable Data Wallet.

What is Empe Verifiable Data Wallet?

The Empe Verifiable Data Wallet (“Empe Wallet”) makes managing and sharing your data securely easy. This key part of End-to-End Verifiable Data Infrastructure (EVDI) is built on Self-Sovereign Identity (SSI), which means it empowers you with full control over your Verifiable Credentials.

With Empe Wallet, you don’t have to worry about anyone else accessing your information without your permission. It uses top-notch encryption to keep your data safe and follows standards like W3C and OpenID to ensure it works smoothly with other systems. Plus, developers can quickly create and test new ideas with the Empe DID Wallet SDK before adding it to their apps.
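To give a feel for the W3C data model mentioned above, here is a rough sketch of what a Verifiable Credential such as a Proof of Attendance might look like in TypeScript. The context, the credential type name and all field values are illustrative assumptions, not the exact schema Empeiria uses.

// Illustrative shape of a W3C Verifiable Credential, such as the POAPs issued in this demo.
// The "ProofOfAttendance" type name and all values below are hypothetical.
interface VerifiableCredential {
  "@context": string[];
  type: string[];
  issuer: string;                  // DID of the issuer, e.g. the event organizer
  issuanceDate: string;            // ISO 8601 timestamp
  credentialSubject: Record<string, unknown>;
  proof?: Record<string, unknown>; // cryptographic proof added by the issuer
}

const examplePoap: VerifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "ProofOfAttendance"],
  issuer: "did:example:event-organizer",
  issuanceDate: "2024-09-30T12:00:00Z",
  credentialSubject: {
    id: "did:example:holder",      // the wallet holder's DID
    event: "Example Conference 2024",
  },
};

The wallet’s job is to hold objects like this securely and to present them, or selected parts of them, only when you choose to.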

The latest update to the Empe Wallet brings a faster and lighter experience. It’s now 90% lighter and 3 times quicker than before, making data management smooth and efficient. The new app keeps your Verifiable Credentials safe with top-level encryption and offers an easy-to-use interface.

You can download Empe Wallet from the Apple App Store and Google Play Store now.

What you’ll need:

A computer, tablet, or other device with a web browser.
A smartphone with the Empe Wallet app installed.

Getting Started

Let’s begin by getting your Empe Wallet set up. Don’t worry. It couldn’t be easier.

1. Download and Install Empeiria Wallet

Begin by downloading and installing Empe Wallet on your smartphone. Visit the Apple App Store or Google Play Store now and follow the on-screen instructions to set up your wallet.

2. Set Up Your Wallet

Open the Empe Wallet app on your smartphone and follow the setup prompts. Create a secure PIN code, back up your recovery phrase, and complete the initial setup.

Exploring The Empe Issuer Demo Flows

Empe Issuer Demo offers a range of Verifiable Credentials (VCs) types, reflecting real-world scenarios and showcasing the practical benefits of our Verifiable Data infrastructure in everyday life. Let’s try them all out.

Proof of Attendance Anonymous (POAP)

What is it for?

POAP guarantees privacy and authenticity for event attendance, whether it’s conferences, webinars, or others, while ensuring personal data remains secure and anonymous. In real-world scenarios, the QR code for the POAP VC can be displayed physically at the event venue or shared during a web event, ensuring that only attendees can scan it. This feature significantly limits the opportunity for unauthorized access, reinforcing the integrity of the attendance verification process.

How to use it?

1. Open the Empe Issuer Demo

Open the web browser on your computer and go to https://issuer-demo.empe.io

2. Select Proof of Attendance Anonymous

Click the “Open Proof of Attendance Anonymous” button.

3. Scan the QR code

Scan the QR code using your Empe Wallet and tap “Claim Credential”. Congratulations. You’ve just claimed your first VC. You can already see it in your Empe Wallet, where it is securely stored.

Proof of Purchase Anonymous (POP)

What is it for?

POP ensures privacy and authenticity for product purchases, tickets, product interactions, and more. In real-world scenarios, a Proof of Purchase will typically be issued at the end of the purchasing process, such as after a payment is completed. It can also be linked to an official receipt, allowing it to include specific details about the transaction. This enhances its value, providing users with a secure and verifiable record of their purchases.

How to use it?

1. Open the Empe Issuer Demo

Open the web browser on your computer and go to https://issuer-demo.empe.io

2. Select Proof of Purchase Anonymous

Click the “Open Proof of Purchase Anonymous” button.

3. Scan the QR code

Scan the QR code using your Empe Wallet and tap “Claim Credential”. Congratulations. You’ve just claimed your first POP VC. You can already see it in your Empe Wallet, where it is securely stored.

In-Wallet Verifiable Credentials Offering

What is it for?

This flow enables the issuance of multiple Verifiable Credentials (VCs) using a single QR code. A real-world use case scenario might involve a government agency providing various credentials like a driver’s license, passport, or national ID. The user can select the appropriate credential using the wallet.

This flow follows the OpenID4VC standard, which facilitates issuers in presenting specific or generic credential lists in a standardized, automated manner, ensuring interoperability across different systems. This simplifies credential issuance, enhances scalability, and delivers a secure, unified process for both users and issuers, particularly in environments with various credential types. The approach is designed to be seamless, flexible, and privacy-preserving.
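For a sense of what a single QR code offering several credentials can encode, here is a sketch of an OpenID4VCI-style credential offer in TypeScript. The credential identifiers and pre-authorized code are hypothetical, and exact field names vary between draft versions of the specification, so treat this as an illustration rather than Empeiria’s actual payload.

// Hypothetical OpenID4VCI credential offer carried by a single QR code.
// More than one credential can be offered at once; the wallet lets the user choose.
const credentialOffer = {
  credential_issuer: "https://issuer-demo.empe.io",
  credential_configuration_ids: [
    "DriverLicenseCredential",   // illustrative identifiers only
    "NationalIdCredential",
  ],
  grants: {
    // Pre-authorized flow: the wallet can claim without a separate login step.
    "urn:ietf:params:oauth:grant-type:pre-authorized_code": {
      "pre-authorized_code": "example-code-123",
    },
  },
};

// The QR code typically encodes a URI that carries this object, for example:
// openid-credential-offer://?credential_offer=<URL-encoded JSON above>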

How to use it?

1. Open the Empe Issuer Demo

Open the web browser on your computer and go to https://issuer-demo.empe.io

2. Select In-Wallet Verifiable Credentials Offering

Click the “In-Wallet Verifiable Credentials Offering” button.

3. Scan the QR code

Scan the QR code using your Empe Wallet. You’ll be presented with a choice of two different Verifiable Credentials. Select one by tapping “Claim credential”. Congratulations, you’ve claimed another VC. You can already see it in your Empe Wallet, where it is securely stored.

4. Proof of Attendance with passwordless authentication using the Proof of Purchase

What is it for?

Tired of juggling countless passwords, tackling tedious CAPTCHA, and risking your private data just to log into services? Empe’s passwordless login with POP is here to revolutionize your experience!

With this feature, only users who have purchased a ticket for an event or gained access to a service (and hold it as a Verifiable Credential) will be able to log in. This approach not only ensures a secure experience but also provides seamless access for verified users.
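Conceptually, the service side of this login boils down to “verify the presented Proof of Purchase credential, then open a session.” Here is a minimal TypeScript sketch of that check; the helper functions, the trusted issuer DID and the credential type name are placeholders, not part of the Empe SDK.

// Sketch of the verifier's side of passwordless login with a Proof of Purchase VC.
// verifySignature and issueSession are stand-ins; they do not refer to a real Empe API.
interface PresentedCredential {
  type: string[];
  issuer: string;
  credentialSubject: { id: string };
  proof: unknown;
}

async function loginWithPop(presentation: PresentedCredential): Promise<string | null> {
  // 1. The holder must actually present a Proof of Purchase.
  if (!presentation.type.includes("ProofOfPurchase")) return null;

  // 2. It must come from an issuer the service trusts (e.g. the ticket seller).
  const trustedIssuers = ["did:example:ticket-seller"];
  if (!trustedIssuers.includes(presentation.issuer)) return null;

  // 3. The cryptographic proof must check out (placeholder).
  const valid = await verifySignature(presentation);
  if (!valid) return null;

  // 4. No password needed: open a session for the credential holder.
  return issueSession(presentation.credentialSubject.id);
}

// Placeholder implementations so the sketch is self-contained.
async function verifySignature(_p: PresentedCredential): Promise<boolean> { return true; }
function issueSession(holderDid: string): string { return `session-for-${holderDid}`; }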

How to use it?

1. Claim your POP

For instructions on claiming POP, see flow 2. Once complete, you should have your POP VC displayed in your Empe Wallet.

2. Open the Empe Issuer Demo

Open the web browser on your computer and go to https://issuer-demo.empe.io

3. Select Proof of Attendance with passwordless authentication using the Proof of Purchase

Click the “Proof of Attendance with passwordless authentication using the Proof of Purchase” button.

4. Scan the QR code

Scan the QR code using your Empe Wallet. Once prompted, select your previously claimed POP VC and tap the “Continue verify process” button.

5. You’re Now Logged In!

You will be instantly logged into the service on your other device and receive a new POAP VC in your wallet. No logins and passwords needed!

5. Custom Proof of Attendance with Passwordless Authentication for Issuer Access

What is it for?

This flow enables issuers to create customized Verifiable Credentials (VCs) that meet individual user needs. The form illustrates a larger process where various data points can be generated and included in the VC, showing that VCs can contain any relevant information.

How to use it?

1. Open the Empe Issuer Demo

Open the web browser on your computer and go to https://issuer-demo.empe.io

2. Select Custom Proof of Attendance with Passwordless Authentication for Issuer Access

Click the “Custom Proof of Attendance with Passwordless Authentication for Issuer Access” button.

3. Scan the QR code

Scan the QR code using your Empe Wallet. Once prompted, select the POP VC to verify and tap the “Continue verify process” button.

4. Fill in the Details

The Empe Issuer Demo will now display a simple form. Fill it in and click submit. A new QR code will be displayed.

5. Scan the new QR code

Scan the new QR code using your Empe Wallet to claim your new custom POAP VC.

6. Check the custom POAP VC details (optional)

Now you can check your new custom POAP VC details. Simply select it from the list of your in-app credentials. It will contain the details previously submitted through the Empe Issuer Demo form.

Congratulations! You’ve successfully navigated the Empeiria Issuer Demo using the Empeiria Wallet. We hope this tutorial helped you discover the power of verifiable data and how our technology can enhance your everyday digital interactions.

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io


Sequoia

Partnering with Eon: Cloud Backup Reinvented

The post Partnering with Eon: Cloud Backup Reinvented appeared first on Sequoia Capital.
Partnering with Eon: Cloud Backup Reinvented

Backup solutions are critical—and behind the times. Now Ofir, Gonen and Ron are bringing this massive market into the future.

By Shaun Maguire and Dean Meyer
Published October 1, 2024

Eon co-founders Gonen Stein, Ron Kimchi and Ofir Ehrlich.

When Ofir Ehrlich first told us about the idea that would become Eon, he warned us it might sound “incredibly boring.” It was November 2023, a time when many venture firms were focused primarily—even exclusively—on AI, and he knew the category of cloud backup was, to put it mildly, comparatively unsexy.

But not to us.

On that Friday in a bustling Tel Aviv neighborhood, Dean had invited Ofir to join what was originally planned as a one-on-one lunch, and his instinct to introduce Shaun and Ofir had been proven right. They hit it off immediately, bonding over their shared passion for cyber security born of backgrounds working with the U.S. and Israeli military. Then, the discussion turned to Ofir’s vision for a cloud-native backup autopilot—an intriguing proposition for a huge and growing market and, to Dean and Shaun, anything but dull.

While migration to the cloud isn’t new, the rate of growth is: adoption has surged since 2023 to an estimated $234 billion last year, and is expected to more than triple again by 2034. As much as 30% of that total is going to backup—a top priority for companies to ensure continuity, compliance and security. But current solutions rely largely on general-purpose snapshots, which simply duplicate files, regardless of the kind of data they contain.

That makes restoration, when it’s needed, a headache if not a nightmare: recovering a single file often requires recovering an entire snapshot. It can take weeks—a turnaround that may be unacceptable under regulatory demands, not to mention prohibitively expensive.

As the builders of Disaster Recovery and Cloud Migration Services at AWS, Ofir and co-founders Gonen Stein and Ron Kimchi have lived and breathed these problems for years, and they recognized the need for a purpose-built solution. The result is Eon, a next-generation platform that is reinventing cloud infrastructure backup and introducing cloud backup posture management (CBPM).

Eon continuously scans and maps cloud resources, then smartly classifies data across services, and sets backup policies—all automatically. It also offers global, cross-service search and file-level restoration—no more grappling with the whole haystack just to find a needle. And because Ofir, Gonen and Ron designed Eon to sit on top of existing cloud services, their solution is not just more advanced, but more affordable, as well.

Novel as Eon is, it’s no surprise coming from this team, whose decades of experience have made them legends in the Israeli tech industry. In the Israel Defense Forces, Ofir was at the top of the first class of Aram, an elite research program, and remains a well-respected mentor to many founders. Eon is his fourth company, and his second with his co-founders; their team at AWS began as the startup CloudEndure, which after its 2019 acquisition by Amazon, built the largest cloud-native migration and disaster recovery services, serving the world’s largest customers. Like Ofir, Gonen and Ron are outlier leaders who were critical to the unit’s success; Gonen defined a massively successful GTM and product strategy, vastly scaling service adoption, while Ron rose to GM and led one of the largest Amazon engineering teams in Israel.

When we met in November, Ofir planned to hold off on fundraising for a couple of months—but we weren’t the only ones excited about his idea. A week later, Sequoia and Dean were cooperating on what had quickly become Israel’s most competitive seed round, which it was Sequoia’s privilege to lead. We built a relationship with each other, as well as with Eon, and the process eventually led to another kind of partnership: in March, we happily welcomed Dean to the Sequoia team.

Though we knew then that Ofir, Gonen and Ron were special, we hadn’t yet realized just how exceptional they are. In less than a year, they have hired an off-the-charts team, including three dozen of the best engineers in Israel—and their product velocity has been remarkable. Now, as they launch out of stealth and announce these three rounds of funding, it is again our privilege to double down on their Series B and continue supporting that rapid growth. By reimagining the space they knew so well, they have created a new storage layer for the cloud and finally made backups useful. For us and for companies around the world, there’s nothing “boring” about it.

Team Eon.

The post Partnering with Eon: Cloud Backup Reinvented appeared first on Sequoia Capital.

Monday, 30. September 2024

Epicenter Podcast

Celestia: 1Gb Blocks, Rollup Interoperability & Lazy Bridging. Ismail Khoffi & Mustafa Al-Bassam


As Ethereum’s roadmap shifted to a rollup-centric approach, a plethora of rollups have been launched, to the point where L3s have been conceptualized. Celestia sits at the forefront of modularity as it aims to replace the monolithic blockchain architecture with specialized layers that are more suited for scalability and customization. As a result, Celestia strictly focuses on data availability to accommodate the recent rollup expansion, as data storage represents the largest portion of fees on L2s. The Lemongrass upgrade lays the foundation for further use cases being enabled through Celestia, mainly revolving around interoperability and zk proof enabled ‘lazy bridging’. 

Topics covered in this episode:

The vision behind Celestia
Rollup architecture
Centralised sequencers and the role of fraud proofs
Celestia’s market share
Rollkit & sovereign rollups
Governance & light clients
The importance of decentralisation: L1s vs. rollups
Celestia’s data availability capacity
Latency & Celestia block times
Interoperability & ‘lazy bridging’
The Lemongrass upgrade
On-chain economics & value accrual
Interchain accounts on Celestia
Crypto’s mainstream adoption
Future roadmap

Episode links:

Mustafa Al-Bassam on Twitter
Ismail Khoffi on Twitter
Celestia on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at - gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at - chorus.one

This episode is hosted by Brian Fabian Crain.

Friday, 27. September 2024

Nym - Medium

Dive into the deep wonders of NymVPN’s Anonymous Mode

Learn all about how the mixnet works and what it’s good for

Nym has been going on and on about mixnets for a while. But now the Nym mixnet is really live and free to test with the click of “connect” to NymVPN’s Anonymous Mode.

But how does the mixnet underneath the hood work exactly? What makes it the most technologically sophisticated solution to online privacy available today? And, of course, why would you want to use it?

Check out Nym’s new step-by-step guide to everything the mixnet does to anonymize your data and make you unlinkable to what you do online. There’s a lot to go through, and a lot to learn:

Sphinx encryption and data anonymization
Mixnet routing and data mixing
Cover traffic to anonymize the network
Timing obfuscation against AI surveillance
The role of entry and exit gateways
Data reassemblage
And return messages through SURBs

If you’re not ready just yet for that kind of deep dive, don’t worry: read about the need for the mixnet and its core privacy features through NymVPN, including the intricacies of added network noise against AI-powered surveillance.

And remember, the bigger the crowd, the more private everyone will be on the Nym network. So join the crowd today. Don’t just help Nym beta test NymVPN, help build a private internet for everyone, one noisy packet at a time.

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Dive into the deep wonders of NymVPN’s Anonymous Mode was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 24. September 2024

Circle Press

Circle’s Global Impact Report Showcases Financial Inclusion Utility of USDC


Use cases indicate how USDC and open-source innovations are transforming aid delivery, remittances, and more


Circle Blog

Welcome to the Future of Digital Asset Compliance


PIVX

PIVX 4-Week Liquidity Mining Campaign


#PIVX is thrilled to announce the return of the hummingbot 4-week liquidity mining campaign for the $PIVX/ $BTC pair on Binance
Total Reward Pool: $2000 #USD

Campaign Terms

Start date: September 17, 2024 12:00am UTC

Total reward pool*: ~US$ 2,000 for 4 weeks (PIVX 2,400 / week)

Reward token: PIVX

Eligible token pairs: PIVX/ BTC

Eligible orders: maker orders placed with spreads of 2% or lower
Exchange: Binance (Use this Hummingbot referral link to support hummingbot project!)

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

PIVX 4-Week Liquidity Mining Campaign was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.


a16z Podcast

Human Data is Key to AI: Alex Wang from Scale AI


What if the key to unlocking AI's full potential lies not just in algorithms or compute, but in data? 

In this episode, a16z General Partner David George sits down with Alex Wang, founder and CEO of Scale AI, to discuss the crucial role of "frontier data" in advancing artificial intelligence. From fueling breakthroughs with complex datasets to navigating the challenges of scaling AI models, Alex shares his insights on the current state of the industry and his forecast on the road to AGI.

 

Resources: 

Find Alex on Twitter: https://x.com/alexandr_wang

Find David on Twitter : https://x.com/DavidGeorge83

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Monday, 23. September 2024

BlueYard Capital

Infinex


Infinex is a platform that combines the streamlined UX of centralized exchanges with the security and self-custody safety of Decentralized Finance (DeFi). Users create their account using a passkey, and all assets are under their control at all times. They can then access the entire DeFi ecosystem through a simple standardized UI that feels like using Coinbase or Binance, but under the hood the funds are deposited into Aave, swapped in Uniswap, or eventually any of the other hundreds of DeFi protocols available.

This separation of concerns allows Infinex to focus on building the best user experience possible while thousands of developers create backend systems they can integrate with, so they can iterate faster than any vertically integrated centralized exchange. And because Infinex’s functionality is (almost) fully onchain, it unlocks the power of DeFi composability in a way centralized exchanges cannot, enabling emergent product features and use cases.

With traditional centralized exchanges you’re exposed to many risks, such as owners secretly stealing or misusing funds (ala FTX), or accidental risks of the developers introducing a bug that loses your money or creates downtime just when you need to trade the most (as with MTGOX). Infinex solves both of these issues. Firstly it gives you full control over your funds so it can never steal them, and secondly you as a user have the power to opt in to only the DeFi services you wish to use, keeping your funds as safe as possible even as the service expands to hundreds of offerings.

Infinex solves the risks centralized exchanges have, as well as the problems existing DeFi products face — replacing fragmented, complex user interfaces with an experience better than either existing option today.

Our Thesis

We believe much of the global financial system will eventually run on blockchain rails. Similar to Linux, it will eat away at the legacy systems; replacing centralized, opaque, inefficient middleware with fast settling, open source, decentralized software. Infinex takes us one step closer to this future by giving users the same experience they have using centralized exchanges today, but on a more open, secure, and decentralized backend.

Furthermore we believe for DeFi to really gain adoption, it must have a user experience as good as if not better than any Web 2.0 website today. This is why we invested in e.g. Privy and Kiln in the past, and why we are excited to be backing Infinex today.

Disclaimer: The information contained in this article has been prepared solely for informational purposes and is not an offer to sell or a solicitation of an offer to purchase an interest in any entity managed by BlueYard Capital (“BlueYard”). Any reference to a specific company or security does not constitute a recommendation to buy, sell, hold, or directly invest in the company or its securities. It may not be modified, reproduced, or redistributed in whole or in part without the prior written consent of BlueYard. Portfolio company information presented herein is for informational purposes only and not intended to be a guarantee of certain investment results. BlueYard does not represent that the information herein is accurate, true, or complete, makes no warranty, express or implied, regarding the information herein and shall not be liable for any losses, damages, costs, or expenses relating to its adequacy, accuracy, truth, completeness, or use. All other company, product and service names or service marks of others and their use does not imply their endorsement of, or an association with this program.


Greylock Partners

Welcome, Sophia Luo!

The post Welcome, Sophia Luo! appeared first on Greylock.



Nym - Medium

The Value of NYM: The spice powering the world’s most private network

The real world value of the token behind NymVPN

NYM is not just any other crypto token: it is incentivizing the most private digital journey you can take across the internet.

As the “spice” of Arrakis fuels connections across the vast expanse of the Dune universe, the NYM token is what powers secure, private, and unlinkable online activity through the Nym mixnet.

Dune (2021)

A token can have much more utility than mere crypto-speculation. Through innovative tokenomics, NYM offers token holders the opportunity to be a part of something new: a token with a real world value and impact. And that value is digital privacy. For millions of people all over the world, the stakes couldn’t be higher.

Accomplishing this, however, requires a community, a crowd, and crypto-confidants willing to take a risk for a new future of both crypto and online privacy.

In this new weekly series, Nym will walk you through how Nym’s complex tokenomics works. But first things first, what does a token have to do with online privacy?

The Nym mixnet: A digital privacy portal

Unfortunately, everything we do online is highly vulnerable to surveillance. With the advent of AI, the threats are exponentially growing. How we move across the web requires not only new privacy technologies, but also a new style of being online. In the spirit of the Fremen, who silently and safely glide across the desert sand, we can call our stealth digital steps the mixnet shuffle.

Dune - Sandwalk on Make a GIF

The Nym mixnet (short for mix network) was designed to take digital privacy into this new generation. Rather than a traditional VPN model — which is centralized and typically only 1-hop by default — the Nym mixnet routes users’ traffic through 5 independently operated servers across the world.

With the addition of network noise, the mixnet can also help confuse surveillance in a way that other privacy networks do not:

Anonymizing data packets
Mixing or shuffling the data packets of different users together
Introducing timing delays to combat AI-powered traffic analysis
Adding cover traffic to expand the size and anonymity of the network crowd

Imagine the mixnet as if it were a portal through which your online activity is teleported. While someone might see you enter, tracking where you exit becomes extremely difficult because of how things get scrambled in between. The result is true anonymity and privacy for everyone in whatever they do.
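To make the layering behind that portal a little more concrete, here is a toy TypeScript sketch of onion-style wrapping for a 5-hop route, with a random per-hop delay standing in for timing obfuscation. This is not Nym’s actual Sphinx packet format; the encryption function and delay range are placeholders.

// Toy sketch of layered ("onion") wrapping for a 5-hop mixnet route.
// encryptFor stands in for real Sphinx packet construction.
type Hop = { address: string; publicKey: string };

function encryptFor(hop: Hop, payload: string): string {
  // Placeholder: a real mixnet encrypts each layer to a per-hop key.
  return `enc[${hop.publicKey}](${payload})`;
}

function wrapForRoute(route: Hop[], message: string): string {
  // Wrap from the last hop inward, so the first node can only peel the outermost layer
  // and learns nothing except where to forward the still-encrypted remainder.
  return route.reduceRight((inner, hop, i) => {
    const forwardTo = i + 1 < route.length ? route[i + 1].address : "destination";
    return encryptFor(hop, JSON.stringify({ forwardTo, payload: inner }));
  }, message);
}

function randomDelayMs(): number {
  // Per-hop timing obfuscation: hold each packet a random moment before forwarding.
  return Math.floor(Math.random() * 500); // 0-500 ms is an arbitrary illustrative range
}

const route: Hop[] = [
  { address: "entry-gateway", publicKey: "pk1" },
  { address: "mix-node-1", publicKey: "pk2" },
  { address: "mix-node-2", publicKey: "pk3" },
  { address: "mix-node-3", publicKey: "pk4" },
  { address: "exit-gateway", publicKey: "pk5" },
];

console.log(wrapForRoute(route, "hello, private internet"), randomDelayMs());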

But there’s no need to simply imagine it, learn more about how the mixnet underneath NymVPN’s Anonymous Mode works step-by-step.

In the end, though, the Nym mixnet is much more than a network routing scheme: it’s a community of people around the globe who are incentivized to build a privacy network together. This is where NYM as token comes into play.

Nym tokenomics: Decentralized, permissionless, and unlinkable

The NYM token firstly incentivises operators to run the powerful Nym network. NYM tokens are distributed to nodes that provide privacy services to end users. As usage and demand increase, more value flows through the network and is distributed to operators, incentivising more to join in and serve demand.

But the token also serves a technical function that provides unique privacy features: namely, decentralization and unlinkability.

Making a decentralized network

The nodes that operate the Nym mixnet are run by independent people all over the world who care about privacy: not employees, but a community of service providers. In fact, anyone can set up a Nym node to be part of this growing community.

But this raises the question: who decides which node is or is not part of the network? One way to solve that is to have a “Directory Authority,” meaning a list of nodes held by someone somewhere. Instead, Nym uses a more decentralised method, namely a Cosmos blockchain called Nyx, to maintain this topology of nodes. This makes the network permissionless: running a node does not require the consent or administration of any central authority (e.g., Nym Technologies).

The result is a decentralized directory authority: a technologically coordinated and token-incentivized community of service providers without a center of control.

Unlinkability

Nym is decentralized not just to be ethical and equitable, but because decentralization itself helps privacy. Because the network is decentralised and permissionless, anyone can join as a node by bonding some NYM tokens to signal that they want to join. This registers them with the Mixnet smart contract.

Every ‘epoch’ (currently set to 1 hour), the mixnet contract selects a subset of those bonded nodes to be in the ‘active set’ of nodes routing user traffic. Each user packet is routed through different pathways across five hops. And because the available pathways change every epoch, it is near-impossible for an attacker to construct a malicious pathway and trace a packet from a user’s client to its destination. The result is unlinkability between sender and receiver.
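As a rough sketch of the epoch mechanism described above: every epoch, a subset of bonded nodes is drawn into the active set, and each packet gets its own fresh 5-hop path through that set. The stake-weighted selection, node names, and numbers below are illustrative assumptions, not the Mixnet smart contract’s actual selection logic (which also accounts for performance).

```python
import random

# Hypothetical bonded nodes and their stake in NYM (made-up values).
BONDED = {f"node-{i}": random.randint(1_000, 100_000) for i in range(50)}

def select_active_set(bonded: dict[str, int], size: int = 20) -> list[str]:
    """Pick this epoch's active set via stake-weighted sampling without replacement
    (a simplifying assumption; the real contract also weighs node performance)."""
    pool = dict(bonded)
    chosen = []
    for _ in range(min(size, len(pool))):
        names, weights = zip(*pool.items())
        pick = random.choices(names, weights=weights, k=1)[0]
        chosen.append(pick)
        del pool[pick]
    return chosen

def pick_path(active_set: list[str], hops: int = 5) -> list[str]:
    """Each packet gets its own fresh 5-hop path through the current active set."""
    return random.sample(active_set, hops)

active_set = select_active_set(BONDED)   # re-drawn every epoch (currently ~1 hour)
print("packet 1 path:", " -> ".join(pick_path(active_set)))
print("packet 2 path:", " -> ".join(pick_path(active_set)))
```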

In short, this protects people’s patterns of online behaviour through a unique architecture powered by the NYM token. The result of this form of tokenomics is a major, unprecedented privacy architecture: making users unlinkable to their traffic through a network that is truly permissionless and decentralized.

Incentivizing performance

Other similar privacy networks often rely solely on volunteers and may suffer from a lower quality of service in certain cases. Nym nodes, by contrast, are incentivized with NYM tokens to provide a high-performing service for clients in a way that protects the privacy of the whole network.

Performance

There are two important components to this:

- By network design, only nodes which qualify (e.g., by running the latest protocols or binaries) will be eligible to be part of the active set responsible for routing mixnet client traffic.
- The higher the performance of nodes in routing user traffic through the network, the more rewards in NYM they will be eligible to receive.

In both cases, it is NYM which provides the incentive for nodes to perform better for clients.
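As a purely illustrative sketch of that incentive (not Nym’s actual on-chain reward formula), one can imagine an epoch’s reward pool being shared only among qualifying nodes, in proportion to a measured performance score:

```python
def epoch_rewards(pool: float, nodes: dict[str, dict]) -> dict[str, float]:
    """Share an epoch's reward pool among eligible nodes in proportion to their
    measured performance (a simplified stand-in, not Nym's real reward formula)."""
    eligible = {n: v for n, v in nodes.items() if v["qualified"]}  # e.g. latest binaries
    total = sum(v["performance"] for v in eligible.values()) or 1.0
    return {n: pool * v["performance"] / total for n, v in eligible.items()}

nodes = {
    "node-a": {"qualified": True,  "performance": 0.99},
    "node-b": {"qualified": True,  "performance": 0.80},
    "node-c": {"qualified": False, "performance": 0.95},  # outdated binary: excluded
}
print(epoch_rewards(1_000.0, nodes))  # node-a earns the most; node-c earns nothing
```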

The Value of NYM: Deep dives on Nym tokenomics

Tokenomics is anything but simple: it’s a reinvention of many core economic ideas and assumptions about monetary value. But fear not, The Value of NYM series is here to help. Each week Nym will walk you through the different intricacies and levels of the tokenomics behind the world’s most private network. Here are some of the questions to look forward to learning about:

- What is the payment flow model in which the value of NYM sustains and empowers Nym mixnet operators?
- How does the NYM token allow for anonymous payments?
- How do NYM rewards for service providers get determined?
- How can crypto provide a real world utility and not just speculation?

In the meantime, here’s a taste of the different aspects of Nym tokenomics that will be covered in coming weeks.

The payment flow

Traditional VPNs operate or rent proprietary servers across the world to reroute client traffic, making them highly susceptible to government overreach and surveillance. And other existing privacy networks do not incentivize their own service providers.

To push privacy technology to its next generation, Nym has created a decentralized community of global privacy-enthusiasts who are instead rewarded for their service with NYM. This “flow” of rewards, however, is the result of an extensive Nym research and development project to make novel tokenomics for privacy a concrete and self-sustaining reality.

For instance, it requires a feedback loop between different parts:

- A pool of rewards available to be earned by node operators
- A delegation program of NYM staking in high-performing node operators
- A community-decided division of rewards for operators and stakers
- A feedback of Nym profits into the NYM rewards pool

The Nym Token Flow

So understanding the complex process of this NYM payment flow will be the first step in appreciating the tokenomics behind the Nym privacy network.

But the aim, for NymVPN users and NYM holders alike, should not be forgotten: as a utility token, the value of NYM is linked to a real world problem: providing genuine privacy online. This ultimately depends on a symbiotic relationship between users paying in NYM for anonymity, the privacy-strength of the entire Nym network, and the success of NymVPN as the first commercial app to run on it.

There are three basic principles of this flow:

- Principle of Anonymity: The more users who pay to use NymVPN with NYM, the more resilient the whole network becomes for everyone against external surveillance.
- Principle of Incentives: The more private the network is for users, the more attractive it becomes for new ones, and the more node operators are rewarded in NYM.
- Principle of Growth: The higher the demand for online privacy becomes, the more the value of NYM will grow, benefiting clients and operators alike.

Anonymous payments: ZK-Nyms

The NYM token doesn’t just incentivize node operators, it also anonymizes NymVPN subscriptions for clients.

One of the biggest privacy problems with traditional VPNs is payments: you sign up for a privacy service, but you need to use fiat payments (e.g., a credit or debit card). This can be used to easily link your identity to your subscription, and potentially even to your traffic records, through the VPN’s servers if they are breached.

NymVPN addresses this with a zero-knowledge payment system called zk-nyms. A zk-nym credential functions, paradoxically, like an anonymous passport for your digital journey. By first converting all payments into zk-nyms, users’ subscriptions are fully dissociated from their usage, and even from Nym Technologies itself!

At the portal entrance (what Nym calls an “entry gateway”), the only information available to the node operator or Nym is whether the travel document is valid. This requires no knowledge of who acquired it or how. Unlinkability thus extends all the way down for users.
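As a very loose illustration of this “valid, but anonymous” check, the toy sketch below issues a credential over a random serial number and lets a gateway verify it. It deliberately replaces the real zk-nym cryptography (zero-knowledge proofs over anonymized payments) with a simple shared-key MAC, purely to show that validation needs no identity; none of this reflects Nym’s actual scheme or code.

```python
import hashlib
import hmac
import secrets

# Toy stand-in for the issuer's signing key. Real zk-nyms rely on zero-knowledge
# proofs rather than a key shared between issuer and gateways.
ISSUER_KEY = secrets.token_bytes(32)

def issue_credential() -> tuple[bytes, bytes]:
    """Issue a credential over a random serial number. The serial carries no
    account or payment details, so later use cannot be linked to the purchase."""
    serial = secrets.token_bytes(16)
    tag = hmac.new(ISSUER_KEY, serial, hashlib.sha256).digest()
    return serial, tag

def gateway_accepts(serial: bytes, tag: bytes) -> bool:
    """The entry gateway learns exactly one bit: is this credential valid?"""
    expected = hmac.new(ISSUER_KEY, serial, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

credential = issue_credential()        # obtained after an (anonymized) payment
print(gateway_accepts(*credential))    # True: valid, with no identity attached
```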

So stay tuned for a deep dive into how zk-nyms work to protect your identity and all the work that went into developing them.

Communal governance

Nym nodes are not the property or employees of Nym, but rather stakeholders in a new privacy enterprise, and as such they should have a say in their own compensation.

A novel part of NYM tokenomics is that decisions about how NYM rewards are distributed between the different network agents (entry gateways, mixnodes, exit gateways, and Nym Tech itself) are increasingly made through communal governance.

The Nym community of operators has been active this year in their first self-governance initiatives, including votes on the distribution of rewards and minimum profit margins. So a big part of understanding Nym tokenomics will be appreciating these novel experiments in communal governance happening on the ground.

Stay tuned to dive into NYM tokenomics

However revolutionary crypto technology has been, it is definitely at a crossroads. While foundational tokens like Bitcoin continue to rise as planned, the rest fluctuate like squirrels in the middle of the road. And fiat currencies are freight trains running in every direction.

The big problem is that most tokens are connected with little but hype. Utility tokens like NYM provide a different route: don’t just speculate, be part of a project to address a real world need and power new technologies and institutions capable of meeting it.

With NymVPN and the power of NYM, we can move toward a future where digital sovereignty is within reach, and privacy is woven into the very fabric of the web.

But the tokenomics behind all this is admittedly new. So join us each week to explore a different part of it. We will dive deep into all the technical and social aspects of how the NYM utility token functions to power the world’s most private network.

In the meantime, Tweet or Telegram us with #NymTokenomics to share any questions or topics you’d like Nym to explain in more depth about NYM as a utility token.

Resources

Nym Node Operator Guide

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

The Value of NYM: The spice powering the world’s most private network was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 22. September 2024

Circle Blog

Unlocking Impact: Circle live with the UN Summit of the Future

Today, the next installment in Circle’s Unlocking Impact pitch competition series will take place at the SDG Media Zone inside the UN Headquarters. We are incredibly excited to see the next wave of entrepreneurs present their visionary solutions for using Circle’s technology to address the UN’s Sustainable Development Goals (SDGs).


Friday, 20. September 2024

Epicenter Podcast

Numerai & Predictoor: How AI Changed Crypto Prediction Markets - Trent McConaghy & Richard Craib


Prediction markets were one of the first use cases of smart contracts, yet their popularity only recently spiked, following Polymarket’s social media spread. However, while prediction markets are generally zero-sum games, the rise of AI models trained on large datasets led to AI-powered prediction feeds and hedge funds. Crypto offers a unique array of use cases as it allows data scientists to not only share their data sets and models with complete privacy, but also access decentralised computing and model training. While Predictoor employs Ocean’s data infrastructure to run AI-powered prediction bots on lower timeframes, Numerai developed its own AI hedge fund for stocks, that recently also expanded to crypto (Numerai does not trade cryptocurrencies, and Numerai’s Hedge Fund(s) have no relation to Numerai Crypto).

Topics covered in this episode:

- Predictoor & Numerai overview
- Prediction markets
- Prediction feeds
- Prediction markets in TradFi and other use cases
- Implementation of Ocean’s tech in Predictoor
- Predictions vs. Futures
- Market participants
- Numerai hedge fund
- Numerai’s trust assumptions
- The role of AI
- The evolution of AI and how it might solve market inefficiencies
- Numerai crypto and how it differs from Predictoor
- Predictoor x Numerai collaborations
- The future of prediction markets

Episode links:

- Trent McConaghy on Twitter
- Richard Craib on Twitter
- Predictoor on Twitter
- Numerai on Twitter
- Ocean Protocol on Twitter
- Oasis Protocol on Twitter

Sponsors:

- Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world’s first Decentralized Payment Network. Get started today at gnosis.io
- Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Friederike Ernst.

Thursday, 19. September 2024

Zcash

Coinbase Onramp headlines Zashi 1.2 release

ECC is excited to announce the new Zashi-Coinbase integration, offering Zashi users a faster, simpler way to buy and self-custody ZEC. This new onramp reflects our commitment to building a […]

Source


Circle Press

Circle Appoints Bradley Horowitz to Board of Directors


BOSTON, September 19, 2024 — Circle today announced Bradley Horowitz as the newest addition to its Board of Directors. Mr. Horowitz brings more than three decades of tech industry expertise to his appointment, including leadership roles at Google and Yahoo as well as experience as a co-founder and Chief Technology Officer at his video analysis startup, Virage. He is also an investor in over 150 startups, a General Partner at Wisdom Ventures and was named the #10 Seed Investor by Business Insider this year.


a16z Podcast

The Frontier of Spatial Intelligence with Fei-Fei Li


Fei-Fei Li and Justin Johnson are pioneers in AI. While the world has only recently witnessed a surge in consumer AI, our guests have long been laying the groundwork for innovations that are transforming industries today.

In this episode, a16z General Partner Martin Casado joins Fei-Fei and Justin to explore the journey from early AI winters to the rise of deep learning and the rapid expansion of multimodal AI. From foundational advancements like ImageNet to the cutting-edge realm of spatial intelligence, Fei-Fei and Justin share the breakthroughs that have shaped the AI landscape and reveal what's next for innovation at World Labs.

If you're curious about how AI is evolving beyond language models and into a new realm of 3D, generative worlds, this episode is a must-listen.


Resources: 

Learn more about World Labs: https://www.worldlabs.ai

Find Fei-Fei on Twitter: https://x.com/drfeifei

Find Justin on Twitter: https://x.com/jcjohnss

Find Martin on Twitter: https://x.com/martin_casado

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Brave Browser

When privacy-preserving advertising measurement falls short

Privacy-preserving ad measurement should prioritize user control and transparency rather than catering to third-party advertisers.

Brave has worked towards a more private Web for several years. The current third-party-based advertising ecosystem is built on the continual surveillance of users, so we’ve spent a lot of time thinking about the economy of the Web and how it can work while still maximizing user privacy, which is why we offer Brave Ads

We’re happy to see other browsers paying more attention to these issues as well. Mozilla, for example, recently announced their Privacy-Preserving Attribution (PPA) feature to track ad interactions in the browser while preserving user privacy. Mozilla’s PPA, co-designed with Meta and default-enabled in the Firefox browser, was met with a lot of criticism on launch, however. While we agree with the overall goal of the Mozilla/Meta system (i.e., privacy-preserving advertising), we disagree with the design of the system itself.

We believe that ad measurement should be transparent, simple, and limited to parties the user trusts—not random third-party advertisers.

Ad measurement should be transparent

Mozilla’s ad measurement scheme is default-enabled and — unless you’re hunting around in Settings — undiscoverable by users. This means that if you go to a page which happens to have ads on it, the browser will record that ad impression and eventually send it to the advertising measurement server. This lack of transparency is a problem, as users deserve to be in control of what information is being shared and with whom.

Mozilla’s argument is that no personal data is actually being shared in PPA (given their protections), and that a consent dialog prior to enabling PPA would have been user-hostile. We agree that consent dialogs are often bad for user privacy (which is one reason we block cookie consent banners by default), but PPA is a complicated, unproven, and experimental prototype operating over extremely sensitive user browsing data.

Users should not be guinea pigs, especially not for a system that is built for third-party advertisers and does not directly benefit users.

Ad measurement should be simple

Ad measurement in the browser is inherently privacy-sensitive because users’ website browsing activity is being tracked and sent. To preserve privacy, Mozilla’s PPA combines novel differential privacy (DP) and multi-party computation (MPC) techniques to provide arbitrary websites with ostensibly only aggregated data. Websites (unbeknownst to the user) ask the browser to generate encrypted reports that are then sent by the browser to an “aggregation service” backend, which is run by another third-party.

As mentioned above, this is all experimental, complex, and carries a fair amount of privacy risk. The more complex the system, the higher the risk of bugs and vulnerabilities. Complex systems are not only harder to audit but also increase the chances of mistakes that can compromise user privacy. The more moving parts, the higher the risk of something going wrong — this is software engineering 101.

Complexity is especially dangerous for a system that will be used for online advertising, where there is a lot of economic motivation for bad actors to try to break privacy protections.

Ad measurement shouldn’t be handed over to third parties

One of the most glaring flaws with Mozilla’s approach is that it is built to attempt to incrementally improve the traditional third-party advertising economy. In doing so, it allows third parties, who have no known relationship with the user, to measure ad performance while increasing privacy risk for the user.

We at Brave believe third-party advertising is fundamentally broken. It doesn’t need patching or tweaking — it needs to be completely overhauled. No user wants to be tracked by third-party advertisers, and the vast majority of users find third-party ads to be, at best, annoying, and at worst a dangerous vector for malware and spam. This is why ad and tracker blockers such as Brave have become a necessity for so many, and are widely considered security best practice.

Mozilla’s response to the controversy is especially worrying. In their Reddit post, they correctly note that Firefox has shipped several useful anti-tracking features over the years, and that there is an arms race happening between privacy-enhancing tools and malicious actors on the Web. But the post then doubles down on the third-party bet, and concludes that the only way to increase privacy on the Web at this point is to collude with third-party advertisers and ad-tech providers, and design systems for their use.

We strongly disagree. Brave offers an alternative to the traditional third-party-based advertising model of the Web. Brave Ads center the user, not the advertiser and definitely not ad-tech third parties, while operating on principles of transparency and simple, proven privacy techniques. The Brave browser offers best-in-class privacy and anti-tracking features for all users. We focus on protecting our users by blocking third-party ads and intrusive trackers entirely, and shipping privacy- and Web compatibility-enhancing features in the browser.

This isn’t easy. It requires constant work to ensure websites don’t break while also protecting user privacy by default. But it’s the right thing to do by our users, versus prioritizing third-party advertising and “ad-tech” over user privacy and security.

Wednesday, 18. September 2024

Nym - Medium

Who’s afraid of anonymity?

Why anonymity is so important, online and off

Let’s face it, “anonymity” has become a dirty word. It is too often associated with “hiding” illicit activities and the perpetrators behind them.

Languages: Русский // Bahasa Indonesia // українська мова // Tiếng Việt // 中文

Not only is this a false association, it also has huge consequences for our privacy whether we’re online or off.

Nym’s new series Who’s afraid of anonymity? will try to demystify anonymity and explore its fundamental relation to privacy and a safe, democratic life. But nothing is so simple, so we will also need to explore the limits and possible dangers of anonymity.

“Anonymity is not just a mask; it’s a shield.”
– Anonymous
What is anonymity?

For a moment, let’s forget all the scaremongering about anonymous criminals, shadowed faces lurking in the dark looking to hurt us. Yes, the fear of the unknown and unrecognizable is very real. But isn’t anonymity also a central aspect of human life? And when it comes to online surveillance, our many adversaries are themselves too hidden for us to even notice.

The meaning of anonymity

“Anonymity” comes to English from Ancient Greek, literally meaning the state (-ity) of having no (an-) name (-onym). But why would we want to lose our name? Isn’t this what makes us all individuals and protects our unique identities?

In some cases, yes. It can ensure that private property belongs to us, or allow us to speak authentically and be held responsible for what we say publicly. But having an identifiable name which links us to something personal is also what leaves us vulnerable to attacks and exploitation when all we really want and need is privacy.

Given the widespread surveillance and tracking of the micro-details of our lives online, this linking of our identities with what we expect to be a private activity has, unfortunately, become a commonplace industry for profit and control. But it doesn’t begin and end with the internet.

But before diving in, let’s consider a simpler example to appreciate the value of anonymity.

A game of cards

Rounders (1998)

Imagine we’re playing a game of cards. Each of us has an individual hand which is private and concealed from others until we decide to play a card. Doing so will then make the card public. In some games, we may even share a collective hand in which teammates don’t even know what cards the others hold.

And then there is the deck from which cards can be collectively drawn. A key to fair play is that each card in the deck has two sides. First, there is the face value (number and suit) with its respective benefits to individuals when held or played. And then there is the card’s back side, facing the public and anonymous to them.

The anonymity of the card’s back is essential. If something about a card in the deck could be used to indicate the face-value (different shaped or damaged cards, unique designs or colors), then some players could de-anonymize the card to gain an unfair advantage over others. This would end the collective fun, joy, and gamble of the game itself.

Of course, human life isn’t just a game. But games do reveal something about the people that invented them: there is always a play of public and private, of anonymity and deanonymization, which have real world consequences for us all.

Privacy and anonymity are certainly not the same thing, but in the contemporary world they can share a common purpose. So let’s first consider how our expectations of privacy in the real world have evolved politically and struggle to find an equal home on the web.

Privacy as a political right

Rights to privacy are enshrined in democratic constitutions all over the world, even though billions of people still do not benefit from them. Of course, political rights are contextual and jurisdictional, depending on the country or society. Yet there are particular moments in history that helped advance the growing case for privacy as a universal right.

The U.S. constitution

In the United States, the “right to privacy” comes from the 4th Amendment to its constitution, ratified in 1791:

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated…”

This has an important context. Prior to the U.S.’s independence from the colonial rule of England, Americans often had their personal belongings searched and their personal “papers” or correspondences confiscated and read by colonial authorities. These arbitrary searches were often conducted without legal justification (“warrants”) by colonial police, and tended to be politically motivated: searching for untaxed items and anti-colonial political associations, or simply indiscriminate sweeps for blackmail material.

Note: The large majority of our “papers,” and even “houses” and “effects,” are now digital assets.

The US is just one country amongst others. And yet a law now over two centuries old importantly set the foundation for a common struggle in which people all over the world continue to find themselves: ensuring access to information and free communications. In the end, the US constitution was not simply a fight to stop one authoritarian government, but also to prevent the resurrection of authoritarianism in the future.

Privacy as a human right

Privacy also became a core part of the Universal Declaration of Human Rights (UDHR) which members of the United Nations have in principle committed to protect “against such interference or attacks”:

“No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence…” (Article 17)

As the UDHR makes clear, these personal privacy protections are inseparable from other fundamental human rights, such as the “freedom of thought” (Article 18) and the “freedom of opinion and expression,” including the right to “seek, receive and impart information and ideas through any media and regardless of frontiers” (Article 19).

Privacy, in this context, is much more than a personal right: it is a democratic capacity that enables us to associate and communicate freely with others. No matter who and where we are.

Moreover, privacy is linked to shared information, but information itself is complicated. And the advent of the internet has accelerated these complications.

Online privacy

More and more of our lives are taking place online, and as they do, a tremendous amount of personal information is being made available, vulnerable to surveillance and data tracking. Even when the content of what we do or say is encrypted, our encrypted data still leaks a lot of metadata that is being systematically harvested and used under our noses.

New legal protections for online privacy have thankfully started to emerge worldwide to curb the exploitation of personal data. The biggest so far is the EU’s General Data Protection Regulation (GDPR) of 2018. The GDPR pledges to protect the data security and privacy of members of the EU, regardless of the companies that may handle the data globally. This is the beginning and not the end of the story.

In addition to defending the importance of end-to-end encryption to protect the privacy of the content of what we do and say online, the GDPR also introduces a number of new rights for “data subjects,” that is, we who use the internet. This includes, for example, the right to be informed about how our personal data will be used, and even the right to erasure or “to be forgotten,” that is, to remove our name from the digital public domain.

What’s largely left untouched by GDPR provisions, however, is what can be done with our metadata. But more on this later.

From privacy to anonymity

So far, this is a story about the evolution of privacy: from the constitutional rights of a particular country, to universal rights, to (still regional) digital rights. But what does all this have to do with anonymity?

It is certainly not necessary to be anonymous in order to be private in all aspects of life. For example, we should be able to discuss things privately in the confines of our own homes and social groups while other people in our neighborhood or society know our real names.

But there are important contexts in which genuine privacy, freedom of speech, and freedom to information require anonymity as a tool to achieve it. So let’s start exploring how.

Anonymity for a private, safe life

Anonymity is in fact a social necessity for everyone. Our rights and security, both personal and collective, often depend on it. So here is a taste of the topics that the new Nym series Who’s Afraid of Anonymity? will explore in the coming months.

Voting

When we cast a vote in an election, we not only expect that this vote will be respected and counted, but also that it will be anonymous and not publicly known. The overall result of a vote must be public, but our names and personal information should not be publicly linkable to how we voted. Identifying individual voters for a political party or candidate could lead to voter intimidation, social outcasting, or even targeted killings in authoritarian regimes. Anonymity can ensure democratic participation.

There are many techniques for anonymizing the hard ballots of voters worldwide while also preventing voter fraud. However, voting in person poses serious logistical difficulties for many people (work commitments, long distances to voting stations, or the threat of in-person intimidation) which digital voting can resolve. But data security and privacy are necessary to do this properly. In this new technological era of democratic participation, data anonymity rises to the forefront.

Health care

If we need to have a medical service, we expect that the hospital or clinic will keep this information private. Publicly accessible medical records can lead insurance companies to raise our rates, employers to consider terminating contracts, or malicious individuals to target us. Medical data is definitely important for charting social trends, but unlinking individuals from their personal data through data anonymization ensures everyone has unimpeded access to health services.

Hard-copy records may be a thing of the past, but ensuring the privacy and anonymity of digital records is still the problem. With the increasing rise of data leaks and cyber attacks, it’s more important than ever to ensure that our efforts to seek social and medical services are protected. For example, when it comes to women seeking out-of-state abortion services who may face community or legal reprisal, digital anonymity can be a part of ensuring safe passage.

Whistleblowing

A lot of people work for private, public, and government enterprises, and they are potential witnesses to a lot of wrongdoing. But if these workers need to report abuses, maintaining their anonymity with journalists or regulators can allow them to safely speak freely about the oppression or exploitation of others without fear of reprisal. Anonymity can ensure social justice.

But even journalists, for example, need assured technological means to preserve the anonymity and security of their communications with sources. It is for this reason that private and end-to-end encrypted messaging apps have been such a revolution: not to allow people to commit crimes, but to finally have the privacy and security in communicating with others through a digital space ridden with malicious adversaries.

The list goes on. And it is for this reason that we need to stop stigmatizing anonymity when it comes to digital privacy. The threats surrounding us are too many, varied, and omnipresent.

The urgency of online anonymity

Political “rights” do not rain down to us from the sky as unconditional truths, and the right to privacy is a case in point. Rights are demanded, fought for, enacted, and defended over time in relation to the real needs of human beings. In some contexts, privacy might be needed to fight against oppression and injustice, and in others simply to live better and more safely with others.

With all the threats posed to our digital lives, online privacy needs to continue its course to becoming a globally recognized right. But in this digital day and age, anonymity is a necessary tool to achieve it in particular contexts where privacy is under threat. In fact, the need for online anonymity is so much more urgent given the sheer amount of personal information that is being made available online, often without our knowledge and genuine consent.

Anonymity is meaningful particularly in a social context like the public web in which we need to act without compromising our personal identity. Anonymity allows us to be contributing members of society while shielding us from its dangers. The examples above are just some moments when this really matters.

Who’s Afraid of Anonymity?

Nym is dedicated to exploring all social needs for digital anonymity and what it would take to technologically make it possible. So this is the first installment in a series that will explore the ins and outs of online anonymity, its real world consequences and use cases, and the way it is being unjustly politically targeted.

In coming installments we will explore the role of anonymity in voting online and off, the need to shield our identities when seeking health services, and in protecting ourselves while speaking out against social and institutional injustices. So stay tuned!

Who’s afraid of anonymity? was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Nym node min profit margin and max operator cost implemented

Here’s what the change means for operators and stakers

Earlier this year, the Nym node operator community voted on whether to implement a minimum profit margin and maximum operator cost.

Languages: Русский // Türkçe // Bahasa Indonesia // 中文 // українська мова // Tiếng Việt

This came from a suggestion by a community member as a way to prevent a “race to the bottom” when it comes to operator rewards and to ensure that operators can make a decent earning for providing a good service.

With the Caramello release last week, this community decision went on-chain. The core team has been monitoring how this has affected the distribution of rewards across Nym node operators and stakers. With this change, average operator rewards have doubled from 1.4 NYM per epoch (about 24 NYM per day) to more than 2.8 NYM per epoch (51.5 NYM per day) for high performing nodes of min. 20% stake saturation. Staker APY has changed roughly from 13% to 10%.

Before going into the details, let’s take a look at the bigger picture.

The bigger picture

The Nym network is starting to see a gradual increase in usage. The NymVPN is in its public Beta stage and now open for testing by anyone in the world. The goal of this new testing phase is to better troubleshoot the app and network ahead of its commercial launch.

Increasing usage of the Nym network also means the Nym core team is demanding more from the operator community. The goalpost of a good quality Nym node is moving regularly as new releases roll out and performance measurements come in (so check here to see if you are still a Panda).

Note: Very soon, any node that falls behind the past 3 releases will no longer be eligible to be in the active set. The Operators Guide changelog is the best place to follow up on new versions, updates, and release notes.

Meanwhile, the rewards going to Nym nodes must reflect the efforts that will be needed in these next few months to get the network up to scratch.

Adjustments to Nym token economics

The Nym network has been live for nearly two years now. This means that there is now real world data on how NYM token economics is performing. The primary objective of the token economic design is to ensure top quality of service and coverage of what may likely become the world’s most powerful privacy network.

One of the team’s findings is that staking, which was intended as an economic measure of community confidence in the quality of a given node, hasn’t worked very well. One possible reason is a tendency to “stake and forget,” leaving NYM tokens idle on outdated nodes instead of contributing to ensuring good quality.

At the same time, the community has been experiencing turbulent crypto markets, leaving some operators struggling and out of pocket. The Nym team knows this has been difficult, but is confident that the value of the NYM token is fundamentally linked to the performance of the Nym network, which is just now coming to fruition.

For these reasons, emphasis will be increasingly placed on performance rather than reputation as a proxy for good service. It is good to keep this bigger picture in mind when analyzing the reward data following last week’s change.

New minimum and maximum parameters

As of last week, the following ranges were set for operator cost and profit margin:

- Profit margin (PM): 20%–50%
- Operator cost (OC): 0–1,000 NYM

These ranges were set on-chain and apply to all Nym nodes. This means that operators cannot set their PM and OC outside of these ranges going forward, and values which fell outside of these ranges were clipped to the maximum/minimum.

Nym nodes are selected to do work for periods of 1 hour called epochs. If they have good performance and high stake, they will have a higher chance of being selected to do work and receive rewards. Here is a simplified overview of OC and PM and how these parameters determine the share of rewards that node operators and stakers receive.

OC is a monthly value meant to cover the costs of actively operating a Nym node. OC is paid first by the reward algorithm before any other rewards. This means that stakers only start receiving staking rewards after OC has already been awarded to the operator.

Note: The OC parameter assumes 100% activity. For example, if your node is selected to do work for 50% of the epochs in a month, the OC for that month will be 50% the OC you configured. So if you set OC at 500 NYM and your node is selected to do work for 50% of the epochs throughout the month, then your node’s OC will be 250 NYM for that month.

PM is the node operator’s share of all the rewards that their node generates (after covering OC). Let’s say your node has a 20% PM and it generates 100 NYM in profit for an epoch. In this case you receive 20 NYM as operator rewards, and the remaining 80 NYM is split among the stakers of your node proportionally to the size of their stake. As the operator, you must have some stake on your own node.
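Combining the OC and PM rules and the example figures above into a single monthly calculation, here is a minimal sketch of how a node’s rewards might be split between operator and stakers. It is an illustration only; the actual on-chain reward algorithm has more inputs (stake saturation, performance, active-set selection, and so on).

```python
def split_monthly_rewards(gross: float, oc_monthly: float, active_share: float, pm: float):
    """Split a node's monthly rewards between operator and stakers (illustrative only).
    gross: total rewards the node generated this month (NYM)
    oc_monthly: configured operator cost, assuming 100% activity
    active_share: fraction of epochs the node was selected to do work
    pm: profit margin, clipped to the on-chain 20%-50% range
    """
    pm = min(max(pm, 0.20), 0.50)               # enforce the community-voted bounds
    oc = min(oc_monthly * active_share, gross)  # OC is prorated by activity and paid first
    profit = gross - oc
    operator = oc + pm * profit                 # operator keeps OC plus their margin
    stakers = (1 - pm) * profit                 # remainder is split by stake size
    return operator, stakers

# OC of 500 NYM at 50% activity prorates to 250 NYM; with a 20% PM,
# 100 NYM of profit splits into 20 NYM (operator) and 80 NYM (stakers).
print(split_monthly_rewards(gross=350.0, oc_monthly=500.0, active_share=0.5, pm=0.20))
# -> (270.0, 80.0)
```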

Impact of minimum profit margin and maximum operator cost

Maximum PM and OC are simple safeguards protecting stakers from abusive behavior. Reducing the minimum OC from 40 NYM to 0 NYM was requested by a few old node operators with highly saturated nodes — OC doesn’t matter to them as much anymore, and 0 is a nice round number.

The most impactful change is the implementation of a minimum PM. Previously the network-wide average PM was around 5% and some nodes had even lower values at 1% or even 0%. As the minimum is now 20% across the board, the share of operator and staker rewards shifted notably.

Staker APY has changed roughly from 13% to 10%. Meanwhile, average operator rewards have doubled from 1.4 NYM per epoch (about 24 NYM per day) to more than 2.8 NYM per epoch (or 51.5 NYM per day) for high performing nodes with a min. 20% stake saturation.

These changes are in line with the projections shared and discussed with the Nym community during the vote on instituting the minimum profit margin.

Note: Not all operators will necessarily see an effect of these changes. The actual increase in rewards depends on their individual parameters and performance. For example, not all operators will be in the active set in every epoch, so their rewards will not reach the top level.

The changes only come into effect when a node actually gets into the active set and gets their first reward since the contract update. Underperforming nodes might therefore still show in the explorer as having a PM or an OC outside of the currently permitted bounds.

What next?

Nym node rewards will be changing significantly over the next months in line with the Nym roadmap. As announced earlier this year, this includes the completion of “project smoosh,” gateways joining the reward set (with reward ratios also the result of community vote), and credentials kicking in as evidence of work done in the network.

The Nym team has collated all of these upcoming changes in a Nym node roadmap. The roadmap will be presented at a Nym node town hall that will take place on 10 October.

All current and potential Nym node operators and stakers are invited. Further details, agenda, and invitations will circulate over the next couple of weeks. In the meantime, please reach out to us on Telegram or elsewhere with any questions you have about these changes and how they affect you!

Resources

Nym Node Operator Guide

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Nym node min profit margin and max operator cost implemented was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Greylock Partners

How These Software-as-a-Service Startups Are Making Life Harder for Everyone Else

The post How These Software-as-a-Service Startups Are Making Life Harder for Everyone Else appeared first on Greylock.

Global Digital Finance

GDF and Hogan Lovells Digital Assets Summit Update


Global Digital Finance (GDF) and Hogan Lovells hosted their 5th annual Digital Assets Summit, a key event that brought together industry leaders, policymakers, and regulatory experts to discuss the growing adoption of digital assets. This year’s summit focused on the continued institutional adoption of tokenized assets, the development of regulatory frameworks, and the critical role of interoperability and digital payments in driving the future of finance.

Read the full update from the Summit below.

The post GDF and Hogan Lovells Digital Assets Summit Update appeared first on GDF.


Nym - Medium

Nym welcomes new researcher to its censorship resistance team

Ramping up the fight against global censorship

Combatting censorship is one of the biggest challenges that Virtual Private Networks (VPNs) face.

Languages: Русский // Türkçe // Bahasa Indonesia // 日本 // 中文 // Español // Português // українська мова // Tiếng Việt

Censorship measures against privacy technologies are becoming more intense worldwide, especially in authoritarian countries. The future of VPN privacy technology will be found only in novel network designs which can circumvent them.

Recognizing that the stakes for millions of users are the freedom of information and expression, Nym is rising to the challenge by expanding its team of researchers and developers. Yesterday, Nym welcomed Jack Wampler as the newest addition to its censorship resistance team.

“Censorship resistance is important because information access is leveraged as a weapon to squash dissent and remove accountability,” Jack Wampler reminds us. “Circumvention ensures that those who need it most have the ability to: access reliable news and education materials, communicate with loved ones during times of crisis, and organize without fear of reprisal.”
Jack’s path to Nym

Jack completed his Ph.D. in Computer Engineering with a focus on Privacy & Security from the ECEE department at the University of Colorado Boulder. His work explores ways to deploy solutions for security, privacy, and anonymity, especially as they involve cryptography, hardware security, and emerging internet protocols.

He is the core developer of Conjure, the first widely deployed refraction networking proxy, built primarily in Golang for censorship circumvention. Conceptually, refraction networking works by incorporating the network infrastructure (Internet Service Providers, etc.) into proxy systems to make censorship more difficult.

Jack is also actively working on Rust implementations for pluggable transports and network traffic obfuscation as part of a Pluggable Transports in Rust (PTRS) project. The goals are to (1) lower the barrier to entry for writing secure pluggable transports in Rust, and (2) to implement several cutting-edge protocols using that framework.

Jack’s work at Nym

Jack is joining the Nym team as a Research and Development Specialist to help design the next generation of privacy technologies. Alongside a growing team of censorship resistance researchers and testers on the ground across the world, Jack will help Nym provide secure and safe communications for users in regions where it matters the most. This includes deploying cutting edge protocols, ensuring resistance against future cryptographic threats like quantum computing, and developing measurement techniques to understand the needs of numerous unique user-bases.

“One of the biggest challenges in community driven censorship resistance is finding ways to support your volunteer community,” Jack notes. “Nym is at this unique intersection of network privacy and web3 technologies. It has more agility than a traditional VPN, supported by the distributed nature of the network which ensures that everyone contributing is backed by the community as a whole. This is a powerful combination with huge potential for positive impact when married with circumvention technologies.”
Conclusion

Please join us in welcoming Jack to the Nym team! Fighting censorship requires the best minds and technicians, and Nym is thrilled to put Jack’s extensive expertise to work for NymVPN users worldwide. Check out his research projects here.

And stay tuned for more news this week and a new series of posts providing a deep dive on the threat of censorship today and what Nym is doing about it.

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Nym welcomes new researcher to its censorship resistance team was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Brave Browser

Nebula: Brave’s differentially private system for privacy-preserving analytics

Introducing Nebula, a novel and best-in-class system developed by Brave Research for product usage analytics with differential privacy guarantees.

This post describes work by Ali Shahin Shamsabadi, Peter Snyder, Ralph Giles, Aurélien Bellet, and Hamed Haddadi. This post was written by Brave’s Privacy Researcher Ali Shahin Shamsabadi.

We are excited to introduce Nebula, a novel and best-in-class system developed by Brave Research for product usage analytics with differential privacy guarantees. Nebula combines several cutting-edge privacy enhancement technologies (including verifiable user-side thresholding and sample-and-threshold differential privacy) to get useful insights about the product usage/feedback of a population (i.e., many Brave users) without learning anything about the choices of individuals in the population (i.e., each individual Brave user).

Nebula benefits users: it guarantees user privacy without requiring prohibitive trust assumptions. Nebula also enables Brave to achieve better utility and lower server-facing overhead compared to existing solutions based on the local and shuffling models of differential privacy.

Brave Research shares the implementation so that other projects can use it.

Product analytics 

Developers need to learn how Brave features are being used by many users in order to improve the user experience on the Web. For example, Brave developers need to learn how well Brave’s privacy-preserving Cookie Banner removal works by asking the question “If you have viewed the cookie consent block prompt, how did you react?” However, each individual user’s choice needs to be private. As shown in the chart below, Nebula allows Brave to determine how much and where resources need to be allocated to block privacy-harming cookies, and to understand the popularity order of reactions, without learning the reaction or cookie banners that each individual user sees. Nebula’s differential privacy guarantees ensure that the same conclusions (for example, that most users react by blocking cookie notices when they see cookie banners) can be made independently of whether any individual user opts into or out of contributing their answer to the above question.

Nebula enables developers to learn how Brave features are being used and to improve user experience on the Web, without risking users’ privacy. 

Nebula puts Brave users first in product analytics

Nebula guarantees users’ privacy. Brave cares about safeguarding the privacy of our users. Nebula provides formal differential privacy guarantees ensuring that the product analytics leak as little as possible about each user’s choice. Brave does not even get to know the presence or absence of a user, let alone learning their data. Differential privacy is a robust, meaningful, and mathematically rigorous definition of privacy.

Nebula enables better auditability, verifiability, and transparency. Nebula’s usage means that users do not have to blindly trust Brave. Users are always in control of whether they want to contribute any data, know which data they are contributing, and when they are contributing. We open-sourced Nebula as part of Brave’s STAR. In previous blog posts, we’ve talked about how we originally designed our Privacy-Preserving Analytics system and then improved it with Brave’s STAR. Now we’ve developed a further improvement using differential privacy, giving our users unmatched privacy protection.

Nebula is efficient with very little user-facing cost. Local computation on the user side similarly requires very little effort.

Nebula’s inherently efficient design ensures that companies of all sizes can affordably protect their users’ privacy in their product analytics while also minimising negative environmental impacts.

Nebula design

At a high level, Nebula works as follows:

Local and verifiable user-side sampling. The user decides to submit their data with a small probability; otherwise, they abstain from submitting their real data.

Local user data encryption. Users that do decide to participate (based on the outcome of their coin flip) locally encrypt their value using Brave’s STAR secret sharing scheme. This secret sharing process has a negligible (compute/memory) overhead for users.

Dummy data. A small number of users submit additional dummy data to obscure the distribution of uncommon (i.e., unrevealed) values.

Data aggregation. Brave recovers values and their associated counts using the inverse of the secret-sharing system which ensures that Brave cannot learn user values unless sufficiently many users sent the exact same value. 
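For intuition, that workflow can be sketched as a toy simulation. The version below replaces the encryption and secret-sharing steps with a plaintext count-and-threshold on the server, purely to show how user-side sampling, dummy records, and thresholding interact; the probabilities and threshold are made-up values, and this is not Nebula’s actual implementation.

```python
import random
from collections import Counter

SAMPLE_P = 0.1    # probability a user contributes at all (user-side sampling)
DUMMY_P = 0.02    # small chance a user also submits a dummy record
THRESHOLD = 20    # values reported fewer times than this are never revealed

def client_report(true_value: str, possible_values: list[str]) -> list[str]:
    """User-side step: maybe contribute the real value, maybe add a dummy record."""
    report = []
    if random.random() < SAMPLE_P:
        report.append(true_value)                      # Nebula would secret-share this
    if random.random() < DUMMY_P:
        report.append(random.choice(possible_values))  # dummy data hides rare values
    return report

def aggregate(reports: list[list[str]]) -> dict[str, int]:
    """Server-side step: only values reported by enough users are recovered.
    (Nebula enforces this threshold cryptographically, not in plaintext.)"""
    counts = Counter(value for report in reports for value in report)
    return {value: count for value, count in counts.items() if count >= THRESHOLD}

answers = ["block", "accept", "ignore"]
population = random.choices(answers, weights=[0.7, 0.2, 0.1], k=10_000)
reports = [client_report(answer, answers) for answer in population]
print(aggregate(reports))  # popularity order emerges; individual answers do not
```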

Nebula satisfies differential privacy

Nebula enforces formal differential privacy protection for users through three steps: 1) the uncertainty of any particular user contributing any value; 2) blinding Brave to uncommon values through the secret-sharing mechanism (i.e., thresholding); 3) having some users contribute precisely defined amounts of dummy data to obscure the distribution of uncommon values.

Acknowledgements 

We would like to thank Shivan Kaul Sahib, Darnell Andries, and François Marier for their feedback and for their help with infrastructure implementation and production maintenance.

Tuesday, 17. September 2024

Circle Blog

USDC and CCTP are coming to Sui

We’re excited to announce native USDC and Cross-Chain Transfer Protocol (CCTP) are coming soon to Sui!



USDC now available in Brazil and Mexico

Fiat transfers through real-time payment systems accelerate access to digital dollars.



Nym - Medium

Nym Warpcast art contest: Bring the Noise!

Earn NYM for your art and memes about network noise

Following the success of Nym Art Gallery contest on X/Twitter, Nym is launching another round for Warpcast users!

Languages: Русский // Türkçe // Bahasa Indonesia // 日本 // 中文 // Español // Português // українська мова // Tiếng Việt

Want to win some NYM for your cool arts, designs, or memes? Don’t miss your chance to participate before 30 September.

Here’s everything you need to know about how to participate to Bring the Noise.

Theme: Bring the Noise!

Noise: Glitch, interference, camouflage, scrambling, asynchrony, confuse the adversary, the unclear. What does all this have to do with privacy? A lot, actually.

Surveillance systems work by closely tracking the microdetails of all of our interactions and communications online. Disturbing these privacy-invasive technologies requires novel network tactics, like making a network so noisy that no one can be identified and tracked.

These days we can only be truly private online if we have a sea of network noise into which we can disappear. This noise comes from the community of network users, and so privacy loves company.

So what does noise mean to you? How can we imagine drowning out the thousands of snoopers of surveillance capitalism? Show us how you all envision noise in the fight for network privacy!

Prizes

Each day, a prize of 1,000 NYM ERC-20 tokens will be awarded to one winner.

14,000 NYM are up for grabs, and daily winners will be chosen by the Nym team.

Contest timeline

16 September, 00:00 CET — 30 September, 00:00 CET

How to enter

The rules are simple:

- Follow NYM on Warpcast
- Join the channel /bringthenoise
- Recast the ANN Cast
- Cast in the channel /bringthenoise artwork or memes around the themes privacy, noise, censorship, surveillance capitalism

One cast per day will count toward your chance to win 1000 NYM tokens.

Art for privacy

With the coming launch of the NymVPN, the first commercial product to run on Nym’s noise-generating mixnet, the Nym team wants to make as much noise around noise as possible. People need to know how at risk their personal data is online, and that protecting it requires inventing novel defense strategies. Network noise is one of these, and art can help us get these tools into people’s awareness and hands.

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Nym Warpcast art contest: Bring the Noise! was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 16. September 2024

Circle Press

Circle & Sony Block Solutions Labs to Enable USDC on Soneium


Singapore, SINGAPORE, 16 September 2024 — Circle, a global financial technology firm, today announced a strategic collaboration with Sony Block Solutions Labs to drive innovation and creativity through decentralized technologies on Soneium, a public Ethereum layer 2 blockchain ecosystem. Through this initiative, Soneium will integrate Bridged USDC Standard and establish bridged USDC as one of the blockchain’s primary tokens for value exchange.

Saturday, 14. September 2024

a16z Podcast

Apple’s Big Reveals, OpenAI’s Multi-Step Models, and Firefly Does Video


This week in consumer tech: Apple’s big reveals, OpenAI’s multi-step reasoning, and Adobe Firefly’s video model.

Olivia Moore and Justine Moore, Partners on the a16z Consumer team, break down the latest announcements and how these product launches will shape the tech ecosystem, transform AI-powered experiences, and impact startups competing for attention in a fast-moving market.

 

Resources: 

Find Justine on Twitter: https://x.com/venturetwins

Find Olivia on Twitter: https://x.com/omooretweets

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 13. September 2024

Panther Protocol

Panther Protocol’s Vision for Secure and Private DeFi


Panther Protocol will provide private access to DeFi while providing access to compliance tools supporting the new frontier of on-chain finance. Last month, Panther Ventures’ Head of Product, Saif Akhtar, and Chief Architect, Vadim Konstantinov, presented the protocol’s latest vision for a secure and private decentralized finance (DeFi) ecosystem at the Ethereum Community Conference 2024 (EthCC). This blog delves into that vision. 

A New Era for DeFi: Privacy and Compliance

Panther Protocol’s mission is to provide on-chain privacy infrastructure for regulated financial services providers, enabling them to operate private trading Zones with customizable rules that can be aligned with local or regional regulatory requirements. As regulatory scrutiny intensifies, financial service providers who deal in digital assets face increasing challenges in finding solutions that are compliant without compromising the privacy and security of their users. Panther Protocol is being developed specifically to address this challenge.

KYC and KYT Attestations: Ensuring Trust and Security

KYC (Know Your Customer) and KYT (Know Your Transaction) attestations are critical in today’s regulatory landscape. In its initial releases, Panther Protocol will support the use of first and (for use where permitted) third-party KYC/KYT services, including Panther Protocol partner, PureFi.

However, Panther’s use of circuit-friendly signatures and Zero-Knowledge Proofs will enable verification without exposing sensitive data, thus preserving even more user privacy. Where allowed, Panther’s Zero-Knowledge Proofs could be used to establish that regulatory conditions have been met. In the future, we believe using Zero-Knowledge Proofs will become the preferred way to satisfy regulatory requirements while preserving the user’s privacy. 

Panther’s client-side proof generation will also be conducted through users’ browsers, as this method best preserves the privacy of the user’s information. Panther will use the Groth16 proving system to meet basic browser requirements while minimizing computational resource demands.  

zAccount and Zones: A Dual-Layer Approach to Privacy

zAccounts and Zones are integral components of Panther’s architecture. zAccounts function as private on-chain identifiers that link user transactions without revealing personally identifiable information (PII). Zones, configurable by operators, enforce KYC/KYT rules and transaction limits, helping to align activities within the Zone to compliance standards. This dual-layer approach allows VASPs, and other regulated operators, to tailor privacy and compliance measures to their specific needs.
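As a conceptual sketch of how a Zone operator might express such rules (the names, fields, and limits below are hypothetical, not Panther's actual API), a Zone policy check could look like this:

```python
from dataclasses import dataclass, field

@dataclass
class ZonePolicy:
    """Hypothetical Zone configuration: allowlisted assets/participants and a per-transaction cap."""
    allowed_assets: set = field(default_factory=set)
    allowed_participants: set = field(default_factory=set)  # e.g. zAccount IDs with valid KYC attestations
    max_tx_amount: float = 10_000.0

def admit_transaction(policy: ZonePolicy, sender_zaccount: str, asset: str, amount: float) -> bool:
    """Return True if the (otherwise private) transaction satisfies the Zone's compliance rules."""
    return (
        sender_zaccount in policy.allowed_participants
        and asset in policy.allowed_assets
        and amount <= policy.max_tx_amount
    )

# Example: a Zone that only admits KYC'd zAccounts trading two allowlisted assets
zone = ZonePolicy(allowed_assets={"zZKP", "zUSDC"},
                  allowed_participants={"zAcct-001", "zAcct-002"})
print(admit_transaction(zone, "zAcct-001", "zUSDC", 2_500.0))  # True
print(admit_transaction(zone, "zAcct-999", "zUSDC", 2_500.0))  # False: not allowlisted
```

In the protocol itself such checks would be enforced cryptographically rather than by a trusted server; the sketch only shows the shape of the rules a Zone operator configures.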

Data Escrow: Safeguarding Transactional Privacy

When transactions occur within Panther’s Shielded Pools, they will generate encrypted data that is stored in a data escrow system. This transaction data will be encrypted using Zero-Knowledge Proofs and zkSNARKs, ensuring that the details remain confidential and can only be accessed under predefined circumstances (rules set by a Zone Manager). The escrow system will be configurable, where allowed, to be managed by the Forensic Data Escrow Operator (FDEO), which holds the decryption key necessary to access the escrowed data but cannot do so unilaterally. Access to the escrowed data will require the activation of the Zone’s rules, as defined by the respective Zone Manager. Alternatively, the escrow system could be configured to provide unfettered access to the Zone Manager, as required. 

When regulators require information (for example, for the purpose of submitting a suspicious activity report, or when requested by law enforcement), the protocol will be able to be configured so that the Zone Manager can access it on demand, relaying relevant data to regulators as required. Alternatively, where permissible, this data could be decrypted and disclosed in accordance with pre-programmed rules set by the Zone Manager.

The system also supports selective disclosures, allowing specific parts of a transaction history to be revealed (for example, for auditing purposes) and providing mathematical proofs to validate these disclosures without revealing the entire transaction history. Disclosing information in this way ensures that only the minimum information necessary for the check or audit is revealed without including any additional data. This mechanism balances user privacy with compliance needs, leveraging advanced cryptographic techniques and a collaborative access model.
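As a simplified illustration of the selective disclosure idea (using a plain Merkle inclusion proof rather than the zkSNARK machinery Panther describes; the transactions below are made up), a single entry can be proven against a commitment to the whole history without revealing the other entries:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Commit to a whole transaction history with a single hash."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Siblings needed to recompute the root from one leaf (the selective disclosure)."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root, leaf, proof):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

txs = [b"tx-1: deposit 100", b"tx-2: swap 40", b"tx-3: withdraw 10", b"tx-4: deposit 5"]
root = merkle_root(txs)
proof = merkle_proof(txs, 2)                        # auditor asks about tx-3 only
print(verify(root, b"tx-3: withdraw 10", proof))    # True, without revealing tx-1/2/4
```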

Smart compliance: the future of compliant privacy

Smart Compliance includes zAccounts, Data Escrow, and KYT Attestations, which will help align Panther’s Zones to regulatory standards and also reduce the risk of illicit activities by enforcing rules: allowlisting customized lists of participants and assets, placing limitations on transactions, and more. While Panther Protocol will be used in its initial releases by Zone Managers who use first-person KYC/KYT with full access to the FDEO’s transaction data, in the future (and where allowed today), we expect that Zero-Knowledge smart compliance will become the preferred method of providing evidence of compliance. Using a combination of the technologies described above will preserve the maximum privacy possible for users. 


BlueYard Capital

CPTx


Since its discovery in 2006, programmable DNA nanofabrication with DNA origami has attracted immense interest due to excellent programmability and biocompatibility with many applications including in cell and gene therapies, nanoelectronics and sensors, and delivery technologies in diverse areas such as cancer immunotherapy and even in materials science. However, the deployment and commercialization of the technology has thus far been slow and commercial traction limited due to challenges in scalability and in designing and assembling the DNA structures themselves.

CPTx founder and CEO Hendrik Dietz has been at the forefront of applied DNA nanofabrication research for decades and his work, which has been published in leading scientific journals, has become the centerpiece of the company. The first applications are tackling two complementary areas (a) creating a pathogen agnostic antiviral platform designed for targeted viral interventions and (b) the large-scale production of short and gene-length DNA single strands at massive scales (“GXstrands”). Both the antiviral platform and GXstrands applications are interrelated: the large-scale production of specific oligonucleotide pools is critical in creating more specialized antiviral motifs for a broader range of viral pathogens; while the high-quality gene-length DNA strands produced by GXstrands will be used in the R&D phases of the new antiviral therapeutics, accelerating the time-to-market for additional iterations of the product.

Our Thesis

There are over 200 known viruses that can infect humans, causing a wide range of diseases, from mild illnesses like the common cold to severe and life-threatening conditions like HIV/AIDS, hepatitis, and COVID-19; not to mention the constant threat of novel zoonotic-related viruses (or the risk of viruses as emerging bioweapons). Although successful drugs can generate billions in revenue (such as the hepatitis C drug combo from Gilead which made $10 billion in 2015 alone), there are only limited options for effective antiviral treatments. We believe CPTx has an opportunity to leverage their knowledge and capabilities in DNA nanofabrication to build the first truly programmable antiviral platform, as well as to produce high-quality oligonucleotide pools and gene-length DNA single strands to supply the growing demand for nucleic acids, particularly from emerging cell and gene therapies, with the aim to become a leading supplier for the entire pharma industry.

We are excited to be backing CPTx on their mission to pioneer next-gen antiviral and biodefense solutions, based on programmable DNA nanofabrication. Read more about their recent $29m financing here.


Circle Press

Circle Announces New Global Headquarters in New York City

The 87th Floor of One World Trade Center to be Circle’s Flagship 



Epicenter Podcast

Skip Protocol: The API for Seamless Cross-Chain Experiences on Cosmos & Beyond - Sam Hart


Interoperability is the holy grail for a multi-chain future or the ‘internet of blockchains’. However, while IBC (inter-blockchain communication protocol) revolutionised permissionless cross-chain transfers, it was Skip Protocol’s Go API that took advantage of Cosmos’ infrastructure (and its numerous messaging protocols), to create an end-to-end interoperability platform, allowing developers to design seamless cross-chain user experiences.

We were joined by Sam Hart, head of product at Skip Protocol, to discuss the challenges of interoperability across Cosmos & Ethereum, and how Skip:Go API & Skip:Connect push the crosschain boundaries towards end user adoption.

Topics covered in this episode:

Sam’s background
The Other Internet
Skip Protocol: bringing seamless interoperability to Cosmos
Monetization for Skip:Connect & Skip:Go
The Skip:Connect oracle and how Cosmos chains pull data
Cross-chain MEV
MEV blockers
Ethereum vs. Cosmos ecosystem development and activity
Ethereum bridges vs. Cosmos IBC
Cross-chain interoperability
Skip’s multi-chain expansion

Episode links:

Sam Hart on Twitter Skip Protocol on Twitter The Other Internet

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Friederike Ernst.


PIVX

Introducing PIVX Toolbox V3 (Layman’s Changelog) by PIVX Core member, PalmTree.


Hello, Welcome to Version 3 of the PIVX Toolbox.

First, let’s look at some of the more notable changes:

Unified styling on ALL pages (result pages now look similar to form and home pages)
Even more mobile friendly compared to previous version
Simpler user experience throughout the site
Completely re-written front end utilizing the latest Bootstrap 5.3.x
Removed jQuery dependency
True PIVX address detection and verification (previous version was pretty janky)
Exchange addresses now fully supported
Added hover animations and tooltips throughout site
Themed language dropdown
Simpler, faster, more modern looking navigation menu
Webkit (Chrome based browsers) now have purple themed scroll-bars
Reduced codebase contributes to an overall faster performance
More noticeable errors and alerts
Higher contrast button colors for better accessibility

It’s crazy, but the current iteration of the PIVX Toolbox (I’m calling it version 2) has been around for almost 4 years with almost 100% uptime, working tirelessly and reliably to help PIVX users. If memory serves, version 1 was made in collaboration with Zenzo. While it was a start, it under-delivered in almost every way. Here’s an old screenshot comparing V2 to V1 for those that don’t remember:

A Few Before and After Pictures For Comparison:

V2-V3 performance comparison. A reduced code-base contributes to an overall performance increase.

My Transactions, simplified and easier to read. All pages (home, form, result) now have a similar theme with rounded and centered icon on top and similar backgrounds. Much less confusing. Also added is a permalink button so users can bookmark their addresses.

Masternode Status Check has similar improvements. Much easier to decipher results. Like the Transactions page, a permalink button is added so users can bookmark their Masternode status.

Rewards Estimator is simplified with a single toggle to include Masternodes. Notice the better contrast buttons.

All help sections boast significantly improved formatting and readability, for example this Snapshots help screen.

Themed language drop-down navigation with auto drop-down on large screens

Simpler navigation menu along with more modern looking hamburger and language buttons (removed backgrounds and borders)

Better errors and alerts, also notice the proper handling of PIVX addresses.

All modals (pop up boxes) have improved contrast for better visibility

Unified theme example, home, form, and result pages look similar.

Conclusion

Almost every file has been modified. Some slightly, others completely replaced. Also note that NO DAO funds have been used to create this Masterpiece (well, maybe not a Masterpiece but definitely better than before). Is there room for improvement? ABSOLUTELY! But in this case, you get more than you paid for.

Enjoy, Palm Tree
[PIVX Core Member, Community Contributor Web Development, Creator of PIVX Tool Box.]

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

Introducing PIVX Toolbox V3 (Layman’s Changelog) by PIVX Core member, PalmTree. was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.


Nym - Medium

Nym Squads speak out: TupiNymQuim on the ground in Brazil

Reflections on the fight for democracy, online privacy, and freedom

Following last week’s Nym Dispatch on the recent banning of X/Twitter in Brazil, as well as the use of VPNs to access the platform, Nym reached out to one of our beloved and dedicated Squads on the ground in Brazil, TupiNymQuim, to get their response and perspective.

Languages: Русский // Türkçe // Bahasa Indonesia // 日本 // Português // 中文 // Español // українська мова

A bit of background: TupiNymQuim is Nym’s first Brazilian community squad. Its name comes from the combination of “tupinimquim,” a way of referring to “things made in Brazil” and its indigenous people + Nym. The squad focuses on the decentralized management of the Lusophone community, operating more than 20 nodes, and representing Nym at events. The group is made up of programmers and software engineering students with an interest in hacktivism and the defense of privacy.

The following is a guest post authored by two members of TupiNymQuim on what is happening in their country regarding the social effects of big tech and social media platforms, democratic struggles, and the free access to information and communication technologies.

X (Twitter), VPNs, and the threshold between privacy, freedom, and crime

The issue is simple: if VPNs did what they were supposed to do, governments wouldn’t know which sites you access. But things aren’t so simple.

The problem

As commentators on the recent escalation surrounding the suspension of X in Brazil, we think it’s important to refuse two things simultaneously:

A false and empty notion of the freedom of expression to legitimize crimes and extremist political projects
The overthrow of a platform that still remained relevant to the Brazilian public debate

Even if the criticism of the worsening quality of X, formerly known as Twitter, was legitimate, the reality is that the platform was still a relevant vehicle for information and communication in Brazil, especially in the political sphere.

But the controversy surrounding Minister Alexandre de Moraes’ decision to take down the platform in Brazil is the culmination of a more serious problem. How do we behave towards individuals who have a mega-infrastructure (and fortune) being used to make their extremist political positions echo so loudly?

The apparent immunity with which big tech companies operate behind their supposed political neutrality is just a chimera aimed at protecting their own image and business. When politically relevant, these companies end up supporting political projects out of self-interest, as recently happened in the Brazilian Congress.

The truth is that it’s easy to stay out of controversy when the criteria for what can and can’t be viralized are in their custody. Or rather, when it’s hidden in thousands of lines of closed code, ultimately unknown to society as a whole.

If big tech and X present themselves as convenient means of accessing information relevant to public life, why aren’t they auditable? And more importantly, what can we do when a select owner of one of these networks shouts in favor of an extreme right-wing agenda?

Why do billionaires see themselves as superhumans?

In the end, Elon Musk simply saw another business opportunity with his control of a network that wields such an enormous influence on political debate. Instead of strategic silence, the entrepreneur opted for the targeted use of X’s toxic noise in order to increase his revenue in other businesses.

Source

Hypocrisy is hardly enough to describe Musk’s behavior. By rivaling a young democracy like Brazil’s, which recently resisted and defeated an anti-democratic coup, and by silencing himself and posing alongside authoritarian leaders, Musk reveals that his fight for the freedom of expression ends when it is no longer convenient.

Instead of questioning the absurdities of Musk’s Trumpist stance (and its necessary link to the implosion of democracy), we might as well ask: why do we allow the voice of a single individual, especially a wealthy one, to echo so loudly? Why do we let him dictate the axes of this debate on his network? Why do we listen to his empty rantings instead of putting him in his place?

We say: he is no more than a parasite who doesn’t care about any agenda other than his own enrichment.

The truth is that society lets this abyssal discrepancy grow as we fail to combat big tech. This happens when we accept abusive privacy terms, let them interfere in our elections, or allow network infrastructures to be increasingly concentrated in their hands.

In addition to combating Musk’s extremist messages, we need to regulate the actions of individuals who wield so much influence over the structure of communications, giving them disproportionate influence over ideological views (whether democratic or anti-democratic).

The democracy of judges

Moraes [who declared the ban on X] was appointed to the Supreme Court by the previous president Michel Temer. Temer took over the presidency after carrying out a coup in coalition with Congress to oust the then-president Dilma Rousseff. Moraes was also Temer’s Justice Minister before being appointed to the Supreme Court.

Having maintained close ties with the right-wing in São Paulo for more than 20 years, Moraes saw the political niche in which he had built himself collapse. After 2016, the traditional Brazilian right, which had been building itself up over 30 years of democracy, was quickly replaced by Bolsonarism. With the old right now almost extinct, Moraes migrated from conservatism to the opposite political spectrum.

In 2016, Moraes, as Minister of Justice, cutting cannabis plants in a campaign to “eradicate marijuana in Brazil.”

Having recently voted in favor of decriminalizing marijuana, Moraes now shows that, in addition to being a judge, he is really a political chameleon adapting himself to the situation he finds himself in. How might he have acted if he had been judging not the anti-democratic acts of Bolsonaro in 2023, but rather the beginning of that coup challenge in 2014 made by the candidate Aécio Neves, from his former party, the PSDB? Are the crimes committed by the right-wing political milieu from which Moraes grew up less serious than the opposite spectrum in which he now finds himself?

These rhetorical questions only serve to challenge the fairness of politically appointed judges. They also serve as a critique of the recent concentration of power in the hands of the judiciary and the possible risks to Brazilian democracy, which often sees the institutional position of judges shielded from the abuses they commit and the controversies in which they become involved.

Sculpture “A justiça” [To justice] vandalized. Source

A fundamental principle of democracy is the existence of a balance between the different poles of power (judicial, executive, and parliamentary). However, how can we protect ourselves when this balance is broken? Or when it is theoretically coherent but widely disregarded in practice? What risks do we run when we see a judiciary that increasingly rules by decree and gives extraordinary powers to its leaders?

How do governments watch you?

In a nutshell: through the exchange of information from Internet Service Providers (ISPs), either voluntarily or through judicial subpoenas. In Brazil, some of the ISPs are Claro, Vivo, and TIM. Even if your content is encrypted, all the metadata of your communication (such as where you talk from, who you talk to, or the size of your data — whether it’s text, image, or video) is exposed.

We inherited the choice made not to build privacy into the architecture of the Internet, and now we are suffering from a broken network dominated by large conglomerates — including a few companies forming the basis for the network’s hardware — who are symbiotically linked to corporate and state surveillance.

But all that was solved with VPNs, right?

Unfortunately, no.

Virtual Private Networks (VPNs) as a technology are weak due to an architectural choice: if we don’t trust our ISPs and believe that they will leak our information to third parties, why trust a VPN? In that case, we’re just transferring trust. In a privacy-oriented technology, “having to trust” some institution blindly is a sign of the technology’s weakness.

It would be better to present an open-source, auditable alternative that distributes the influence and management of the network to multiple operators.

Note: We are not supporting the use of X through a VPN right now. The truth is, we don’t even miss X (it was already awful). We’re just warning you that the VPN you use can and probably will reveal your online activity if it’s vehemently subpoenaed (or hacked) — and if you’re using a free VPN, check this out.

Countering the abuse of power through decentralization

This is exactly what we’re doing at Nym: creating a new network infrastructure that allows your data packets to travel in a truly anonymous way.

Source

NymVPN is actually supported by the Nym mixnet, which is a decentralized network of servers (nodes) around the globe and managed autonomously by individuals — whether connected to the Nym community or not.

As well as encrypting your data and metadata, the mixnet adds fake traffic and obfuscates the time pattern of packets. This way, it is impossible to trace a packet within the network: not even the node operators can know whose data packet is being distributed.
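As a rough illustration of two of those techniques (random per-packet delays and dummy cover traffic), here is a toy sketch; it is not the Nym mixnet implementation, and the packet labels and parameters are made up:

```python
import random

def mix_batch(real_packets, cover_rate=0.5, mean_delay=0.3):
    """Toy illustration: attach random delays and interleave dummy packets,
    so timing and volume no longer reveal who sent what."""
    dummies = ["<cover>"] * int(len(real_packets) * cover_rate)
    batch = [(random.expovariate(1 / mean_delay), p) for p in real_packets + dummies]
    batch.sort(key=lambda item: item[0])  # release order is driven by random delay, not arrival order
    return [p for _, p in batch]

print(mix_batch(["alice->bob", "carol->dave", "erin->frank"]))
```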

The Nym mixnet is therefore the only technology that protects you from identification through analysis of the metadata of your network communications.

Thus, distanced from governments and big tech, we are finally enabling online privacy, to the extent that we distribute among ourselves the income, influence, and responsibility for managing this privacy market.

What’s bad can get worse

Finally, it’s worth noting that Musk’s libertarian hypocrisy is just a glimpse of the possibility of a much worse future.

What will happen if companies like Amazon, Microsoft, and Google, which own 65% of today’s cloud servers, start vehemently and publicly expressing their political and economic preferences, as the owner of X now does?

What steps can we take to end government and corporate surveillance of our lives, with its structural leakage through ISPs and VPNs?

How can we extinguish the plague of big tech that parasitizes human creativity, cognition, and privacy in favor of a freer and more democratic Internet?

Indeed, the road is long, but web3 is emerging as an alternative to break these monopolies and to block companies, governments, and cybercriminals from accessing our private lives once and for all.

Signed,

psydenst & supermeia

TupiNymQuim

Originally published in Portuguese on 12 September 2024.

All the opinions expressed are the authors’ own and do not necessarily reflect the views of Nym Technologies, nor do they necessarily diverge from them.

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Nym Squads speak out: TupiNymQuim on the ground in Brazil was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 12. September 2024

Horizen - Blog

ZenIP 42407 Passed – New ZEN Tokenomics for Horizen 2.0


We are excited to announce that the ZenIP 42407 proposal has passed with nearly 100% of participants voting in favor of the proposal!

The ZenIP 42407 proposal is a comprehensive plan for future ZEN tokenomics as Horizen transitions from a Proof-of-Work (PoW) system to a more efficient Proof-of-Stake (PoS) model and an ecosystem optimized for ZK.

This shift is part of Horizen’s larger goal to enhance its ecosystem for decentralized applications, particularly those utilizing zero-knowledge technology. These tokenomics changes address the distribution of remaining ZEN tokens, vesting schedules, and reward systems to ensure sustainable growth, network security, and incentivization for ecosystem participants.

Key Highlights of ZenIP 42407

ZEN Allocations:

ZenIP 42407 proposes notable changes to the allocation of block rewards:

Horizen Foundation: The Foundation’s share of block rewards is proposed to increase from 20% to 32.5%. This increase will support long-term initiatives such as ecosystem development, market stability, and infrastructure growth. The additional resources aim to strengthen the Horizen Foundation’s role in driving development across the ecosystem.
DAO Treasury: The DAO Treasury is set to receive 27.5% of the total emissions. This allocation includes specific tracks like the ZEN Sustainability Initiative (17.5%), grants (5%), and growth/marketing efforts (5%) to ensure the ecosystem’s continued expansion.
Collator Rewards: The remaining 40% of the emissions is proposed to be used to fund the security budget, which is essential for compensating Collators and their Delegators under the new Proof-of-Stake model.

Emission Schedule & Vesting:
A gradual vesting schedule is proposed for the remaining $ZEN tokens. Only 25% will be released at the time of migration to Horizen 2.0, with the remaining 75% vesting monthly over 48 months. This schedule helps maintain the sustainability of the network’s resources over time.
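For illustration only, the vesting and allocation arithmetic looks like the sketch below; the remaining-supply figure is a placeholder (not a number from the proposal), and applying the emission split directly to each monthly tranche is an assumption made for the example:

```python
# Illustration only: REMAINING_ZEN is a placeholder, not a figure from ZenIP 42407,
# and applying the 32.5/27.5/40 split directly to a vesting tranche is an assumption.
REMAINING_ZEN = 3_000_000                      # hypothetical un-emitted ZEN at migration

released_at_migration = 0.25 * REMAINING_ZEN   # 25% unlocked when Horizen 2.0 launches
vesting_pool = 0.75 * REMAINING_ZEN
monthly_tranche = vesting_pool / 48            # remaining 75% vests monthly over 48 months

split = {"Horizen Foundation": 0.325, "DAO Treasury": 0.275, "Collator rewards": 0.40}

print(f"Released at migration: {released_at_migration:,.0f} ZEN")
print(f"Monthly tranche:       {monthly_tranche:,.0f} ZEN")
for name, share in split.items():
    print(f"  {name}: {share * monthly_tranche:,.0f} ZEN / month")
```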

New Way to Earn Staking Rewards:
Collators (validators) and Delegators will receive rewards through a smoothly declining emissions schedule, which preserves the max supply of 21 million $ZEN. This change is designed to create predictability and long-term participation incentives for network stakeholders.

Elimination of the Halving Schedule:
The traditional block reward halving schedule will be replaced with a gradual emissions reduction, smoothing out the supply decreases and reducing market volatility. This adjustment is expected to mitigate speculative behavior while promoting healthier long-term participation.

Why These Changes?

The motivation behind the new tokenomics is rooted in Horizen’s evolving vision.

As the platform shifts from being a privacy-focused cryptocurrency and a traditional UTXO blockchain to a robust ecosystem for decentralized apps, its tokenomics needs to reflect this new focus. By incentivizing participants through staking and ensuring a stable financial foundation for community and infrastructure growth, these changes aim to support Horizen’s leadership in the competitive ZK application space.

Impact on the Horizen Ecosystem?

The new ZEN tokenomics for Horizen 2.0 is designed to:

Enhance Network Security

Collators will take over from PoW miners, securing the Horizen 2.0 network and receiving a significant portion of $ZEN rewards.

Attract More Developers

By allocating more funds toward ecosystem development, the proposal aims to attract top-tier projects to build within Horizen’s ecosystem, ensuring a sustainable pipeline of innovative applications.

Increase Funding for Ecosystem Growth

More $ZEN will be allocated to the Horizen Foundation and DAO Treasury, ensuring we have resources to fund development, developer initiatives, and marketing efforts.

Reduce Market Speculation

A gradual emissions decline mitigates the market disruptions typically associated with halving events​.

Strengthen Governance

The inclusion of a DAO-managed treasury will empower the community to drive initiatives aligned with Horizen’s mission, enhancing decentralized governance.

Next Steps

The new ZEN tokenomics will go live with the mainnet launch of Horizen 2.0, which will be in Q1 2025. Follow Horizen on X and Discord for upcoming details.

FAQ

1. What is the max supply of $ZEN tokens proposed?
The total supply of $ZEN is capped at 21 million tokens, which will remain unchanged despite the adjustments.

2. How will the new tokenomics impact staking rewards? 
Staking rewards for Collators and Delegators will be distributed through a gradually declining emissions schedule, ensuring predictable and stable rewards over time​. 

3. Why replace the halving schedule?
The proposal replaces halving with a smoother emissions reduction to avoid market volatility and speculative behavior triggered by abrupt changes in rewards​.

4. When will the new tokenomics be implemented?

It will be implemented when Horizen 2.0 goes live on mainnet.

 

The post ZenIP 42407 Passed – New ZEN Tokenomics for Horizen 2.0 appeared first on Horizen Blog.


Nym - Medium

Scaling blockchains securely

The Nym mixnet solution to the problem of selective disclosure attacks

This report will go over an important vulnerability when it comes to ensuring the data availability of blockchains as they scale.

Languages: Русский // Türkçe // Bahasa Indonesia // 日本 // 中文 // Française // Español // Português // українська мова

Mixnets like Nym’s can help solve this problem by preventing attacks which target chain consensus.

The security problem of scaling

Blockchains are permanent and public ledgers of transactions which cannot be interfered with by third parties. For new transactions to be verified (for example, to ensure that a coin isn’t being spent twice), this full ledger (contained in blocks) must be constantly updated and checked. Therein lie two unsolved problems for blockchain technology:

How can a blockchain scale without straining the resources of verifiers who must use their own bandwidth and resources to verify transactions for the communal chain?
Do more efficient means of data verification come at the cost of chain security?

Mustafa Al-Bassam, co-founder of Celestia and a computer security researcher, diagnoses the problem in the following way: as block sizes grow, running “full nodes” for data availability verification becomes increasingly resource intensive. Committees and light nodes offer more efficient alternatives, but they can be vulnerable to a specific network attack known as a selective disclosure attack.

The Nym team has been working closely with Celestia, researching how the Nym mixnet might be able to add an anonymization layer to the blockchain verification process in order to protect against this particular attack. As we will see, this would involve a mode of Unlinkable Data Availability Sampling.

Before sketching Nym’s solution, let’s first look at the technical nature of the problem for blockchains in scaling without relying on full nodes, and how existing solutions leave blockchains open to selective disclosure attacks.

Data availability verification

Data availability refers to the guarantee that all the data in a blockchain block has been properly published and can be accessed when needed by the network. This ensures that all the necessary information related to a block is available for nodes to download and verify so the network can validate the correctness and completeness of the block’s transactions and state.

In blockchain systems, verifying data availability is crucial in preventing malicious actors from hiding or withholding parts of a block’s data while still claiming the block as valid. Without data availability, a block could be added to the chain with missing or incomplete data, potentially leading to invalid transactions, security risks, and an inconsistent state across the network.

Traditional blockchains require users (running full nodes) to verify all data by syncing the entire chain. But ensuring data availability can be challenging, especially as block sizes grow.

Techniques like data availability sampling allow light nodes to verify the availability of data without downloading the entire block, making data availability verification more efficient overall. Modern blockchains take this approach to verify data availability with fewer resources. Celestia is the leading example of using data availability sampling to power modern chains.
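To make the sampling idea concrete, here is a minimal conceptual sketch of a light node's availability check; it is not Celestia's implementation, and the share count, query count, and `fetch` interface are assumptions for illustration:

```python
import random

def sample_availability(block_shares, num_queries=15, fetch=None):
    """Conceptual DAS check: query a few random shares; reject on any missing share.

    `block_shares` is the number of shares the erasure-coded block was split into,
    and `fetch(i)` returns the share or None if it is being withheld.
    """
    for i in random.sample(range(block_shares), num_queries):
        if fetch(i) is None:
            return False          # withheld data detected; reject the block
    return True                   # all samples answered; accept availability (probabilistically)

# Toy network that withholds every fourth share
shares = {i: (None if i % 4 == 0 else f"share-{i}") for i in range(256)}
print(sample_availability(256, num_queries=15, fetch=shares.get))
```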

Mechanisms for scaling, securely and insecurely

There are several ways that data availability verification can be achieved with different levels of security:

Full Nodes (maximum security)

Full nodes verify all data by downloading everything to ensure maximum security by rejecting incomplete blocks. This baseline solution, however, becomes increasingly inefficient as block sizes grow. After all, it takes physical and financial resources to perform this verification work.

No Data Availability Guarantee (zero security)

There is no guarantee that data is available, only a commitment (like IPFS URIs). This may be sufficient for scenarios like NFTs where safety isn’t required, but it is certainly not a solution for most chain transactions with real stakes.

Data Availability Committee

Through an honest majority, a select committee guarantees that data is available, thus balancing data availability with performance.

Data Availability Committee with Crypto Economic Security

This method also faces the same problem of scaling, since committee members face growing data overheads to perform verification work. To ameliorate this, committees can also be “crypto economically incentivized,” that is, provided tokens in proportion to the verification work performed.

Committees can also be penalized (“slashed” or “halted”) if they are dishonest, thus increasing overall security. This works when the committee is part of the chain’s consensus mechanism.

Light nodes

Introducing light nodes into this framework can also allow for data availability checks, thus further reducing resource requirements. There are two ways of doing this:

Data Availability Sampling without an Honest Minority of Light Nodes: Light nodes use sampling techniques to verify data without downloading the entire block, but they can’t guarantee full data recovery if some data is missing. This relies on the data availability committee and sampling interfaces.
Data Availability Sampling with an Honest Minority of Light Nodes: If there’s a minority of honest light nodes, they can reconstruct a block if any data is withheld to enhance security. Note that a synchronous network is needed for nodes to share data effectively.

Unlinkable Data Availability Sampling

This advanced level prevents targeted attacks (like selective share disclosures, which we will delve into next) by making requests from light nodes unlinkable and uniformly random. This would require further advancements in anonymization technologies, which is where Nym’s solution comes into play.

But first, what makes these prior solutions inadequate and vulnerable exactly?

The problem: Selective disclosure attacks

A selective disclosure attack is a type of data availability attack where a malicious adversary tries to convince a node (or multiple nodes) that a block’s data is fully available when, in reality, part of it is withheld. This effectively makes the block incomplete or unrecoverable.

The attacker’s goal is to manipulate the verification process by selectively responding to queries for block data in a peer-to-peer network. In the end, it breaks consensus, forks the chain, and compromises real transactions and overall trust.

So here’s how the attack works.

Overview of the Attack

The attack has two simultaneous components:

The adversary withholds enough data shares from the block so that it cannot be reconstructed by the network, making the block unavailable.
Simultaneously, the adversary selectively responds to queries from the target light nodes so that they believe the block is available.

Mechanism of the attack

The network relies on data availability sampling (DAS) where light nodes request random samples of block data from other nodes to verify the availability.

In a selective disclosure attack, the adversary identifies a portion of the block’s data to withhold, ensuring the block cannot be reconstructed. However, the adversary selectively responds to the queries of honest nodes, such as light clients, by providing data from the shares that were not withheld. This creates the false appearance that the block is fully available.

Challenges for Honest Nodes

Since adversarial nodes are indistinguishable from honest ones and respond correctly when queried, they cannot be blacklisted unless they are detected. Honest nodes make sample requests, and the adversary’s responses appear valid because the withheld data is hidden from the specific queries of the honest nodes.

Two solutions

One suggested countermeasure is adding an anonymization layer where the source of each sample request cannot be linked to the client (light node) and requests are processed in a random order across the network. This prevents the adversary from targeting specific nodes with selective disclosure. Another approach involves ensuring each node makes a sufficient number of queries (increasing the chances of detecting withheld data), or relying on a large enough number of nodes to cover the missing shares.

Simulation results

If a client makes a small number of queries (e.g., 15), the probability of a successful attack is relatively high (~0.0133) — the adversary could trick the client after about 75 attempts. As the number of queries per client increases (e.g., to 50 queries), the probability of the attack succeeding drops dramatically (to nearly 0). Similarly, targeting more clients increases the attack’s success rate, but the probability decreases as the number of queries rises.
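As a rough back-of-the-envelope model (not the simulation referenced above), assume the adversary withholds a fraction f of the shares and answers every query that lands on an available share; a client making s independent, uniformly random queries is then fooled with probability (1 - f)^s. With an assumed withheld fraction of f = 0.25, this reproduces the quoted figures:

```python
# Rough model: the client is fooled only if all s random queries miss the withheld fraction f.
f = 0.25                      # assumed withheld fraction, chosen to match the quoted numbers

def p_fooled(s, f=f):
    return (1 - f) ** s

print(p_fooled(15))           # ~0.0134 -> succeeds roughly once per 75 attempts
print(p_fooled(50))           # ~6e-7   -> effectively never
print(1 / p_fooled(15))       # ~75 attempts on average
```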

Summary of the problem

The selective disclosure attack manipulates data availability sampling by selectively revealing data to specific nodes, convincing them that the block is fully available. Countermeasures involve using (1) anonymization techniques and (2) ensuring that clients make enough random queries to detect missing data.

Possible solutions investigated

In an “Evaluation of private networks for Celestia,” researchers analyzed different possible solutions for adding an anonymity layer to Celestia’s blockchain. Possible private network solutions included:

A Tor network overlay with Snowflake to mask Tor traffic with WebRTC and prevent eavesdropping
Mixnet integration, such as with Nym’s Loopix, for traffic anonymization
Latency tolerances that implement randomized delays in traffic to desynchronize the queries of clients
Cover traffic or dummy traffic to conceal real query patterns
The use of VPNs for additional protections

These solutions each have advantages and downsides (e.g., in terms of possible latency).

Setting aside the advantages and many problems posed by a Tor overlay, it’s important to note that many of these core solutions are network techniques (e.g. cover traffic and randomized delays) already in operation with the Nym mixnet. So the Nym core team decided to pursue an R&D investigation to see what Nym could do to help Celestia and others make a more private blockchain experience.

Nym’s Anonymous Sampling solution

Nym Technologies proposes to integrate its infrastructure with modular networks, leveraging the Nym mixnet as an anonymization layer to address selective disclosure attacks through Private Data Availability Sampling (P-DAS).

P-DAS enables requests via the Nym Mixnet, decoupling the request from the requester, which ensures that the requester cannot be targeted. This method provides a privacy-preserving, secure approach to data availability sampling, allowing nodes to verify data availability without exposure to adversarial attacks.

A Nym Mixnet integration could offer several key benefits:

An Anonymized Data Availability Sampling module compatible with the Nym mixnet, ensuring privacy-preserving data verification
Prevention of selective disclosure attacks by routing encrypted traffic through the Nym mixnet, obscuring user activity while maintaining data integrity using techniques like cover traffic, mixing, and timing obfuscation
Performance optimization through simulations to determine ideal mixnet parameters for balancing performance and security

The Mixnet integration would strengthen data security for modular networks, making them more resilient to manipulation while protecting user privacy. Nym’s research will continue its work to offer robust protection to a modular future.

Stay tuned for more updates on the project!

Join the Nym Community

Telegram // Element // Twitter

Privacy loves company

English // 中文 // Русский // Türkçe // Tiếng Việt // 日本 // Française // Español // Português // 한국인

Scaling blockchains securely was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Circle Blog

Circle's Web3 Services Now Supports Arbitrum


As a platform company, Circle is focused on enabling developers to innovate and build leading-edge applications using Circle’s Web3 Services product suite. Today, we’re excited to announce that Circle’s Web3 Services product suite now fully supports the Arbitrum One blockchain! With existing support for a wide array of blockchains — including Avalanche, Ethereum, Polygon PoS, and Solana — this exciting development solidifies a new addition to our full-stack blockchain infrastructure platform, and provides robust solutions for businesses and developers ready to build cutting-edge on-chain applications.

Wednesday, 11. September 2024

RadicalxChange(s)

Janine Leger & Timour Kosters: Co-Founders of Edge City


Join host Matt Prewitt in an inspiring conversation with Edge City co-founders Janine Leger and Timour Kosters, as they dive into the transformative world of pop-up villages and cities. Discover the story behind Edge City's latest experiment, Edge Esmeralda, and learn how temporary communities are reshaping the way we live and work. Janine and Timour share their passion for experimentation, collaboration, co-creation, and their vision for building healthier, more dynamic environments.

From the Whole Earth Catalog to the Chautauqua movement, this episode explores the rich history of pop-up communities while introducing groundbreaking ideas like community currencies ("∈dges") and iterative social technologies. Tune in for an engaging and forward-thinking discussion that reveals fresh perspectives on the future of community building, collaboration, and social innovation. Don’t miss this illuminating discussion!

Links & References: 

References:

About Edge City Edge Esmeralda Recap Why I Built Zuzalu by Vitalik Buterin | Palladium Magazine 2023: First Zuzalu Balaji Srinivasan’s on network states: The Network State Digital nomad - Wikipedia Whole Earth Catalog - Wikipedia Back-to-the-land movement - Wikipedia Burning Man - Wikipedia History of the Regional Network | Burning Man Project Michel Bauwens - Wikipedia The Seeds of The Commons: Peer-to-Peer Alternatives for Planetary Survival and Justice | Postdigital Science and Education Chautauqua - Wikipedia   What is a Chautauqua “Scenius” = Scenes of genius Scenius, or Communal Genius | WIRED Further notes on scenius - Austin Kleon RadicalxChange(s) | Barry Threw: Executive & Artistic Director of Gray Area Secret Societies, Network States, Burning Man, Zuzalu, and More - RadicalxChange Edges: A Plural Money Experiment - RadicalxChange Fork Edges here Plural Money: A New Currency Design - RadicalxChange

Bios:

Janine Leger is the co-founder of Edge City, an organization that convenes leaders and builders across tech, science, and society in pop-up villages around the globe. Previously, she co-created Zuzalu and led the Public Goods Funding team at Gitcoin.

Timour Kosters is also a co-founder of Edge City. Prior, he spent ten years building and investing in startups, including Artsy, the largest online art marketplace; Kama, a leading health-tech app; and Impact, an impact-focused social media brand. He was most recently a partner at Seed Club Ventures.

Links: 
Janine and Timour’s Social Links:

Janine Leger (@JanineLeger) / X timour kosters (@timourxyz) / X Edge City (@JoinEdgeCity) / X Edge City

Matt Prewitt (he/him) is a lawyer, technologist, and writer. He is the President of the RadicalxChange Foundation.

Matt’s Social Links:

ᴍᴀᴛᴛ ᴘʀᴇᴡɪᴛᴛ (@m_t_prewitt) / X

Connect with RadicalxChange Foundation:

RadicalxChange Website @RadxChange | Twitter RxC | YouTube RxC | Instagram RxC | LinkedIn Join the conversation on Discord.

Credits:

Produced by G. Angela Corpus. Co-Produced, Edited, Narrated, and Audio Engineered by Aaron Benavides. Executive Produced by G. Angela Corpus and Matt Prewitt. Intro/Outro music by MagnusMoone, “Wind in the Willows,” is licensed under an Attribution-NonCommercial-ShareAlike 3.0 International License (CC BY-NC-SA 3.0)

Tuesday, 10. September 2024

Greylock Partners

Deepfakes and the New Era of Social Engineering

The post Deepfakes and the New Era of Social Engineering appeared first on Greylock.

Circle Blog

Airtm builds next-gen payments & a USDC lifeline in Venezuela


Airtm is rapidly scaling into one of the world’s most innovative payment companies. Their business is based on enabling people to send and receive payments in USDC and seamlessly convert these digital dollars to local currency. This type of borderless, interoperable, dollar-based payment capability is a breakthrough in global finance, because dollars are often hard to get for individuals who live outside the US.

Monday, 09. September 2024

Greylock Partners

Congrats, WarpStream and Confluent on Joining Forces

The post Congrats, WarpStream and Confluent on Joining Forces appeared first on Greylock.

44 of the most promising AI startups of 2024

The post 44 of the most promising AI startups of 2024 appeared first on Greylock.

Saturday, 07. September 2024

a16z Podcast

Grand Challenges in Healthcare AI


Vijay Pande, founding general partner, and Julie Yoo, general partner at a16z Bio + Health, come together to discuss the grand challenges facing healthcare AI today.

They talk through the implications of AI integration in healthcare workflows, AI as a potential catalyst for value-based care, and the opportunity for innovation in clinical trials. They also talk about the AI startup they each wish would walk through the door. 

 

Resources: 

Find Vijay on Twitter: https://x.com/vijaypande

Find Julie on Twitter: https://x.com/julesyoo

Listen to more episodes from Raising Health: https://a16z.com/podcasts/raising-health/

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 06. September 2024

Epicenter Podcast

Arcium: Parallelized Confidential Computing Network - Yannik Schrade


In this day and age, privacy and confidentiality are more important than ever. Advancements in the cryptographic research of zero knowledge proofs (ZKPs), fully homomorphic encryption (FHE) and multi-party computation (MPC) paved the way for computational integrity and confidential computing. While FHE allows for computation to be performed on encrypted data without the need for prior decryption, it is MPC that enables compliance with regulations (e.g. AML). Arcium aims to build a global super computer for parallelised confidential computing, powered by custom MXEs (multi-party computation execution environments).
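The episode description contrasts FHE (computing directly on encrypted data) with MPC (several parties jointly computing without revealing their inputs to one another). As a rough, hedged illustration of the MPC idea only, here is a toy additive secret-sharing example; it is not Arcium’s protocol or MXE design, and every name in it is made up for the sketch.

```typescript
// Toy illustration of the MPC principle via additive secret sharing.
// This is NOT Arcium's actual protocol, only the basic idea that parties can
// jointly compute on inputs that no single party ever sees in the clear.
const P = 2147483647; // prime modulus for the toy scheme

function share(secret: number): [number, number] {
  const r = Math.floor(Math.random() * P);
  return [r, (((secret - r) % P) + P) % P]; // two shares; each alone looks random
}

const [a1, a2] = share(42); // Alice's private input, split into shares
const [b1, b2] = share(17); // Bob's private input, split into shares

// Each party adds only the shares it holds, never the other's raw secret.
const partial1 = (a1 + b1) % P;
const partial2 = (a2 + b2) % P;

console.log((partial1 + partial2) % P); // 59, the correct sum, reconstructed at the end
```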

Topics covered in this episode:

Yannik’s background
Confidentiality & decentralised compliance
Confidential computing
TEEs (trusted execution environments) & side-channel attacks
ZKP vs. MPC vs. FHE
Arcium’s global super computer architecture
How Arcium differentiates itself from other privacy protocols
Use cases
Censorship risks
Ecosystem development

Episode links:

Yannik Schrade on Twitter
Arcium on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Sebastien Couture & Felix Lutsch.


PIVX

PIVX Perspectives Spotlight Shines on PIVX’s OG Core Developer, Fuzzbawls.

PIVX Perspectives: Inspiring Conversations.

“Can you tell us a little about yourself please?” is what we usually ask but we decided to change it up this time and wrote our own introduction.

Fuzzbawls is PIVX’s Lead Core Developer who has been with PIVX since the beginning, over 7 years ago. As an early contributor, his technical expertise and commitment to the PIVX community have been invaluable. Fuzzbawls first got involved in crypto out of necessity, but lucky for all of us he discovered PIVX along the way. Fuzzbawls has been a constant force, driving the project’s growth and development with his collaborative spirit and unwavering dedication.

When and why did you get into cryptocurrency?

I got into crypto more out of necessity than anything else. Needed a way to secure income whilst not submitting to a corpo office job. My background in programming and build systems in particular gave me the perfect opportunity to offer freelance work.

When and why did you choose to get involved with PIVX?

I actually first noticed PIVX by chance when viewing the github activity of someone else that I had worked with before. Started out by offering a DNS based seeder service (with code PR)

What do you think sets PIVX apart from other cryptocurrencies in the market?

Cliché as it sounds, but the multitude of community members that devote their time to a vast array of different areas of focus. We really are a global “team”. And, the best part? unlike the swarm of NFT scamcoins and ICOs before them, there is no central authority that can “rug pull”.

Being with PIVX since the beginning, what moment over the course of the past 7 years stood out to you the most?

The first thing that comes to mind is the zerocoin exploit. I was out working the local voting booths when my phone started going off. One very long break later we had determined that PIVX was **not** exploited and had neutralized the threat by way of a network SPORK.

As a developer, what are some of the technical aspects or challenges you encountered while working on PIVX core wallet and how did you overcome them?

One of the more involved challenges for me personally is maintaining concurrency across a multitude of build targets. Linux, Windows, MacOS for release binaries. Also CI/Launchpad for multiple CPU architectures. To help ease this, I’ve invested in hardware to have locally that covers a broad range of these scenarios.

Finally, what advice or encouragement would you give to aspiring developers and entrepreneurs who are looking to make an impact in the cryptocurrency industry?

My advice to aspiring developers would be to shed off the archaic notion of corporate structure. We are literally living in a time where personal demonstrations and perseverance outweigh the olden times. If you see something you can contribute, just do it! Continued opportunities will likely follow.

Many thanks to Fuzzbawls for taking the time to answer my questions.

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

PIVX Perspectives Spotlight Shines on PIVX’s OG Core Developer, Fuzzbawls. was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.


Zcash Foundation

The Zcash Foundation’s Q1 2024 Report

The Zcash Foundation is committed to transparency and openness with the Zcash community and our other stakeholders. Today, we are releasing our Q1 2024 report, which provides an overview of the work undertaken by the Zcash Foundation’s engineering team, as well as an overview of other activities during this period.

The report also provides a financial update, describing our income and expenditure, with a detailed breakdown of our expenses, and a snapshot of the Foundation’s financial position, in terms of liquid assets and liabilities that must be met using those assets.

Download the Q1 2024 report here.

Our previous quarterly reports can be found here.

The post The Zcash Foundation’s Q1 2024 Report appeared first on Zcash Foundation.

Wednesday, 04. September 2024

PIVX

Join PIVX for our next #PrivacyRoundtable with our friends and hosts, Firo: Private cryptocurrency…

Join PIVX for our next #PrivacyRoundtable with our friends and hosts, Firo: Private cryptocurrency infrastructure, The Particl Project, and BasicSwapDEX

📅Date: September 25th, 2024
⏰Time: 9am EST/ 2pm UTC
🌏: XSpace event
🧑🏻‍💻 Special Guest: Luke Parker Lead Dev from SeraiDEX.

ALL past Privacy Roundtable Space Events recordings can be found here:
Privacy Roundtable Recordings

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org

Join PIVX for our next #PrivacyRoundtable with our friends and hosts, Firo: Private cryptocurrency… was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 03. September 2024

Greylock Partners

For Postscript, the Future of Marketing is AI-Powered and Delivered Via Text

The post For Postscript, the Future of Marketing is AI-Powered and Delivered Via Text appeared first on Greylock.

Nym - Medium

The Nym Dispatch: X blackout in Brazil

VPNs caught in the crosshairs in row over content regulation

Brazilian authorities last Friday ordered the blocking of X/Twitter throughout the country, affecting over 20 million X users in a country of more than 215 million people.

Languages: Русский // Türkçe // Bahasa Indonesia // 日本 // Française // Español // 中文 // Português // українська мова

And this weekend, the blackout rolled out to worldwide surprise.

Brazil’s decision is a stunning turn of events, but it did not come out of nowhere. It was the result of a concerted political and legal campaign in Brazil since 2022 to neutralize the spread of socially damaging “misinformation” online. Despite these efforts, “digital militias” attacked the capital of one of the world’s largest and youngest democracies in 2023.

For months, the X platform has been in the crosshairs. And Elon Musk, X’s CEO, has gone out of his way to make a spectacle of pushing back against Brazil’s effort.

Lurking behind Brazil’s banning of X, however, is another legal decision which prohibits the use of Virtual Private Networks (VPNs). VPNs are estimated to be used by 37% of the Brazilian population (almost 3x the number of X users in the country). Now, any Brazilian citizen caught using a VPN to access X faces a fine of almost $9,000 a day (half the average annual salary in Brazil).

In this installment of the Nym Dispatch, we will first delve into how this conflict between X and Brazilian authorities originated and recently escalated. But we will also try to complicate the prevailing narrative behind the conflict. How does a ban on VPNs work exactly, and how can this affect the privacy of and access to information for VPN users in Brazil?

This threat against VPN users may have been nothing more than a scare tactic which was quickly backtracked on, and may ultimately be completely rescinded. However, it does raise awareness about how VPN service providers might be coerced to cooperate with state authorities, democratic or not.

Brazil v. X (2024)

So far the media focus has largely pitted X’s CEO, Elon Musk, as a “defender of free speech” against “authoritarian” censorship measures being taken by Brazil’s highest court. There is some truth to both of these takes, but the situation is much messier than this narrative lets on. In the end, the focus on the conflict between two ideological egos obscures too many real questions.

Can large tech companies claim immunity from jurisdictional laws? How should these companies navigate between democratic laws vs. authoritarian ones across the world? Should protections of freedom of speech online also include hate speech, incitements to violence, state propaganda justifying military invasions of sovereign countries, or deliberate misinformation campaigns to sway democratic voters? If not, who should decide what kinds of content gets to circulate on digital platforms? What is the line between “moderation” and censorship?

These are questions that so far remain unanswered: neither by Musk’s sensationalist and largely self-serving antics, nor the wholesale censorship of a platform as widely used as X. They are problems we need to think deeply about and discuss democratically as members of a digital society that is increasingly polarized into extremes.

The addition of VPNs to the equation, however, is a clear case of unnecessary governmental overreach. It ultimately undermines Brazil’s potentially legitimate case for some content “moderation,” giving fodder to its critics’ accusations of being authoritarian.

Brazil is not wrong about the threat of political misinformation. But to be at the forefront of combating it, it need not resort to draconian measures against the privacy tools of its own citizens. Targeting ordinary people for the use of VPNs, which have many legitimate and wide use-cases, is doing nothing to help the cause of digital democracy.

But first things first: how did this whole thing get started?

A row long in the making

Alexandre de Moraes (image wikipedia)

At the center of this story is, perhaps disconcertingly, one man: Alexandre de Moraes, a Justice for Brazil’s Supreme Court and the nation’s elections chief. Moraes’ appointment and rise to power in Brazil are a complicated story, with close ties to the political right prior to the rise of Jair Bolsonaro.

As the New York Times reported in October 2022, Brazil’s Supreme Court made an unparalleled decision granting Moraes “unilateral power to order tech companies to remove many online posts and videos — one of the most aggressive actions taken by any country to combat false information.” While unusual for a democracy, it was seen as an urgent effort to curb the flood of misinformation on social media platforms in the middle of a pivotal election.

At the time, the right-wing populist Bolsonaro was seeking reelection against the returning candidacy of Luiz Inácio Lula da Silva (“Lula”). But as early as before the first round of voting, unfounded accusations of voter fraud began circulating from Bolsonaro’s camp through social media, priming supporters to reject the outcome whether it was democratic or not. As later criminal charges against Bolsonaro alleged, this was a concerted effort to contest the democratic results of what would become his demise and ultimate banning from seeking political office until 2030.

Despite Bolsonaro’s clear defeat, the trouble continued. In a reiteration of the violent reaction of Trump supporters to the 2020 election in the U.S., these online disinformation bubbles also led to the storming of the capital by Bolsonaro supporters in 2023 in an effort to reverse Lula’s inauguration.

Since then, Moraes has been given a nearly unilateral mission: to save the democratic institutions of Brazil by ridding the Brazilian web of posts and whole accounts that spread political misinformation. In addition to removing particular content deemed politically harmful, this campaign has also targeted the accounts of social media influencers, politicians, and business people.

And now access to X has gone down following an extended battle between Moraes and X regarding the latter’s refusal to comply with these requests. As Moraes’ legal decision charges X, the social media platform

“allow[s] the massive spread of disinformation, hate speech and attacks on the democratic rule of law, violating the free choice of the electorate, by keeping voters away from real and accurate information.”
The bigger picture

This weekend’s escalation is the result of a months-long spat between Moraes and X’s chief executive Elon Musk. While certain media are portraying Musk as a victim of free speech infringements, the series of events in Brazil’s legal jurisdiction complicates his public self-victimization.

First, X refused to comply with orders from Brazilian authorities to take down content and accounts, and the authorities ultimately threatened to jail X employees in response. Musk then closed X’s offices in Brazil. In reaction, Moraes summoned X to produce legal counsel in Brazil, as Brazilian law requires of foreign companies. X failed to do so despite the ultimatum. Moraes then unilaterally announced the banning of X and began coordinating with Brazilian ISPs to roll out the blackout.

Update: Four additional Supreme Court Justices have today upheld Moraes’ decision.

This is certainly not a cut-and-dried case for Musk and X, and it’s certainly not just about free speech. More than that, it’s a serious dispute over jurisdictional authority for foreign companies and the effects of information on a country, especially one emerging from the authoritarian politics of Bolsonaro.

And yet Moraes’ program is far from democratic, as many commentators have noted. He has so far maintained unilateral power to decide what content, accounts, and perspectives are permissible. While this may be politically justifiable today for one democratic society in turmoil, what does it mean for tomorrow, or elsewhere? Ultimately, what matters more than a legal decision is the precedent it sets.

And with the targeting of civilians with interdictions on the use of privacy technologies like VPNs, the issue gets messier for a country struggling to defend democratic institutions against authoritarian threats.

So let’s assume that Moraes’ threat against VPN users in Brazil is a serious one, even if it is soon, if not already, fully rescinded. How does VPN censoring work in the first place, not just in Brazil but everywhere?

How will Brazil censor VPNs?

When you use a VPN, all of your internet traffic is encrypted and routed directly from your device to the VPN’s servers via your Internet Service Provider (ISP). Keep in mind that an ISP is what provides you access to the internet in the first place, so it is always a first point of contact.

Whether you’re in Brazil or elsewhere, your ISP will see with whom you are immediately connecting, whether it’s a web service like X or an intermediary proxy like a VPN. But while using a VPN with encryption, the ISP cannot see the content of what you do, nor will they see who you are connecting with on the web. They will only see that you are connecting with the VPN.

So how can a country like Brazil enforce a law that regulates who accesses a web service like X with the use of a VPN if they can’t see the destination after the VPN proxy?

There are two possibilities:

1. Rather than blocking access to X via a VPN, Brazil orders all ISPs in the territory to block known VPN access wholesale (see the sketch after this list).
2. VPN services report a user’s X connection to the Brazilian government. In most cases, this would require the government to subpoena the VPN provider to hand over account and traffic records, if they have them, for users behind the specific connection; or the provider hands them over of its own accord under government pressure.
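To make the first possibility concrete, here is a minimal, purely illustrative sketch of ISP-side blocking of known VPN endpoints. The addresses and function names are placeholders invented for the sketch, and real-world blocking (DNS filtering, SNI inspection, deep packet inspection) is considerably more involved.

```typescript
// Illustrative only: an ISP-side filter that drops traffic to known VPN
// endpoints. Addresses are placeholders from documentation ranges.
const KNOWN_VPN_ENDPOINTS = new Set<string>(["203.0.113.10", "198.51.100.77"]);

interface Packet {
  sourceIp: string;
  destinationIp: string; // the first hop, the one thing an ISP can always see
}

// The ISP cannot read the encrypted tunnel's contents, but it can see and
// block the destination IP of the tunnel itself.
function shouldBlock(packet: Packet): boolean {
  return KNOWN_VPN_ENDPOINTS.has(packet.destinationIp);
}

console.log(shouldBlock({ sourceIp: "192.0.2.1", destinationIp: "203.0.113.10" })); // true
console.log(shouldBlock({ sourceIp: "192.0.2.1", destinationIp: "198.51.100.1" }));  // false
```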

Both of these possibilities underscore an infrastructural problem of traditional VPNs when it comes to protecting users’ privacy.

The VPN data problem

Most Virtual Private Networks, despite their name, are not really private. They are predominantly centralized infrastructures fully capable of maintaining records not only of clients’ payment information, but also the metadata of all their traffic (encrypted content excepted).

While many commercial VPNs claim to keep “no logs” or “zero logs” of user traffic, this is ultimately a matter of faith for clients. And metadata records are likely kept in any case for “operational” network purposes.

VPN data leaks have exposed the extent to which client data is kept by many VPNs, and especially free ones. Government interventions like what is happening in Brazil further show the possible cooperation between legal surveillance orders, ISPs, and complying VPN companies.

One way around this privacy dilemma is to choose a VPN that is structurally incapable of keeping records because of its decentralized design.

But this may not solve the bigger problem of the censorship of information, and of access to it via VPNs, worldwide.

VPN censoring

This is certainly not the first time that VPNs have been the target of legal interventions and censorship at the state level.

One important function of VPNs is the ability to circumvent unjust censorship laws in certain countries. Their prohibitions, it shouldn’t be forgotten, are being used to block people’s access to any information which deviates from state propaganda. Privacy messaging apps like Signal have also become targets in an effort to interfere with people’s ability to maintain private contacts with dissident communities.

And even in democratic countries, Web3 services are increasingly becoming the targets of new legal and policing strategies in the name of combating social problems like the dissemination of child pornography, narcotics, and “terrorist” recruitment. The arrest of Pavel Durov, CEO of Telegram, in France is one case in point, but there are and will be others.

The digital information problem

There is definitely something rotten in the state of social media. Information bubbles generated by algorithmic surveillance, foreign bot farms inundating markets with targeted propaganda, and increasingly hostile ways of relating to one another: this is less free speech than it is information hijacking.

What these new battles surrounding access to expression and information on social media show is how powerful, and dangerous, information can be. In one context it may provide people the means to speak out against their own social oppressions. In another, it might just add foreign fuel to fascistic and authoritarian sentiments in sovereign countries so people don’t even understand what’s at stake where they live.

But to say there is one global problem of digital “misinformation” is a dramatic oversimplification. First, states and government-sponsored media laid the groundwork for mass propaganda campaigns long before the internet. US intelligence services used to have to drop propaganda leaflets over foreign cities to try to convince people to support a government favored by the West. Now similar things are happening with lightning speed through digital technologies.

And targeting VPNs in a blanket sweep, in retaliation for the defiance of some false “free speech” messiah, gets us nowhere. Let’s have a real conversation about what it means to be citizens of the web, with our feet planted in diverse soils across the world. But doing so requires online privacy as a default.

The Nym Dispatch: X blackout in Brazil was originally published in nymtech on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 02. September 2024

Empiria

Empeiria to Showcase Innovative Data Solutions at 24 Fintech in Riyadh

As the Fintech industry in Saudi Arabia continues to surge, Empeiria is thrilled to be part of the 24 Fintech event from September 3–5 in Riyadh. We’ll be showcasing our groundbreaking End-to-End Verifiable Data Infrastructure (EVDI) at standH2.G30, offering attendees a firsthand look at how this technology is set to transform the industry.

In August, we launched our EVDI Showcase Demo, a significant milestone that allows users to explore the power of verifiable data through practical scenarios. The demo features five distinct workflows, including passwordless logins for seamless authentication and Proof of Purchase (PoP), which securely stores and verifies purchase details as Verifiable Credentials. These workflows highlight the key aspects of web3 adoption — enhancing security, ensuring privacy, and facilitating interactions with decentralized systems.

This demo offers a hands-on experience of how EVDI can help create a decentralized and trustworthy digital environment. We’re excited to bring this experience to 24Fintech, where you can see these innovations in action.

To explore the demo yourself, visit our showcase and use it with the Empe Verifiable Data Wallet, available on the Apple App Store or Google Play Store.

Join us at 24 Fintech to discover how Empeiria’s EVDI is shaping the future of Fintech. We look forward to connecting with you in Riyadh!

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io


August Development Update: EVDI Showcase Demo, Empe Blockchain Upgrade, and More

TL;DR: In August, we launched our EVDI Showcase Demo, featuring key features like passwordless logins and Proof of Purchase (PoP). We also introduced new VC issuance flows, optimized QR codes for easier credential claiming, and upgraded the Empe Blockchain with enhanced modules and security improvements.

Over the past month, we’ve made strides in our End-to-End Verifiable Data Infrastructure (EVDI) technology, all thanks to your priceless feedback and our commitment to pushing boundaries with cutting-edge R&D. Here’s what we’ve built:

EVDI Showcase Demo Launch

In August, we achieved a significant milestone with the launch of our new End-to-End Verifiable Data Infrastructure (EVDI) Showcase Demo, which provides a unique opportunity to experience the power of verifiable data in practical scenarios.

The demo features five distinct workflows designed to showcase the capabilities of our EVDI platform. Among these key features are passwordless logins, which allow for seamless user authentication without the need for traditional passwords, and Proof of Purchase (PoP), which securely stores and verifies purchase details as Verifiable Credentials.

Each workflow is crafted to illustrate crucial aspects of web3 adoption. By leveraging verifiable data, we enhance security through robust verification mechanisms, ensure privacy by empowering users with control over their personal information, and facilitate interactions with decentralized systems, thus empowering users in the digital space.

Why does this matter?

Hands-on Experience: The demo provides a hands-on experience, making it easier for users and businesses to grasp the benefits of Verifiable Credentials and drive wider adoption.

Boosting Security: Passwordless logins and Verifiable Credentials reduce password theft and fraud risk, enhancing overall digital security.

Protecting Privacy: Users retain control over their data with Verifiable Credentials, ensuring privacy and compliance with data protection laws.

Simplifying Interactions: Seamless digital experiences are facilitated through easy issuance, claiming, and verification of verifiable credentials, making user interactions smoother and more intuitive.

Driving Web3 Adoption: Real-world use cases in the demo help bridge the gap between Web3 concepts and practical applications, encouraging broader adoption.

Ensuring Compliance: Verifiable Credentials help meet stringent data privacy regulations by prioritizing user data ownership and control.

Try it out now

To access the demo, visit https://issuer-demo.empe.io/ to issue and verify credentials. Use the Empe Verifiable Data Wallet app on your mobile device to claim and present your credentials. You can download the Empe Wallet from the Apple App Store and Google Play Store now.

VC Issuance Flows & Passwordless Login with Empe Wallet

Later in the month, our development team added two new flows to the Empe Issuer Demo, with automatic VC verification and authorization, as well as manual approval for claiming Verifiable Credentials (VCs).

Why does this matter?

Enhanced Security: New flows allow for VC issuance only to verified users.
Conditional Issuance: Set specific requirements for VC issuance, ensuring only eligible users receive credentials.
Personalized VCs: Create customized VCs tailored to individual user needs.
Seamless User Experience: Passwordless login improves convenience and security.

Empe Issuer is now even more flexible and user-friendly for issuing and authenticating verifiable credentials.

Lighter QR Codes for Seamless VCs Claiming

Based on your feedback and with our ever-present goal of improving user experience, we are introducing new, lighter QR codes for easier, faster, seamless claiming of Verifiable Credentials (VCs) from Empe Issuer Demo. The new QR codes link directly to API endpoints, making them more readable for a wider range of devices.
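As a rough sketch of the idea of linking a QR code straight to an API endpoint, the snippet below uses the open-source qrcode npm package. The issuer URL is a placeholder invented for the example, not Empeiria’s actual Issuer API.

```typescript
// Sketch: encode a credential-offer API endpoint directly in a QR code.
// Uses the open-source "qrcode" npm package; the URL is a placeholder,
// not Empeiria's actual Issuer endpoint.
import QRCode from "qrcode";

async function makeOfferQr(offerId: string): Promise<string> {
  // Linking straight to a short API URL keeps the QR payload small and
  // therefore easier for a wide range of devices to scan.
  const endpoint = `https://issuer.example.com/credential-offer/${offerId}`;
  return QRCode.toDataURL(endpoint, { errorCorrectionLevel: "M" });
}

makeOfferQr("demo-123").then((dataUrl) => console.log(dataUrl.slice(0, 40) + "..."));
```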

Why does this matter?

Fewer Errors: The improved structure reduces the likelihood of scanning errors, ensuring a smoother claiming experience.
Faster Scanning: The optimized QR codes enable faster and more efficient scanning, providing a quicker and more convenient process for users.

This update further enhances the user experience and efficiency of our Empe Issuer Demo.

Chain Upgrade to Version 0.2.2

Following the recent launch of Empe Testnet and in anticipation of Empe Mainnet, we’ve successfully upgraded Empe Blockchain to version 0.2.2 with the following key changes:

Enhanced DID Repository Module: Improved data integrity with new validations and optimizations.
New Minter Module: Introduces controlled token emission with flexible minting configurations for adaptable distribution.
New Vesting Module: Manages vesting accounts with features for creating, splitting, and controlling token release.
Security Fixes: Includes critical updates to enhance overall network security.

Why does this matter?

These upgrades and new features ensure that our blockchain is even more robust and flexible. This makes it a solid foundation for our End-to-End Verifiable Data Infrastructure (EVDI).

All these latest advancements mark significant progress in improving the security, efficiency, and developer experience of our verifiable data infrastructure. We deeply value your ongoing support and insightful feedback, which are essential to our continued development. Thank you for your contributions.

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io


a16z Podcast

Governing democracy, the internet, and boardrooms

with @NoahRFeldman, @ahall_research, @rhhackett

Welcome to web3 with a16z. I'm Robert Hackett and today we have a special episode about governance in many forms — from nation states to corporate boards to internet services and beyond.

Our special guests are Noah Feldman, constitutional law scholar at Harvard who also architected the Meta oversight board (among many other things); he is also the author of several books. And our other special guest is Andy Hall, professor of political science at Stanford who is an advisor of a16z crypto research — and who also co-authored several papers and posts about web3 as a laboratory for designing and testing new political systems, including new work we'll link to in the shownotes.

Our hallway style conversation covers technologies and approaches to governance, from constitutions to crypto/ blockchains and DAOs. As such we also discuss content moderation and community standards; best practices for citizens assemblies; courts vs. legislatures; and much more where governance comes up. 

Throughout, we reference the history and evolution of democracy — from Ancient Greece to the present day — as well as examples of governance from big companies like Meta, to startups like Anthropic.

Resources for references in this episode:

On the U.S. Supreme Court case NetChoice, LLC v. Paxton (Scotusblog)
On Meta's oversight board (Oversightboard.com)
On Anthropic's long term benefit trust (Anthropic, September 2023)
On "Boaty McBoatface" winning a boat-naming poll (Guardian, April 2016)
On Athenian democracy (World History Encyclopedia, April 2018)
The Three Lives of James Madison: Genius, Partisan, President by Noah Feldman (Random House, October 2017)

A selection of recent posts and papers by Andrew Hall:

The web3 governance lab: Using DAOs to study political institutions and behavior at scale by Andrew Hall and Eliza Oak (a16z crypto, June 2024)
DAO research: A roadmap for experimenting with governance by Andrew Hall and Eliza Oak (a16z crypto, June 2024)
The effects of retroactive rewards on participating in online governance by Andrew Hall and Eliza Oak (a16z crypto, June 2024)
Lightspeed Democracy: What web3 organizations can learn from the history of governance by Andrew Hall and Porter Smith (a16z crypto, June 2023)
What Kinds of Incentives Encourage Participation in Democracy? Evidence from a Massive Online Governance Experiment by Andrew Hall and Eliza Oak (working paper, November 2023)
Bringing decentralized governance to tech platforms with Andrew Hall (a16z crypto Youtube, July 2022)
The evolution of decentralized governance with Andrew Hall (a16z crypto Youtube, July 2022)
Toppling the Internet’s Accidental Monarchs: How to Design web3 Platform Governance by Porter Smith and Andrew Hall (a16z crypto, October 2022)
Paying People to Participate in Governance by Ethan Bueno de Mesquita and Andrew Hall (a16z crypto, November 2022)

As a reminder: none of the following should be taken as tax, business, legal, or investment advice. See a16zcrypto.com/disclosures for more important information, including a link to a list of our investments.


Epicenter Podcast

Polymer: A New Era for Interoperability...on Ethereum! - Bo Du

The future is multi-chain, scalable and modular. However, while Cosmos’ IBC set the standard for interoperability, Ethereum’s L2 shift revealed a huge problem of liquidity fragmentation across the many rollups fighting for market share. Polymer aims to bridge the two ecosystems and bring the best of both worlds: Ethereum’s native liquidity and Cosmos’ interoperability, through a modular framework using the OP-stack and IBC.

Topics covered in this episode:

Bo’s background and the evolution of Polymer
Scaling limits of L1s vs. L2s
The ‘endgame’ for rollup frameworks
IBC
Interoperability & network topology: 70’s/80’s vs. blockchains
Polymer hub
Monomer framework
Pre-confirmation & finality trade-offs
Cross-rollup interoperability
Building appchains with Monomer
Modularity

Episode links:

Bo Du on Twitter
Polymer Labs on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Sebastien Couture.

Friday, 30. August 2024

XRSI

MedXRSI Announces Strategic Partnership with the Journal of Medical Extended Reality (JMedXR)  

Elevating Industry Practices and Community Insights in Medical XR through Shared Expertise

San Francisco (USA), August 30th, 2024 – MedXRSI is thrilled to announce a significant partnership with the Journal of Medical Extended Reality (JMedXR), a leading publication from Mary Ann Liebert, Inc. at the forefront of Medical XR research and innovation. MedXRSI, also host of the Medical XR Advisory Council, is a critical program of the world’s leading standards-developing organization in emerging and immersive technologies, X Reality Safety Intelligence (XRSI). This collaboration marks a milestone in advancing the safety, security, privacy, ethics, and overall efficacy of immersive technologies within the healthcare sector.

As part of this collaboration, MedXRSI will work closely with the highly regarded Editor-in-Chief and Guidelines Editor of JMedXR to publish groundbreaking medical healthcare research and safety guidelines specific to the Medical XR domain. This joint effort aims to set a new standard for safety in the Medical XR ecosystem, ensuring that practitioners, researchers, and developers alike can operate with robust guardrails that prioritize patient safety and ethical practices.

Development of Comprehensive Safety Guidelines

MedXRSI is committed to developing safety guidelines for the immersive healthcare industry. Through this collaboration, it will create a comprehensive series of safety guidelines for the Medical XR ecosystem. These guidelines will provide the medical community with essential resources to navigate the complex landscape of immersive technologies safely and effectively.

Joint Research Initiatives and Policy Advocacy

The collaboration goes beyond just publishing safety guidelines; it also provides opportunities to pursue joint research projects that focus on the safety and ethics of immersive technologies. MedXRSI and JMedXR are committed to advancing the Medical XR ecosystem through thorough research and creative solutions that tackle the distinctive challenges of this rapidly evolving field.

Moreover, XRSI and the journal’s owners will collaborate on data governance, safety advocacy, policy, and awareness efforts to shape the future of Medical XR. In addition to developing and publishing joint position papers and statements, the two organizations will work closely to ensure critical voices and experts can easily leverage their platforms to promote standards and best practices.

Community Partnership and Industry Awareness through Metaverse Safety Week

As part of this collaboration, JMedXR will actively participate in the upcoming Metaverse Safety Week (MSW) 2024, held from December 10th to 15th. With the theme of “Embrace Change,” this year’s campaign promotes safe and responsible practices within emerging technologies. JMedXR will participate in the discussions focused on balancing innovation and shared responsibility in open systems as part of the “Immersive Healthcare” day on December 11, 2024.

Educational Workshops and Training Sessions

Education and awareness are critical components of MedXRSI’s mission. As part of these collaborative efforts, MedXRSI and JMedXR may jointly develop educational workshops or training sessions to promote best practices in immersive healthcare. These sessions will provide valuable insights and training to professionals and stakeholders within the Medical XR community, further strengthening the safety and efficacy of immersive healthcare platforms and applications.

A Unified Vision for the Future

This collaboration between MedXRSI and JMedXR represents a significant step forward in pursuing safer and more effective Medical XR applications. By combining expertise and resources, both organizations are poised to contribute substantially to the field, benefiting patients and healthcare providers worldwide.

If you are interested in helping MedXRSI in these efforts, contact us via this Get Involved form. 

For more information about MedXRSI and its mission, visit the website. To learn more about the JMedXR and its contributions to the field, explore its official publication page.

WEBSITE AND SOCIAL MEDIA

For more information about XRSI’s mission and initiatives, visit MedXRSI’s Who We Are.

Website: https://medical.xrsi.org/  | Twitter: @XRSafetyMedical | LinkedIn: The Medical XR Advisory Council (MedXRSI)

ENDS

For any inquiry or more information, please contact Julia Scott, MedXRSI Executive Lead:

julia@xrsi.org | medical@xrsi.org 

The post MedXRSI Announces Strategic Partnership with the Journal of Medical Extended Reality (JMedXR)   appeared first on X Reality Safety Intelligence (XRSI).


Brave Browser

Introducing cross-chain swaps on Brave Wallet

Announcing native bridging support in Brave Wallet. This allows users to transfer assets from one blockchain to another with the familiar user experience of swaps.

Today we’re excited to announce native bridging support in Brave Wallet. This allows users to transfer assets from one blockchain to another with the familiar user experience of swaps.

The rise of multi-chain ecosystems

The blockchain landscape has evolved dramatically, with hundreds of active, public blockchains and L2 solutions with an aggregate TVL exceeding $80b. This highlights the need for seamless interoperability.

Bridges act as connectors, allowing users to move assets across chains, and to access DeFi opportunities across multiple ecosystems. Without bridges, users would be limited to the functionalities and liquidity of a single network, limiting interoperability and broader use cases of blockchains. Beyond merely moving funds across different blockchain networks, bridges are particularly important when considering the privacy implications of using centralized exchanges for onboarding to L2 networks. By utilizing bridges, users can avoid exposing their transaction data to centralized entities, maintaining a higher degree of anonymity and control over their financial activities across multiple chains.

LI.FI: powering Brave Wallet’s bridging functionality

Bridging functionality in Brave Wallet is powered by LI.FI, an API solution that allows us to find the best execution price for any swap/bridge intent across all major DEX aggregators and bridges.

Brave Wallet currently supports all major EVM chains including Ethereum, BNB Smart Chain, Arbitrum, Base, Polygon, Optimism, and zkSync Era. We also allow bridging between EVM chains and Solana for supported pairs. Brave Wallet facilitates cross-chain, any-to-any swaps in a single transaction, which reduces the number of steps it takes to move funds across chains. Brave does not add fees to swaps or bridges performed within Brave Wallet.
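For readers curious what a request to such an aggregation API might look like, here is a hedged sketch of fetching a cross-chain quote from LI.FI’s public quote endpoint. The endpoint and parameter names are assumptions based on LI.FI’s public documentation, and this is not how Brave Wallet is implemented internally.

```typescript
// Sketch: request a cross-chain route quote from LI.FI's public API.
// Endpoint and parameter names are assumptions based on LI.FI's public docs;
// this is not Brave Wallet's internal implementation.
async function getBridgeQuote(): Promise<void> {
  const params = new URLSearchParams({
    fromChain: "ETH",      // Ethereum mainnet
    toChain: "ARB",        // Arbitrum
    fromToken: "USDC",
    toToken: "USDC",
    fromAmount: "1000000", // 1 USDC (6 decimals)
    fromAddress: "0x0000000000000000000000000000000000000000", // placeholder
  });

  const res = await fetch(`https://li.quest/v1/quote?${params.toString()}`);
  if (!res.ok) throw new Error(`Quote request failed: ${res.status}`);

  const quote = await res.json();
  // A quote typically carries an estimated output amount and a transaction
  // request that a wallet could present to the user for signing.
  console.log(quote.estimate?.toAmount, quote.transactionRequest?.to);
}

getBridgeQuote().catch(console.error);
```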

As we design Brave Wallet, we aim to find a balance between offering sensible defaults and giving users as much control and choice as possible, without sacrificing the user experience. We’ve therefore designed the Brave Wallet bridging feature to let users pick the route of their choice, to focus on various parameters like output amount, execution speed, and more. Route selection is currently available in version 1.70 in the Beta channel.

An example of bridge route selection in Brave Wallet.

Enhanced security with Safe Sign

Brave Wallet users may already be familiar with the Safe Sign feature, which allows them to verify the actual intent of a swap transaction before approving. Safe Sign leverages our improved EVM ABI decoder in Brave Core to bring safety and transparency to the transaction approval process. We are happy to share that all bridge transactions powered by LI.FI come with the same transparent signing experience in Brave Wallet.
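The general idea behind this kind of decoding can be sketched with ethers.js: parse the calldata against a known ABI and render the result in plain language. This is only an illustration of ABI decoding in general, assuming a standard ERC-20 transfer ABI; it is not Brave Core’s native decoder.

```typescript
// Sketch: decode EVM calldata to surface a transaction's intent before the
// user signs. General illustration of ABI decoding, not Brave Core's decoder.
import { Interface, formatUnits } from "ethers"; // ethers v6

const erc20 = new Interface(["function transfer(address to, uint256 amount)"]);

function describeCalldata(data: string): string {
  const parsed = erc20.parseTransaction({ data });
  if (parsed?.name === "transfer") {
    const [to, amount] = parsed.args;
    return `Send ${formatUnits(amount, 18)} tokens to ${to}`;
  }
  return "Unknown action: review carefully before signing.";
}

// Example calldata for transfer(0x1111..., 1 token), built here for the demo.
const data = erc20.encodeFunctionData("transfer", [
  "0x1111111111111111111111111111111111111111",
  10n ** 18n,
]);
console.log(describeCalldata(data)); // e.g. "Send 1.0 tokens to 0x1111...1111"
```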

Bridging as the next step in Brave Wallet utility

With the growing adoption of L2s and the increasing complexity of the multi-chain ecosystem, bridging has become an essential component of the Web3 user experience. Brave Wallet’s native bridging support represents a significant step forward in making cross-chain interactions more accessible, secure, and user-friendly for Brave Wallet users.

We encourage Brave Wallet users to explore these new bridging capabilities, and to provide feedback as we continue to refine and expand our offerings to other bridge providers and networks.

Learn more about Brave Wallet, or click in the address bar (desktop) or settings menu (mobile) to get started.

Thursday, 29. August 2024

Panther Protocol

Could Panther Protocol enable private crypto donations?

Even when using open and permissionless blockchains, some things are best left private

Polarizing views

Surveys like the Edelman Trust Barometer reveal increasing polarization, particularly on social and political issues, with many participants expressing reluctance to collaborate with those who hold opposing views. At the same time, as crypto donations gain popularity, so does donor traceability.

Whether donating to a social cause, charity, or other non-profit, donors may fear backlash or unwanted attention for supporting certain causes. Real-life examples underscore the risks associated with traceable donations. Chainalysis has tracked donations back to donors' wallets linked to events like the January 6th Capitol Riot, the Canadian "Freedom Convoy" and high-profile alt-right organizations. While these cases involve highly controversial donations made using cryptocurrency, they also highlight the potential for donor identities to be exposed, potentially leading to serious repercussions. 

The problems with “private” donations today

Many high-profile charities have adopted technology that enables donors to contribute using cryptocurrencies. Organizations such as Habitat for Humanity, United Way, Oxfam, and Doctors Without Borders now accept crypto donations; however, these solutions often rely on centralized third parties, such as Bitpay or Silentdonor. This reliance introduces not only the risks associated with centralization but also the potential disconnect between the donation and the donor, which can pose challenges for donor stewardship and require charities to find alternative methods to capture donor information for relationship-building. Additionally, inefficiencies can arise when crypto is automatically converted to fiat currency before the donation is processed. It's also important to note that donors who wish to claim tax benefits for their charitable contributions must maintain accurate records of their donations. 

This blog article explores whether Panther Protocol could solve this problem.

Could Panther Protocol preserve the privacy of donors?

Panther Protocol is being developed to enable privacy-enhanced on-chain transactions, without compromising compliance enablement. Panther Protocol is set to accomplish this by supporting licensed digital asset service providers to operate trading zones (Zones) where rules can be established to help Zone Managers align their Zones with the regulatory requirements of the jurisdictions in which they operate.

In its initial releases, Panther Protocol will support use cases tailored to financial service providers, such as swaps. However, the same technology will also address the privacy concerns surrounding charitable donations, allowing donors to support causes without fear of exposure or backlash while safeguarding their identities and financial information. Below, we explore how this innovative approach offers a solution to the challenges of donation privacy.

How Panther preserves user privacy for financial transactions

Users will be able to create zAccounts to access Panther's Shielded Pool. Supported crypto-assets will be secured in a Vault smart contract, and an equivalent zAsset will be issued. These zAssets will be 1:1 collateralized "shielded" versions of the original tokens, usable within the Panther dApp. 

These features preserve user privacy through the use of zk-SNARKs, a form of zero-knowledge proof, which allow transactions to be validated without revealing their details. Users will deposit cryptoassets into Shielded Pools and receive zAssets, privacy-enhanced versions of the original tokens. These assets will be usable across multiple blockchains while maintaining confidentiality. For more in-depth information about how Panther Protocol works, please see here.

Panther Protocol for Charities?

Panther Protocol has the potential to integrate into non-profits’ tech stacks. Each Zone within the Protocol can be customized with specific rules that adhere to the regulatory requirements of the jurisdiction in which it operates. For example, in the case of a charity, a Zone could be configured to disclose information only to law enforcement while preserving donor privacy. By doing so, charities can establish rules and compliance measures that satisfy regulatory requirements and ensure transparency.

Applying Zones to donations

An entity operating as a Zone Manager in Panther Protocol can facilitate private donations by setting up a Zone within Panther's Multi-Asset Shielded Pool (MASP or Shielded Pool). Donors would deposit cryptoassets into the Zone, where the recipient receives donations. In return, donors would receive zAssets—collateralized, privacy-enhanced versions of the original assets—that preserve the privacy of the donor’s identity and transaction details. The donor would then transfer the zAssets to the charity, which could convert them back to their original form (non-zAsset) and withdraw to a digital wallet, if desired. We explain how, below. 

Panther Protocol could be a viable solution for charities looking to accept privacy-enhanced donations. The charity and the donor would each register for a zAccount using the onboarding process of PureFi, Panther Protocol’s KYC partner.

Having connected their respective wallets, the donor would deposit the asset(s) intended for donation into the Panther Vault, receiving, in exchange, a fully-collateralized zAsset that represents the digital asset being donated (e.g. zEth). At this point, the zAssets will reside within Panther’s Shielded Pool, which enables the transfer of digital assets to other zAccount holders (i.e. the charity recipient of the donation). Within the Shielded Pool, a set of smart contracts, safeguarded by cutting-edge cryptographic techniques, will preserve the privacy of pool users. 

The donor can then initiate a transfer of ownership of the donated assets, with the transaction taking place within a Zone, managed by a VASP or other licensed entity operator. Once transferred, the charity recipient can choose to either keep the assets in the Shielded Pool, or withdraw the zAssets back to their original digital asset form using a similar process to the donor’s deposit. 
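To summarize the flow just described, here is a minimal, pseudocode-style sketch. Panther Protocol is still in development, so every type and function name below is hypothetical and illustrative, not Panther’s actual SDK or contract interface.

```typescript
// Minimal, illustrative model of the donation flow described above.
// All names here are hypothetical; this is NOT Panther's actual API.
type ZAsset = { symbol: string; amount: bigint }; // e.g. zETH, 1:1 collateralized

interface ShieldedPool {
  deposit(asset: string, amount: bigint): ZAsset;     // lock in Vault, mint zAsset
  transfer(zAsset: ZAsset, toZAccount: string): void; // private transfer within a Zone
  withdraw(zAsset: ZAsset, toWallet: string): void;   // burn zAsset, release original
}

function donate(pool: ShieldedPool, charityZAccount: string): void {
  const zEth = pool.deposit("ETH", 10n ** 18n); // donor shields 1 ETH
  pool.transfer(zEth, charityZAccount);         // donation details stay confidential
  // The charity can keep the zAsset shielded, or later withdraw it, e.g.:
  // pool.withdraw(zEth, "0xCharityWallet...");
}
```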

When transacting using Panther Protocol, parties will receive rewards for depositing and sending assets. Rewards are received in Panther Rewards Points. The charity recipient, upon withdrawal, would pay fees to the Protocol. Similar to how some charities encourage donors to cover PayPal or credit card transaction fees, charities using Panther Protocol to accept privacy-enhanced donations may also encourage donors to contribute a small amount to offset any costs incurred by the platform.



For future iterations of the protocol, it is worth noting that Panther Protocol’s Zones are scalable and modular. A charity could, in theory, utilize different Zones to receive donations across various jurisdictions, each with its own set of rules tailored to meet the specific regulatory requirements of that region.

Privacy requirements for charities

Beyond addressing the lack of privacy on open blockchains, Panther Protocol’s Zones could, in the future, help uphold privacy obligations that charities have towards their donors’ information. Privacy legislation in many countries, including the European Union, Australia, Canada, the UK and more, requires that donor information should not be used or disclosed for purposes other than those for which it was collected, except with the donor’s consent or as required by law. Panther Protocol’s advanced cryptography will help to keep this information safe. 

Conclusion

Panther Protocol offers a promising solution to the challenges of private crypto donations in a polarized world. By leveraging Panther’s privacy-preserving features, charities will be able to ensure that donor identities and transaction details remain confidential, protecting them from potential backlash or unwanted attention. 


Sequoia

Partnering with Bridge: A Better Way to Move Money

Zach, Sean and their team are building critical infrastructure to support stablecoin payments.

By Shaun Maguire and Josephine Chen Published August 29, 2024 Bridge co-founders Zach Abrams and Sean Yu.

If you own a U.S.-based business serving primarily U.S.-based customers, you may not think much about stablecoins—yet. But in many parts of the world—in unstable economies, and where inflation is unchecked—moving money is more difficult, and the transformative power of this technology is already becoming clear. Where traditional payment rails are too difficult, slow, and expensive, some 30 million active users are now moving $3.2 trillion in stablecoins every month. And with industry giants including Stripe rolling out new payment options for these assets, those numbers are growing fast.

But from cryptocurrencies to credit cards, each new medium for moving money requires new infrastructure to support it. At Sequoia, we’ve been fortunate to partner with leaders in fintech including PayPal, Block, Stripe, Nubank and Klarna, and we’ve been looking intently for the founders who will usher in the next wave of payment innovation.

In Bridge co-founders Zach Abrams and Sean Yu, that’s exactly what we found.

Zach, Sean and their team are making it possible for developers to seamlessly and instantly convert between any two dollar formats, with a single API. A company in Brazil can use Bridge to send USDC payments to their supplier in China, a consumer in Nigeria can pay for YouTube or ChatGPT, and a small business in the U.S. can take payments in PYUSD from customers around the world. Because Bridge is built on blockchains, it works 24/7, in virtually every country—and for as little as 10% of the cost of traditional foreign exchange rails.

We loved the idea—and we loved the team. Zach and Sean are longtime co-founders, and longtime members of the Sequoia family, as well; their first company together, Evenly, was acquired by Block (then Square) in 2013. Zach went on to lead consumer products at Coinbase and Brex, while Sean became an early engineer and culture-bearer at DoorDash before joining Airbnb. As you might imagine, we had plenty of references on these two, and they described Zach and Sean as visionary, beloved and deeply knowledgeable on what it takes to move money at scale.

In addition to their insight into the stablecoin landscape and their passion for the problem they’re solving, we are impressed by Bridge’s maturity around regulation. While many companies in their space have adversarial relationships with government, Bridge not only complies with all U.S. and European regulations but lists the U.S. State Department and U.S. Treasury among their customers!

Since partnering, we’ve also been impressed by the sheer diversity of Bridge’s client list, which ranges from rocket companies to aid organizations to global fintechs. The applications of this platform seem to be endless—and the growth has been exponential, with payment volume crossing $5 billion annualized. 

Yet even those rapid gains only scratch the surface of what we believe Zach, Sean and their team can accomplish, and we are grateful for the opportunity to partner with Bridge and lead their Series A. As adoption of stablecoins continues to grow, Bridge is building the foundation of this new core payment rail and making sure financial transactions will be faster, easier, more affordable and more accessible for people around the world.

The post Partnering with Bridge: A Better Way to Move Money appeared first on Sequoia Capital.


Zcash Foundation

Zebra 1.9.0 Release

The Zcash Foundation is pleased to announce the release of Zebra 1.9.0. This release includes the necessary changes to activate NU6 on Testnet as well as a number of additional NU6 updates and other improvements.

To support NU6 activation on Testnet, this release of Zebra adds the NU6 network upgrade variant, minimum protocol versions for NU6, current protocol version, and Testnet activation height. It also supports configuring an NU6 activation height and configurable funding streams on Regtest and custom Testnets as well as the implementation of all of the ZIPs relating to future funding streams after the current devfund expires at the next halving. Finally, it updates Zebra’s EoS (End of Support) to ensure that it will not run ahead of the expected activation height of NU6 on Mainnet. This means that there will be at least one other Zebra release before NU6 activates on Mainnet.

This release also adds a new zebra-scanner binary, along with functionality that lets another process access Zebra’s best chain state and that notifies clients when Zebra’s best tip changes. All of this work is in support of a replacement for the zcashd built-in wallet and gets us closer to enabling zcashd deprecation.

You can see a full copy of all of the included changes in the v1.9.0 Release Notes on GitHub.

The post Zebra 1.9.0 Release appeared first on Zcash Foundation.

Wednesday, 28. August 2024

a16z Podcast

It’s Time to Build in Healthcare

Half of prescribed medications are never taken, and 88% of Americans are metabolically unhealthy. Despite spending 20% of our GDP on healthcare—twice that of any other developed nation—our outcomes still lag behind.

In this episode, we explore why technologists must step into the healthcare ring. Solving medicine isn’t enough; we need to make healthcare a consumer-focused industry. a16z’s Vijay Pande and Daisy Wolf discuss the rise of tech founders in healthcare, the potential of AI to transform patient care, and how this shift could lead to the next trillion-dollar company.

Can tech truly disrupt this complex, regulated industry?

 

Resources: 

Read the article ‘It’s Time to Build in Healthcare’: https://a16z.com/hey-tech-its-time-to-build-in-healthcare/

Find Vijay on Twitter: https://x.com/vijaypande

Find Daisy on Twitter: https://x.com/daisydwolf

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Zcash

New Release 5.10.0

zcashd 5.10.0 is a maintenance and security bugfix release, including support for NU6 on Testnet. A further upgrade will be needed for NU6 on Mainnet. All zcashd users should upgrade […]

Source

Monday, 26. August 2024

Horizen - Blog

Mainnet Node Software Upgrade: ZEN 5.0.4 is Available to Download

The new version ZEN 5.0.4 is available to download on GitHub and via Docker.

DOWNLOAD ZEN 5.0.4 NOW

ZEN 5.0.4 is the official release for Mainnet and Testnet. The previous versions, ZEN 5.0.2 and ZEN 5.0.3, are going to deprecate on Mainnet at block #1627572, on September 12th, 2024 at approximately 4:00 PM UTC. Please update to ZEN 5.0.4 before September 12th, 2024. ZEN 5.0.4 will not perform any network upgrade on Mainnet via hard fork. Nodes running on Mainnet and Public Testnet should be updated with this version.

See release notes Here

Please let us know if you have any questions or need further support by contacting us on our Discord.

The post Mainnet Node Software Upgrade: ZEN 5.0.4 is Available to Download appeared first on Horizen Blog.


Brave Browser

Chrome is Entrenching Third-Party Cookies For Some Sites In A Way That Will Predictably, Inevitably Mislead Users

Related Website Sets is a user-hostile weakening of the Web's privacy model, plainly designed to benefit websites and advertisers, to the detriment of user privacy.
Summary

This post presents research on the privacy harms and risks of Google’s recent Related Website Sets feature, to be presented at the 2024 Internet Measurement Conference. The research finds both that the Related Website Sets feature would reverse some of the privacy benefits of deprecating third-party cookies, and that Google’s justification for reintroducing this privacy harm (i.e., that Web users can tell when two different sites are run by the same organization) is untrue for many, potentially most, Web users. The study supports other browsers’ decision to reject the feature because of its privacy risks, and highlights the risk Related Website Sets poses to Chrome users.

The study was conducted by researchers at University of St Andrews, Imperial College London, Hong Kong University of Science & Technology (GZ), and Brave Software. This post was written by Principal Privacy Researcher Peter Snyder.

Background: Related Website Sets and Third-Party Cookies

Related Website Sets (RWS) is a recent Chrome feature, proposed by Google in anticipation of the end of third-party cookies. The privacy and security harms caused by third-party cookies are well documented, and have led every major Web browser to either block third-party cookies, or announce plans to do so (even if Google has, again, pushed its planned deprecation date back).

According to Google, Related Website Sets reenables third-party-cookie-like-behavior where it benefits users, without reintroducing the broader privacy harms of third-party cookies. In reality, RWS aims to allow (for example) Google to link the videos you watch on YouTube to your Google profile, even when you’re not logged into YouTube, and even after third-party cookies have been deprecated in Chrome. While the research described in this post presents and evaluates Google’s stated motivations with RWS, the core truth is that RWS exists for advertiser-serving situations like the above. 

The broad idea behind RWS is that if two different sites are run by the same organization (for example, instagram.com and facebook.com are both run by Meta), then there is no need for the browser to block third-party cookies between the two sites, since the user already expects that both sites will share information with each other.
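
To make that concrete, here is a minimal sketch of what "being in the same set" means in practice. The entry format and field names are assumptions loosely modeled on the public Related Website Sets list, not Chrome's exact schema, and the example set is hypothetical.

```python
# Hypothetical sketch: an RWS-style entry groups a "primary" site with its
# "associated" sites; field names are assumptions, not Chrome's exact schema.
EXAMPLE_SETS = [
    {
        "primary": "https://facebook.com",
        "associatedSites": ["https://instagram.com", "https://whatsapp.com"],
    },
]

def same_set(site_a: str, site_b: str, sets=EXAMPLE_SETS) -> bool:
    """Return True if both sites appear in the same declared set,
    i.e. the browser would relax third-party cookie blocking between them."""
    for entry in sets:
        members = {entry["primary"], *entry.get("associatedSites", [])}
        if site_a in members and site_b in members:
            return True
    return False

print(same_set("https://facebook.com", "https://instagram.com"))  # True
print(same_set("https://facebook.com", "https://example.com"))    # False
```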

More casually, the motivation behind RWS is something like this: there’s no point in telling your mom a secret, and then trying to keep that secret from your dad; you should assume your parents are going to share everything with each other.

RWS is a user-hostile weakening of the Web’s privacy model, plainly designed to benefit websites and advertisers, to the detriment of user privacy. Google argues that RWS actually benefits users, either because the privacy exceptions help fix “site compatibility issues” or to keep users “signed in” across related domains. But a quick look at the actual Related Website Sets exceptions list reveals many examples unrelated to even these (even hypothetically) user-benefiting use cases, and these sites work correctly in browsers that do not implement RWS (i.e., almost all other browsers).

In reality, the primary motivation behind Related Website Sets is as frustrating as it is unsurprising: to benefit advertisers to the detriment of users (or, as Google euphemistically says, to “show you personalized content”). As with so many other user-harmful and needlessly-complex choices in Chrome’s overarching “Privacy Sandbox” proposal, RWS exists to make sure Chrome continues to serve advertisers’ needs first, even once Google has been shamed into (finally) deprecating third-party cookies.

Study Description: Users (Understandably) Do Not Anticipate Site Relations

The study considered RWS impact on Web privacy by testing whether the underlying assumption in RWS is correct: can Web users accurately determine if two different sites are related to each other? More specifically, if a Web user is presented with two different websites, how accurately are they able to decide whether the two sites are related to each other, given the existing site-relationships defined by Chrome’s RWS list.

In general, we found that Web users cannot accurately determine if two sites are related to each other (as determined by the Related Website Sets feature). We conducted a user study with 30 Web users, recruited over social media, and presented them each with 20 pairs of websites. Website pairs were randomly selected from both the Related Website Sets list (i.e., sites Google designates as “related”, and so warranting reduced privacy protections), and the Tranco list of popular websites. Each user was presented with different pairs of websites, asked to view the sites, and then decide if they thought the two sites were operated by the same organization. This resulted in 430 determinations of whether unique pairs of websites were related (some of the 30 users did not provide an answer to all of the 20 website pairs they were presented with).
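
As a rough illustration of the study design described above (not the researchers' actual code), the sketch below mixes pairs drawn from the RWS list with pairs of popular-but-unrelated sites and scores a participant's guesses against the list; the site names and list contents are placeholders.

```python
import random

# Placeholder data: pairs the RWS list marks as related, plus popular sites
# assumed unrelated for this sketch (a real study would check the full list).
RELATED_PAIRS = [("hindustantimes.com", "healthshots.com"),
                 ("vwo.com", "wingify.com")]
POPULAR_SITES = ["indiatoday.in", "timesofindia.com", "wikipedia.org", "bbc.com"]

def sample_pairs(n: int):
    """Build n (site_a, site_b, is_related) tuples, alternating sources."""
    pairs = []
    for i in range(n):
        if i % 2 == 0:
            a, b = random.choice(RELATED_PAIRS)
            pairs.append((a, b, True))
        else:
            a, b = random.sample(POPULAR_SITES, 2)
            pairs.append((a, b, False))
    return pairs

def error_rate(pairs, guesses):
    """Fraction of a participant's related/unrelated guesses that were wrong."""
    wrong = sum(1 for (_, _, truth), guess in zip(pairs, guesses) if guess != truth)
    return wrong / len(pairs)

session = sample_pairs(20)                               # one participant, 20 pairs
guesses = [random.choice([True, False]) for _ in session]
print(f"error rate: {error_rate(session, guesses):.0%}")
```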

We found that users’ expectations for which sites were related often didn’t match the Related Website Sets list, and as a result, the RWS feature re-enables third-party cookie-like behavior in many cases users could not anticipate. In our study, the large majority of users (~73%) made at least one incorrect determination of whether two sites were related to each other, and almost half (~42%) of the determinations made during the study (i.e., all determinations from all users) were incorrect. Most concerning, of the cases where both sites were related (according to the RWS feature), users guessed that the sites were unrelated ~37% of the time, meaning that users would have thought Chrome was protecting them when it was not.

We conclude from this that the premise underlying RWS is fundamentally incorrect; Web users are (understandably, predictably) not able to accurately determine whether two sites are owned by the same organization. And as a result, RWS is reintroducing exactly the kinds of privacy harms that third-party cookies cause.

Lest anyone judge the study participants for being uninformed, or not taking the study seriously, consider for yourself: which of the following pairs of sites are related?

hindustantimes.com and healthshots.com

vwo.com and wingify.com

economictimes.com and cricbuzz.com

indiatoday.in and timesofindia.com

Keep in mind, a user needs to determine whether two domains are related before clicking on a link; once a site has been loaded, any information sharing and tracking has already occurred.

In conclusion, we find that RWS will be harmful to user privacy, and reintroduce the kinds of privacy harms the Web has been moving away from by removing third-party cookies. The full paper will be presented at the 2024 Internet Measurement Conference.

(For the above quiz, if you chose “4”, then, unfortunately, that is incorrect. It is in fact the only pair of the four that isn’t considered “related”.)

Beyond the Study: Additional Privacy Harms from Related Website Sets

However, beyond the findings from the user study, we note a more fundamental privacy harm with RWS. RWS rests on the idea that if two sites are related to each other, then it’s harmless (or, at least “acceptable”) for the browser to reduce privacy protections between those two sites. Or, to go back to the previous analogy, if mom already knows something, then there’s no harm in telling dad; dad is going to find out regardless.

This assumption is wrong; modern Web browsers are perfectly capable of preventing (say) Meta from knowing that your Facebook account and your Instagram account are owned by the same person if you register them with different email addresses and information. In fact, this is the default behavior of most Web browsers today, both browsers aimed at a general audience (e.g., Brave, Firefox, Safari) and browsers targeting specialized audiences (e.g., Tor Browser, Icefox). Unless you use the same credentials to register an account on two different sites, modern browsers can absolutely prevent two sites operated by the same organization from linking your behaviors across those sites. Or, in other words, modern Web browsers can absolutely prevent Mom from telling Dad your secrets.

Finally, we acknowledge that some companies do try to circumvent the privacy protections in Web browsers, to try and allow two sites run by the same organization to link your accounts across sites. Some sites use techniques like link decoration or bounce tracking to try and continue tracking you. But the difference here between privacy respecting browsers (which include link decoration and bounce tracking protections) and Chrome (which is explicitly designed to allow cross-site linkage) is damning: some browsers are experimenting with techniques to prevent organizations from tracking you across sites, and some browsers are designing features with the explicit intent of allowing such tracking.

Conclusions

In conclusion, our study finds that RWS is harmful for Web privacy, and in three ways:

First, RWS assumes users can anticipate which sites are related to each other, but in practice users cannot.

Second, RWS introduces privacy harm even before users have the opportunity to decide if two sites are operated by the same organization; by the time users can view a webpage to try and decide if two sites are related to each other, the privacy harm has already occurred, and sites have had the opportunity to track the user across site boundaries.

And third, RWS entrenches a privacy-harmful assumption in the Web platform, instead of working to excise it. RWS assumes that if two sites are owned by the same organization, then the organization should be allowed to track you across those two sites. In contrast, privacy respecting browsers have gone in the opposite direction, and tried to prevent all sites from tracking you, regardless of what organization owns them.

Additional Concerns with Related Website Sets The Web has Rejected Related Website Sets

Although Related Website Sets is being presented as a general Web proposal, the truth is that most of the Web has already considered and rejected it. Most browsers, including Brave, Firefox, and Safari, have publicly stated that they believe Related Website Sets (previously called First-Party Sets) is bad for users, and bad for the Web. The proposal has been removed from the W3C Privacy Community Group and is no longer being considered by any privacy-focused group in the W3C.

When Websites Change Hands

What happens if / when the domains in the list change hands? This is a common concern with all sorts of “pin trust to a domain” proposals across the history of the Web. Just because domains A, B, and C are operated by the same organization today does not (at all) guarantee that they’ll be owned by the same organization tomorrow.

Security and privacy attacks from exactly these kinds of assumptions have happened with browser extensions that have been sold from “trustworthy” parties to malicious parties, or when popular software libraries / dependencies have been taken over by a malicious actor.

The broader concern is that, even if these sites are meaningfully related at the time they’re included in the list, there is no mechanism that will remove them when they (often silently) change hands.

Language / Perception Concerns

As mentioned above, the underlying justification (as flimsy as it is) for RWS is that users can perceive when two sites are operated by the same organization. Our study finds that, even for English-speaking users evaluating English-language sites, users can’t anticipate which sites Google judges to be related. This problem will (of course) get much worse when people visit sites in languages they do not speak.

Timing

The intuition behind RWS is that users will be able to determine if site B is related to site A, and then only visit site B if that arrangement is acceptable. However, this is a catch-22. In order to determine if site B is related to site A, I need to visit site B and see the “shared branding or logo” (or similar) indicating the relationship between these sites. But once I’ve loaded the site to view it, it’s already too late, and my information has been shared between the two sites.

Saturday, 24. August 2024

a16z Podcast

Latin America: A Tech Powerhouse?

Latin America is emerging as a tech powerhouse, but it's not a one-size-fits-all market. 

In this episode, we explore why what works in Argentina won’t necessarily fly in Brazil or Mexico, and how companies are adapting to these unique regional dynamics. Join Dileep Thazhmon, Cofounder and CEO of Jeeves; Santiago Suarez, Cofounder and CEO of Addi; Gabriel Vasquez, a16z investment partner; and Angela Strange, a16z General Partner, as they discuss the future of fintech in LatAm and the unique approach required to succeed in this diverse market.

​​Whether you're interested in the nuances of product development, the complexities of scaling across diverse markets, or the future of fintech in Latin America, this episode offers perspectives from industry leaders deeply invested in the region's tech ecosystem who believe the next big tech giants might just come from Latin America.

Resources: 

Find Dileep on Twitter: https://x.com/thazhmon

Find Santiago on Twitter: https://x.com/santiasua

Find Gabriel on Twitter: https://x.com/gevs94

Find Angela on Twitter: https://x.com/astrange

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 23. August 2024

Epicenter Podcast

Zeal: The Day-to-Day Crypto Wallet for EVM Chains - Hannes Graah

Self-custodial wallets often represent the first point of contact for crypto users that venture away from centralised exchanges. As a result, their security and user experience should be paramount. This largely explains MetaMask’s first-mover advantage and users’ reluctance to change. However, as infrastructure evolves, new wallets are equipped from the get-go with features that are designed for normie adoption, such as: account abstraction, gas fee abstraction, easier on- and off-ramps, etc. Zeal was envisioned as a day-to-day wallet solution, allowing users true freedom to transact, both on-chain, as well as off-chain (i.e. real-world spending).

Topics covered in this episode:

Hannes’ background
Existing wallet solutions and Zeal’s userbase
Wallet UX, account abstraction and passkeys
Security assumptions and passkey recovery
Smart contract interactions
Zeal’s mobile-first focus
IBAN & Gnosis Pay integrations
Capturing market share
Banking the unbanked
Gas fees
Cross-chain interoperability
DeFi & staking
Monetisation
Privacy
Zeal’s backend
Growth & business scaling in Web3
Zealot recruitment

Episode links:

Hannes Graah on Twitter
Hannes Graah on LinkedIn
Zeal on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at - gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at - chorus.one

This episode is hosted by Friederike Ernst.


Panther Protocol

Privacy and Safety: Why you should value your anonymity

A recent article by Wired underscores the alarming rise in physical threats and violence used to coerce individuals into transferring their valuable digital assets to criminal accounts. When malicious actors can link a digital wallet to its owner, high-net-worth individuals become prime targets for theft, market manipulation, and identity fraud. In the most severe cases, this can lead to blackmail, extortion, home invasions, or worse.

The visibility inherent in public blockchain ledgers is a significant risk factor, as it exposes users to such dangers by making it easy to trace and associate financial activity with specific individuals. Privacy is crucial not only for security but also for individual safety. 

This article delves into how bad actors can use open blockchains to identify and target victims and how on-chain privacy solutions like Panther Protocol can mitigate risk.

How can bad actors identify potential victims? 

In most cases, bad actors require access to the password or keys to the digital assets to achieve their goals. Typically, these criminal activities involve identifying a target or target wallet, gaining access to its credentials, and then transferring the assets to their own wallet or account. According to Chainalysis, one-third of DeFi hacks involve off-chain activities, using off-chain data to access this sensitive information.

Blockchain cybersecurity firm Halborn identifies the most common methods of compromising private keys as phishing, malware, weak passwords, insecure key storage, weak key generation, social engineering and cloud storage breaches. While some compromised keys have come from mass attacks (“spray and pray”), where a large number of potential victims are hacked indiscriminately with a low probability of success, targeted attacks (such as spear phishing) have a much higher probability of success. Open transactions may leave users more vulnerable to these types of attacks, allowing hackers to target a particular wallet or set of wallets in an attempt to obtain the keys.

When you make a cryptocurrency payment at a café or any public venue, your transaction details, including the send addresses, can potentially be observed and linked to your other digital wallets. Panther Protocol's Shielded Pool will mitigate this risk by enabling you to transact using zAssets—private, mirror tokens that conceal the true origin of the underlying assets.

By using zAssets for your transactions, the public visibility of your send address will be obfuscated, making it difficult for anyone to link the transaction back to your other wallets. This will effectively shield your wallet addresses from observers, preventing them from tracing back to your broader collection of wallets and other sensitive financial information.

Decoupling identity from wallet addresses

Unique identifiers like .Eth domains or NFT profile pictures can inadvertently expose your on-chain identity, linking your social presence to your financial activities and making you vulnerable to unwanted tracking and privacy invasion.

Panther Protocol is designed to allow you to interact with DeFi platforms using zAssets, ensuring that your on-chain transactions will be conducted privately. Even if you use a public-facing identity, such as a .Eth domain, transactions made in Panther’s Shielded Pool will not reveal your wallet’s holdings or activities. Additionally, Panther Zones are intended to ensure that any identity-related actions you take remain private and secure.

Preventing blockchain analysis and clustering

Blockchain analysis can reveal patterns in your transaction history, potentially exposing your identity or linking multiple wallets under your control. This information could be exploited by bad actors to target your assets through various phishing or hacking methods.

Transactions conducted through Panther Protocol’s zTrade and zSwap functionalities are designed to be private and unlinkable. By leveraging ZKPs, these transactions will be shielded from blockchain analysis tools. The unique architecture of the Shielded Pool, which utilizes append-only Merkle trees, is designed to ensure that each transaction is recorded privately, without disclosing its details or history. This will make it difficult for attackers to use clustering algorithms to trace multiple addresses back to a single entity. Additionally, as more users transact via Panther Protocol, the anonymity set will grow. The increasing number of unique deposit wallets and UTXOs will exponentially complicate the efforts of clustering algorithms to identify the source of each transaction.

When a bad actor connects you to any of the wallets you frequently engage with, they can potentially discover other wallets you own by analyzing transaction patterns in your data trail. If a wallet regularly interacts with known exchange addresses, an attacker may link these transactions to external data, such as exchange account information, which could reveal the wallet owner’s identity.

To uncover wallet owners’ identities, malicious actors could combine blockchain data with other information sources. For example, if a wallet regularly interacts with exchange addresses, an attacker could use exchange records acquired through data leaks or breaches, which often contain personal identification details, to identify the owner.

Transaction patterns, such as recurring payments or transfers to specific addresses, can reveal personal or business relationships. This information can be cross-referenced with publicly available data or information obtained through the various methods described above.

Clustering algorithms can be used by law enforcement to identify bad actors. Still, they can equally be used by bad actors to group related addresses that a single entity may control. By integrating blockchain data with metadata (such as IP addresses) and off-chain information, attackers can more accurately identify high-value targets, matching them with identities that are collected offline. 
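
To illustrate the kind of clustering referred to above, here is a toy version of the common-input-ownership heuristic used by many chain-analysis tools: addresses that co-spend inputs in the same transaction are grouped under one presumed owner. The transactions and addresses are made up, and real tools layer many more heuristics and off-chain data on top of this.

```python
from collections import defaultdict

class UnionFind:
    """Tiny disjoint-set structure for grouping addresses."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

# Made-up transactions: the heuristic assumes every address that co-spends
# inputs in a single transaction belongs to the same entity.
transactions = [
    {"inputs": ["addr1", "addr2"], "outputs": ["addr9"]},
    {"inputs": ["addr2", "addr3"], "outputs": ["exchange_hot_wallet"]},
]

uf = UnionFind()
for tx in transactions:
    first, *rest = tx["inputs"]
    for addr in rest:
        uf.union(first, addr)

clusters = defaultdict(set)
for addr in list(uf.parent):
    clusters[uf.find(addr)].add(addr)
print(list(clusters.values()))  # addr1, addr2 and addr3 end up in one cluster
```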

Defending Against Dusting Attacks

Dusting attacks involve sending small amounts of cryptocurrency to a wallet in order to analyze its transaction patterns and potentially uncover the identity behind it. By tracking how this dust moves, attackers can gain insights into the wallet owner’s activities.
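
A tiny, illustrative sketch of how a wallet-side monitor might flag such dust is shown below; the threshold, amounts, and field names are all assumptions, not any particular wallet's behavior.

```python
# Illustrative only: flag unsolicited deposits below an assumed dust threshold,
# since moving them later is what lets an attacker link addresses together.
DUST_THRESHOLD = 0.00001  # assumed cutoff, in the asset's whole-coin units

incoming = [
    {"txid": "a1", "amount": 0.000002, "sender": "unknown"},
    {"txid": "b2", "amount": 1.5,      "sender": "known-exchange"},
]

for tx in incoming:
    if tx["amount"] < DUST_THRESHOLD:
        print(f"Possible dusting attempt: {tx['txid']} ({tx['amount']} received)")
```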

With Panther Protocol, even if your wallet is targeted by a dusting attack, any subsequent transactions using zAssets will occur within its Shielded Pool, which is designed to prevent the dust from revealing any meaningful information. Panther’s privacy-enhancing technology will ensure that the dust cannot be traced through the Panther system, as transactions within the Shielded Pool are shielded and do not disclose the movement of underlying assets or include deposits of unwanted assets.

Securing Against Social Engineering and Off-Chain Attacks

Bad actors may also use more traditional scams like phishing, smishing, and social engineering to link wallets to their owners. Other scams involve real-world interactions, such as SIM-swapping attacks, hacking into your email, or using dark-web-sourced email addresses, passwords, or KYC data to access personal information.

Many of the most effective attacks on digital assets occur off-chain, and these methods typically rely on gathering sufficient information about a target to gain access to their wallets and keys.

Although Panther Protocol is primarily designed to enhance on-chain privacy, the protection it will offer is intended to extend indirectly to off-chain attacks. By keeping your on-chain activities private and untraceable, Panther aims to significantly reduce the data available for attackers to exploit in off-chain schemes. After all, how can attackers attempt to socially engineer their way into your crypto assets if they don't even know you possess them?

Thursday, 22. August 2024

Zcash

Released: Currency conversion, transparent history, and TEX addresses in Zashi

Today’s release represents another baby step toward our goal of making Zashi an easy-to-use, all-in-one user interface for securely storing, spending, and sending ZEC.  We’re excited to deliver currency conversion […]

Source


PIVX

Here We Grow Again! New Listing.

PIVX is happy to announce that it is now listed on one of the world’s biggest instant exchangers, Changelly.

Exchange any crypto instantly, easily, securely on Changelly.com

This is a massive development for PIVX and a true sign of what’s to come, more building on the way!

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org

Here We Grow Again! New Listing. was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.


Panther Protocol

A Perspective on Private Asset Velocity Using Zero-Knowledge Proofs

Abstract

This article examines Panther Protocol, an upcoming decentralized application on the Polygon network that leverages zero-knowledge proof (ZKP) technology within a Multi-Asset Shielded Pool (MASP) to facilitate private asset transactions. Inspired by Muhammad Yusuf's "Diving Into Dark Pools," this article explores Panther Protocol's privacy mechanisms, regulatory considerations involving Virtual Asset Service Providers (VASPs) and the innovative concept of Panther Zones. Additionally, it examines the protocol's compliance strategies in the context of the SEC's regulatory framework and highlights the benefits for licensed entities.

Introduction

In the evolving landscape of decentralized finance (DeFi), Panther Protocol is positioned to preserve privacy and enhance security. Drawing parallels to the traditional dark pools discussed in a recent Delphi Digital report, “Diving Into Dark Pools,” Panther Protocol is set to offer a sophisticated solution for private asset transactions. By integrating cutting-edge privacy-enhancing technologies with robust compliance-enabling mechanisms around KYC/KYB and data management, Panther Protocol ensures that users, particularly licensed entities, can transact with confidence and confidentiality.

Traditional Market Comparison

The traditional market for dark pools in equities and equity options is substantial, with billions of dollars in daily trading volume. These dark pools allow institutional investors to execute large orders without causing significant market impact. However, this structure still relies on centralized operators who may potentially misbehave or leak confidential information outside of the tape print times. In contrast, Panther Protocol's Multi-Asset Shielded Pool (MASP) and its zTrade functionality will elevate this concept by incorporating zero-knowledge proofs and decentralized smart contracts. This approach not only maintains the anonymity of participants but also eliminates the need for trust in a central operator. In comparison, the market for on-chain dark pools in crypto is still nascent but holds tremendous potential.

Traditional dark pools in equities handle billions in daily trading volume, providing a significant liquidity pool for institutional investors. Crypto dark pools are emerging, with protocols like Panther Protocol paving the way for secure, private, and compliant trading environments. The potential growth of on-chain dark pools can mirror the success of traditional markets, offering a new value play for institutional and retail investors alike.

Multi-Asset Shielded Pool (MASP / Shielded Pool)

At the heart of Panther Protocol's envisioned privacy capabilities lies the MASP. This innovative pool will allow users to deposit various underlying assets, including WETH, USDC, USDT, WBTC, and other ERC-20 tokens. Upon depositing, users will receive private tokens known as zAssets (e.g., zWETH, zUSDC, zUSDT, zWBTC), offering an additional layer of privacy.

The MASP is more accurately described as a collection of “append-only” Merkle trees, where each leaf represents a commitment to a UTXO, serving as an IOU of the asset deposited. This architecture ensures that the state of the blockchain remains confidential, enhancing both privacy and security. The growing number of deposits within this anonymity set further ensures that individual transactions remain indistinguishable within the pool, akin to the privacy provided by traditional dark pools.

Merkle trees, a key component in blockchain privacy architecture, play a significant role in Panther Protocol's functionality. Within the MASP, Merkle trees are used to store and manage zAsset transactions. Each leaf of the Merkle tree represents a commitment to a specific UTXO transaction, and these commitments are updated as transactions occur. This structure not only enhances privacy but also ensures the integrity and security of the transaction data.
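
The data structure itself is simple to sketch. Below is a minimal, self-contained illustration of an append-only Merkle tree whose leaves are commitments to UTXOs; it shows the shape of the idea only and is not Panther Protocol's actual circuits or contract code, and the hash-based commitment scheme is a stand-in.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def commit(asset: str, amount: int, owner_pubkey: bytes, blinding: bytes) -> bytes:
    """Commitment to a shielded UTXO; the preimage (asset, amount, owner) stays private."""
    return h(asset.encode(), amount.to_bytes(16, "big"), owner_pubkey, blinding)

class AppendOnlyMerkleTree:
    def __init__(self):
        self.leaves: list[bytes] = []

    def append(self, leaf: bytes) -> int:
        """Appending never modifies earlier leaves; returns the new leaf's position."""
        self.leaves.append(leaf)
        return len(self.leaves) - 1

    def root(self) -> bytes:
        level = self.leaves or [b"\x00" * 32]       # placeholder root for an empty tree
        while len(level) > 1:
            if len(level) % 2:                       # duplicate the last node on odd levels
                level = level + [level[-1]]
            level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

tree = AppendOnlyMerkleTree()
leaf = commit("zUSDC", 100, b"owner-pubkey", b"random-blinding")
tree.append(leaf)
print(tree.root().hex())
```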

zTrade

The zTrade feature within Panther Protocol will enable private, on-chain, over-the-counter (OTC) transactions within the MASP. This functionality is designed to ensure both confidentiality and security while mitigating the risk of MEV exploitation, as the swap occurs outside of DEXs.

The process begins when a user, referred to as the maker, locks a specified amount of zAsset (Token A) in a zTrade smart contract. The taker, or counterparty, accepts the order after the protocol verifies the balance of Token B. Once the balance is confirmed, the trade is executed via an atomic swap, ensuring a simultaneous and secure exchange of assets.

Avoidance of MEV is a critical component of zTrade. By utilizing zero-knowledge proofs and executing trades directly between makers and takers, zTrade mitigates the risk of MEV exploitation, ensuring that trades are conducted fairly and privately. This approach will mirror the confidentiality of traditional dark pools, where large financial moves are shielded from public view to prevent market manipulation. 
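
As a rough sketch of the maker/taker flow described above, expressed as plain Python state rather than a real smart contract or ZK circuit, the settlement step below checks both legs before moving anything, so the exchange is all-or-nothing; all names, assets, and amounts are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    maker: str
    give_asset: str    # e.g. "zWETH"
    give_amount: int
    want_asset: str    # e.g. "zUSDC"
    want_amount: int

@dataclass
class Ledger:
    balances: dict = field(default_factory=dict)   # (user, asset) -> amount

    def get(self, user, asset):
        return self.balances.get((user, asset), 0)

    def move(self, user, asset, delta):
        self.balances[(user, asset)] = self.get(user, asset) + delta

def settle(order: Order, taker: str, ledger: Ledger) -> None:
    """Verify both legs first, then apply both: the swap is all-or-nothing."""
    if ledger.get(order.maker, order.give_asset) < order.give_amount:
        raise ValueError("maker has not locked enough of the give asset")
    if ledger.get(taker, order.want_asset) < order.want_amount:
        raise ValueError("taker balance too low")
    ledger.move(order.maker, order.give_asset, -order.give_amount)
    ledger.move(taker, order.want_asset, -order.want_amount)
    ledger.move(taker, order.give_asset, order.give_amount)
    ledger.move(order.maker, order.want_asset, order.want_amount)

ledger = Ledger()
ledger.move("maker", "zWETH", 10)
ledger.move("taker", "zUSDC", 30_000)
settle(Order("maker", "zWETH", 10, "zUSDC", 30_000), "taker", ledger)
print(ledger.balances)
```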

zSwap

For transactions that extend beyond the MASP, Panther Protocol will employ zSwap. This feature will use DeFi adaptors to extend private asset velocity through on-chain decentralized exchanges (DEXs) such as Uniswap, Quickswap, and Curve. The process begins when the user selects the swap action and chooses the currencies for conversion. The DeFi aggregator smart contract would then generate a quote, valid for a short duration, reflecting the exchange value and transaction details. Upon user approval, the transaction would be initiated through the DeFi aggregator, which utilizes stealth addresses to maintain privacy. Once the transaction is approved and finalized, the new balance would be updated in the user’s wallet.

Panther Protocol is designed to offer more than just dark pool functionality; it will enable a wide range of DeFi activities to be performed privately. By depositing assets into the MASP, users will be able to engage in various DeFi activities through plugins called DeFi Adaptors, which will connect the MASP with existing DeFi protocols in a private manner. This approach will ensure that transaction details remain confidential throughout the process, preserving user privacy even when interacting with external DeFi protocols.


SEC's Regulatory Framework and Compliance

Muhammad Yusuf's discussion in "Diving Into Dark Pools" highlights the importance of regulatory compliance, particularly in the context of the SEC's oversight. Panther Protocol addresses these concerns through the integration of compliance providers for KYB procedures and Zone management. By allowing regulated entities to manage Zones and enforce compliance, users can maintain compliance at their discretion, with zero-knowledge proofs validating KYC statements without revealing user data.

Regulatory Compliance and Panther Zones

In alignment with regulatory frameworks, Panther Protocol aims to integrate VASP-regulated entities to create an environment that enables and supports internal compliance and policies. Users can utilize an Ethereum wallet, but must verify their personal identification to obtain a credential for interacting with the protocol. Business entities undergo KYB verification, sharing their business details with a registered compliance provider via Panther Protocol integration. This verification process is conducted off-chain, and credentials must be periodically renewed through re-verification on the KYB/KYC provider side.

Panther Zones are designed to introduce a novel approach to regulatory compliance within the space of decentralised protocols solving on-chain privacy. These specialized trading areas will be managed by regulated entities known as Zone Managers, who will be responsible for whitelisting traders and assets to ensure compliance with regulatory standards. Each Zone is intended to operate under its own regulatory framework, providing a flexible and compliant trading environment.

Benefits for Licensed Entities

Licensed entities such as institutional investors, hedge funds, and family offices can benefit from Panther Protocol's compliance infrastructure. For instance, an institutional investor aiming to execute substantial trades without disclosing their strategies will be able to leverage Panther Protocol to conduct these transactions privately. The use of zAssets as a mechanism of private transactions ensures that their trades do not affect market prices, safeguarding their strategies from being front-run by competitors.

Similarly, hedge funds can leverage the MASP and Panther Zones to conduct large-scale transactions while adhering to regulatory requirements. By utilizing VASP-regulated entities, these funds can ensure that all transactions are KYC/AML compliant, reducing the risk of regulatory scrutiny and enhancing the security of their trades.

Family offices, managing substantial private wealth, can use Panther Protocol to diversify their investments in a secure and private manner. The ability to operate within Panther Zones ensures that these entities can maintain compliance with varying regulatory requirements while benefiting from the privacy and security offered by the protocol.

Conclusion

Panther Protocol's innovative use of zero-knowledge proofs and Multi-Asset Shielded Pool technology positions it as a leader in private asset transactions within decentralized finance. By incorporating regulatory compliance through VASP-regulated entities, Panther Zones, and protection against MEV, Panther Protocol ensures that privacy, security, and legality coexist harmoniously.

Wednesday, 21. August 2024

Zcash Foundation

Welcoming a New Chief Communications Officer

We are delighted to announce that Elise Hamdon will be joining the Zcash Foundation as our Chief Communications Officer. Elise will oversee all facets of our communications strategy and we are confident that her leadership will enhance our efforts to promote transparency, foster community trust, and drive the adoption of Zcash.

Elise brings a wealth of experience and expertise in Zcash communications, having proven herself invaluable during her part-time work with the Foundation over the past two years. She also brings additional insights from her time at the Electric Coin Company, where she served as Communications Manager during 2018 and 2019. Most recently she has been advancing cryptocurrency education for youth as Executive Director of Mass Adoption Alliance, which was established with support from a ZF grant.

Elise’s deep understanding of Zcash, the broader privacy and cryptocurrency landscape, and her proven track record in strategic communications position her to make immediate contributions to our strategy. Her diverse work experience extends beyond Zcash, including roles with the World Bank and Australian government, as well as cryptocurrency-related positions at Casa and 21 Cryptos Magazine. Earlier this year, Elise spearheaded the My First Zcash project in collaboration with the Zcash community, and she will continue to advance this important educational initiative.

Please join us in welcoming Elise to the Zcash Foundation. We are excited to have her on board in this capacity and look forward to the positive impact she will have on our organization and the broader Zcash community.

The post Welcoming a New Chief Communications Officer appeared first on Zcash Foundation.


Brave Browser

Brave brings HTTPS by Default to iOS

This is the twenty-ninth post in an ongoing series describing new privacy features in Brave. This post describes work done by iOS Privacy Engineer Jacob Sikorski and was written by Shivan Kaul Sahib, Lead for Privacy Engineering.

Starting with version 1.68, Brave will become the first iOS Web browser to try to upgrade all sites to HTTPS by default. When you click or enter an insecure link like http://example.com, Brave will automatically redirect to its secure version, https://example.com. Using HTTPS is crucial to prevent Internet service providers (ISPs) and attackers from snooping on your browsing activity.

This update brings our existing “HTTPS by Default” feature to iOS, and represents a significant advancement beyond the earlier list-based approach that Brave used (and other iOS browsers still use). Previously, a site was only upgraded to HTTPS if its URL was included on specific lists, such as the once-useful but now-deprecated HTTPS Everywhere list. With this change, Brave will now do the opposite: all sites will be upgraded to be secure by default, and the only scenarios in which a site would not be upgraded are if the site appears on a much smaller exception list, or if the upgrade fails. This change guarantees that even new websites not yet on upgrade lists will receive a secure connection by default, a significant win for Web privacy.

As always, Brave prioritizes privacy-first defaults, advocating for a secure Web for all users. Just like on desktop and Android, users will also be able to select an optional Strict mode for an additional warning before the connection falls back to HTTP. You can read more about how “HTTPS by Default” works and why it’s important for your privacy in our previous blog post.
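
For readers who want the behavior spelled out, here is a simplified sketch of the upgrade-then-fallback logic in Python using the requests library; it mirrors the description above but is not Brave's implementation, and the `strict` flag stands in for the optional Strict mode.

```python
import requests

def fetch_with_https_upgrade(url: str, strict: bool = False, timeout: float = 10.0):
    """Try the HTTPS version of an http:// URL first; fall back only in standard mode."""
    if url.startswith("http://"):
        upgraded = "https://" + url[len("http://"):]
        try:
            return requests.get(upgraded, timeout=timeout)
        except requests.exceptions.RequestException:
            if strict:
                # Strict-mode stand-in: surface the failure instead of silently falling back.
                raise
            # Standard mode: fall back to the original insecure URL.
            return requests.get(url, timeout=timeout)
    return requests.get(url, timeout=timeout)

resp = fetch_with_https_upgrade("http://example.com")
print(resp.url)  # https://example.com/ when the upgrade succeeds
```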

The “HTTPS by Default” feature will soon roll out to iOS users through the App Store. Once you have the latest version, you can try out “HTTPS by Default” by visiting a site (such as http://example.com) on your iOS device and watching it be auto-upgraded to HTTPS.

Monday, 19. August 2024

PIVX

New Listing for PIVX.

PIVX is excited to announce it is now listed on BitMart Exchange!

You can “Buy, trade, and hold 1500+ crypto instantly”

BitMart is a global cryptocurrency exchange platform that allows users to trade various digital assets. It is known for providing a wide range of cryptocurrencies for trading, including popular ones like Bitcoin, Ethereum, and many altcoins.

BitMart also offers features such as spot trading, futures trading, OTC trading services, and more.

To learn more please visit Bitmart.com

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org

New Listing for PIVX. was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 18. August 2024

a16z Podcast

The SSN Breach: What Now?

In this episode, we cover the recent data breach of nearly 3B records, including a significant number of Social Security numbers. Joining us to discuss are security experts Joel de la Garza and Naftali Harris. Incredibly enough, Naftali and his team got their hands on the breached dataset and were able to validate the nature of the claims. Listen in as we explore the who, what, when, where, why… but also how a breach of this magnitude happens and what we can do about it.

Resources:

Read 16 Steps to Securing Your Data (and Life)
Find Naftali on Twitter: https://x.com/naftaliharris
Check out Sentilink: https://www.sentilink.com/

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Saturday, 17. August 2024

Epicenter Podcast

Alkimiya: Blockspace, the Digital Real Estate - Leo Zhang

As the crypto industry matures, more sophisticated market participants become involved. As a result, risk management and hedging will evolve to levels seen in TradFi. However, regardless of the preferred market activity, when interacting with a blockchain, every actor competes for the same limited blockspace. Based on demand levels, the cost for securing that blockspace can fluctuate (in the form of miner fees for Bitcoin transactions, or gas fees for PoS blockchains). Therefore, a particular niche of power users could lower these costs by reserving blockspace in advance of elevated demand levels. Alkimiya set out to build just that - a marketplace for blockspace.

Topics covered in this episode:

Leo’s background
The vision behind Alkimiya
Market participants for blockspace
Hedging costs
Use cases on Ethereum
Proposer-Builder Separation and preconfirmations
Reserving blockspace
Polkadot’s blockspace allocation
Current market sentiment: ETH vs. SOL

Episode links:

Leo Zhang on Twitter
Alkimiya on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at - gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at - chorus.one

This episode is hosted by Brian Fabian Crain.

Wednesday, 14. August 2024

Zcash Foundation

ZF says Farewell (but not goodbye) to Dan

Today the Zcash Foundation bids farewell to Dan Wolande, better known as decentralistdan amongst the Zcash community. Dan joined us in October 2021 as Ecosystems Relations Manager and, during his tenure at ZF, he was embraced by the Zcash community, who recognized his enthusiasm and deep commitment to Zcash. One of Dan’s primary responsibilities was fostering relationships within the Zcash ecosystem, convening conversations, and ensuring that diverse voices were heard. Dan excelled in this role of facilitating productive conversations and collaborations, notably Zcon planning and execution, ZCG and grantee support, and countless community calls including the Arborist and Light Client Working Group.

Dan’s departure is bittersweet for ZF because, while he will no longer be part of our team, we are thrilled that he will continue to be actively involved in the ecosystem in a different capacity at the Electric Coin Company. 

On behalf of Zcash Foundation and the broader ecosystem, we thank Dan for advancing Zcash conversations, look forward to continued collaboration, and we wish him the best of luck! 

The post ZF says Farewell (but not goodbye) to Dan appeared first on Zcash Foundation.


a16z Podcast

Building Innovation Hubs: The UK & Beyond

This episode, from Web3 with a16z Crypto, is all about innovation on a global scale, exploring both the ecosystem and individual talent levels. We examine what works and what doesn’t, how certain regions evolve into startup hubs and economic powerhouses, and what constitutes entrepreneurial talent. We also discuss the nature of ambition, the journey to finding one’s path, and broader mindsets for navigating risk, reward, and dynamism across various regions, with a particular focus on London and Europe.

Joining us is Matt Clifford, who has played a pivotal role in the London entrepreneurial and tech ecosystem since 2011 and is the Chair of Entrepreneur First and the UK’s Advanced Research and Invention Agency (ARIA). Before this episode was recorded, Matt served as the Prime Minister’s representative for the AI Safety Summit at Bletchley Park. Recently, he was appointed by the UK Secretary of Science to deliver an “AI Opportunities Action Plan” to the UK government.

This episode was recorded live from Andreessen Horowitz’s first international office in London. For more on our efforts and additional content, visit a16zcrypto.com/uk.

 

Resources:

Find Matthew on Twitter: https://x.com/matthewclifford

Find Sonal on Twitter: https://x.com/smc90

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Brave Browser

BLaDE: Brave’s Performance Evaluation Testbed

Introducing BLaDE: Brave's open-source testbed for automated mobile performance evaluation. This system accurately measures device metrics while simulating user actions, enhancing mobile app assessment.

This post describes work done by Kleomenis Katevas, Stefanos Laskaridis, Aravindh Raman, Mihai Plesa, and Hamed Haddadi. This post was written by Senior Machine Learning Researcher Kleomenis Katevas.

In today’s fast-paced market, continuous evaluation of product releases is essential. Companies must regularly assess their products to stay competitive and meet evolving customer expectations. This process also provides crucial insights when comparing a product with competitors. However, there are significant challenges.

Automating cross-device mobile app activities—including shipping new features, simulating Web browsing user activity across different browsers and evaluating on-device machine learning models—is challenging, primarily due to inherent operating system (OS) restrictions. Moreover, accurately profiling the power consumption of these activities is difficult because of unreliable OS estimates and often lower sampling rates (number of samples per second).

To address the above challenges, we introduced BLaDE (BatteryLab Device Evaluations) as a successor to BatteryLab 1 2 (originally released in 2019), which serves as Brave’s open-source solution for automated performance evaluations on mobile devices. Supporting both Android and iOS, BLaDE is capable of recording power consumption, CPU utilization, device temperature, and network traffic (bandwidth) at a precise and configurable frequency, while simulating user actions using different types of automation methods. By open-sourcing the system design and code base, we enable any developer or researcher to replicate this setup and conduct comprehensive performance evaluations on their own mobile devices.

BLaDE System Design

Our system is designed with the following key hardware components:

Main controller (Raspberry Pi 4, 8GB): The central device coordinator (internally known as BLaDE Runner) that’s responsible for managing and orchestrating all system activities.

Mac controller (Mac Mini, M2, 8-core): A Mac-based machine to support iOS-based automation through Appium, and to build iOS apps through Xcode.

Power monitor (Monsoon HV): A critical component for measuring the power consumption of the connected devices at a maximum frequency of 5 kHz.

Programmable AC switch (Keene KPS1): Utilized to control the power supply to the power monitor, enabling automated power cycling to the device.

Programmable relay board (Yizhet 5V, 8 Ch.): Utilized for controlling the power distribution from the power monitor to the devices, allowing programmatic selection of which device is powered on.

Programmable USB hub (YKUSH 3): These hubs allow selective powering on/off of USB connections, providing isolated power control for each port.

IR-thermometer (MLX90614): Deployed for continuous monitoring of device temperature.

LCD display (Waveshare LCD1602): A notification unit that reports the system state (device being monitored, real-time power consumption, etc).

Mobile devices: A variety of iOS (iPhone 14 Pro, SE 2022) and Android (Galaxy S23, Pixel 6a) devices, configured to be compatible with our BLaDE infrastructure.

Surveillance camera (Anker PowerConf C200): Employed to remotely monitor the status of evaluations. Although remote screencasting options are available via the OS, they are avoided as they could interfere with performance metrics.

Router (UniFi Dream Router): Connected to a dedicated 70 Mbit VLAN network, it provides Ethernet network access to the controller and Mac Mini, and WiFi 6 network access to all mobile devices. Additionally, it can forward Internet traffic through a VPN service if required, allowing simulation of Internet access from different geolocations.

Device configuration

To facilitate accurate power measurements, we stepped away from the available software-based options and utilized a battery bypass technique. This involves disassembling each device to remove its battery, extracting the internal battery controller, and exposing the power terminals via cables. This method allows us to monitor the devices’ power consumption directly from their power terminals, achieving precise measurements at a maximum frequency of 5 kHz using the power monitor.
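
To make the arithmetic concrete, here is a minimal, illustrative Python sketch (not BLaDE’s actual code) that turns raw 5 kHz voltage/current samples into average power and energy; the function name and sample values are hypothetical.

# Illustrative only: derive average power and energy from raw 5 kHz samples.
# The function name and sample data are hypothetical, not part of BLaDE.
SAMPLE_RATE_HZ = 5_000            # Monsoon HV maximum sampling frequency
DT = 1.0 / SAMPLE_RATE_HZ         # seconds between consecutive samples

def power_stats(samples):
    """samples: iterable of (voltage_volts, current_amps) pairs."""
    energy_joules = 0.0
    count = 0
    for voltage, current in samples:
        energy_joules += voltage * current * DT   # P = V * I, E = sum(P * dt)
        count += 1
    avg_power_watts = energy_joules / (count * DT) if count else 0.0
    return avg_power_watts, energy_joules

# Two seconds of fake samples: a steady 3.85 V supply drawing 0.5 A.
fake_samples = [(3.85, 0.5)] * (2 * SAMPLE_RATE_HZ)
watts, joules = power_stats(fake_samples)
print(f"average power: {watts:.2f} W, energy: {joules:.2f} J")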

Further configurations are made in the device’s OS to minimize the influence of extraneous factors. These include disabling automated OS and app updates, turning off adaptive brightness/charging/battery features, enabling dark mode, and standardizing the brightness level to 25% across devices.

Device testing automation

Automating testing on mobile devices, especially in this custom setup, can be a complex task. BLaDE provides a comprehensive set of APIs that facilitate various aspects of device management, such as reading device status, controlling power states, and managing measurement processes.

Consider the following example of the “switch on” API call for switching on a mobile device (a rough code sketch follows these steps):

Initialize and connect: Begin by switching on the power monitor. BLaDE waits until the monitor is available and establishes a connection.

Enable USB power: BLaDE then enables the USB power to the device, ensuring it is ready for further operations.

Relay activation: The appropriate relay on the relay board is activated to manage the power flow.

Voltage setting: Set the required voltage for the device to switch on.

Device synchronization: BLaDE waits for the device to become available and synchronizes its clock with the controller’s clock. This ensures time measurements are accurate.

Unlocking the device: For iOS devices, BLaDE connects using the Bluetooth HID service and enters the PIN to unlock the device. This step is required to allow the device to communicate via USB after boot.
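
Put together, the controller-side logic might look roughly like the sketch below. The function names, arguments, and timings are illustrative stand-ins, not BLaDE’s actual API.

# Hypothetical sketch of the "switch on" sequence; names are illustrative only.
import time

def _step(message):
    print(message)
    time.sleep(0.1)   # placeholder for waiting on real hardware

def switch_on(device_id, voltage, pin=None):
    _step("power on the Monsoon monitor and wait until it is available")        # step 1
    _step(f"enable USB power for {device_id}")                                   # step 2
    _step(f"activate the relay routing power to {device_id}")                    # step 3
    _step(f"set the supply voltage to {voltage:.2f} V")                          # step 4
    _step(f"wait for {device_id}, then sync its clock with the controller")      # step 5
    if pin is not None:
        _step(f"unlock {device_id} over Bluetooth HID by entering the PIN")      # step 6 (iOS)

switch_on("iphone-14-pro", voltage=4.0, pin="0000")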

When measuring the power performance of a phone, it is crucial to disable the USB power lanes so that USB current does not interfere with the power monitor readings. During a measurement, all communication with the device must therefore be wireless, over Wi-Fi or Bluetooth; USB communication can only happen before or after the measurement.

Currently, BLaDE supports automation via Android Debug Bridge (ADB), Bluetooth HID, and Appium-based frameworks:

Android Debug Bridge (ADB) automations (Android only)

The Android Debug Bridge (ADB) is a versatile command-line tool that allows developers to communicate with an Android device. It supports a wide range of device operations, such as installing and debugging applications, transferring files, and interacting with the device’s screen and keyboard. ADB commands can be executed via USB, Wi-Fi, or Bluetooth. BLaDE builds on top of this tool to support actions that are required for device automations, such as “type text”, “tap”, “scroll down”, “close foreground app”, etc.
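
BLaDE’s own wrapper is not reproduced here, but the underlying ADB calls for these actions can be driven from Python roughly as follows; the device serial and package name are placeholders.

# Drive basic device actions over ADB; serial and package name are placeholders.
import subprocess

DEVICE = "emulator-5554"

def adb(*args):
    subprocess.run(["adb", "-s", DEVICE, *args], check=True)

adb("shell", "input", "text", "hello")                        # "type text"
adb("shell", "input", "tap", "540", "1200")                   # "tap" at x=540, y=1200
adb("shell", "input", "swipe", "540", "1500", "540", "500")   # "scroll down" (finger swipes up)
adb("shell", "am", "force-stop", "com.brave.browser")         # "close foreground app"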

Bluetooth HID automations (iOS, Android)

Automating third-party apps on iOS presents challenges due to the absence of a publicly available ADB-like API that operates over Wi-Fi or Bluetooth. To address this, we have virtualized a Bluetooth mouse and keyboard by creating a virtual Human Interface Device (HID) service at the controller. We also created an automation library that translates keyboard keystrokes, mouse clicks, and gestures into HID actions, effectively simulating user interactions with the controlled device. This approach allows for the automation of tasks—such as locating and launching apps and interacting with them—by mimicking user actions. 

Appium-based automations (iOS, Android)

As an alternative approach, we experimentally support an Appium-based framework to automate actions, minimizing the overhead of the above automation methods when targeting multiple platforms. Appium uses ADB for Android and Apple’s XCTest for iOS automation. Although Appium is generally used for testing features of custom application builds, we are currently experimenting with the framework predominantly to measure the performance of Android and iOS apps.
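
As a rough illustration (not Brave’s actual test code), a browser session through the Appium Python client might look like the snippet below; the capability values are placeholders, and exact capability handling differs between client versions.

# Illustrative Appium session against a local Appium server; values are placeholders.
from appium import webdriver

caps = {
    "platformName": "Android",
    "appium:automationName": "UiAutomator2",
    "appium:deviceName": "Pixel 6a",
    "browserName": "Chrome",
}
# Older Appium Python clients accept a capabilities dict directly; newer ones
# expect an options object (e.g. UiAutomator2Options) built from the same values.
driver = webdriver.Remote("http://127.0.0.1:4723", caps)  # older servers may need /wd/hub
try:
    driver.get("https://example.com")   # simulate a browsing action
finally:
    driver.quit()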

BLaDE for product evaluations

The use of BLaDE is crucial to our product pipeline, ensuring the performance, robustness and reliability of various features, including on-device performance of LLMs, cost efficiency of modules, and browsing capabilities. When a new development build is ready and a specific feature needs to be tested, it is dispatched on-demand to BLaDE for an in-depth performance assessment across devices. This process helps identify new regressions and bottlenecks, enabling us to address issues swiftly and maintain the high standards our users expect. With BLaDE’s continuous monitoring, we can confidently release updates, assured that any potential performance issues have been proactively managed.

For more details about how BLaDE is used at Brave, please refer to the following blog posts:

MELTing Point: Mobile Evaluation of Language Transformers (July, 2024) 3

Brave 1.0 Performance: Methodology and Results (January, 2020)

Significant Battery Savings with Brave on Mobile: Brave Consumes 40% Less Battery than Other Leading Browsers (February, 2019)

The complete source code and system design of BLaDE are available at https://github.com/brave/blade. We welcome contributors willing to extend the system capabilities.

References

Matteo Varvello, Kleomenis Katevas, Mihai Plesa, Hamed Haddadi, Fabian Bustamante, Benjamin Livshits, “BatteryLab: A Collaborative Platform for Power Monitoring”. In the International Conference on Passive and Active Network Measurement (PAM ‘21). ↩︎

Matteo Varvello, Kleomenis Katevas, Mihai Plesa, Hamed Haddadi, Benjamin Livshits, “BatteryLab, A Distributed Power Monitoring Platform For Mobile Devices”. At the 18th ACM Workshop on Hot Topics in Networks (HotNets ‘19). ↩︎

Stefanos Laskaridis, Kleomenis Katevas, Lorenzo Minto, Hamed Haddadi, “MELTing point: Mobile Evaluation of Language Transformers”. To appear in the 30th Annual International Conference on Mobile Computing and Networking (MobiCom ‘24). ↩︎

Monday, 12. August 2024

Shade Protocol

July Syracuse Upgrade v0.2


Greetings community,

Since the successful Alexandria upgrade, the ShadeDAO contributors have been fully focused on preparing for Argos in September — a key upgrade that continues to push the boundaries of privacy UX while also revealing information on ShadeX (world’s first permissionless/generalized private money market).

In the interim, Syracuse 0.2 (pushed in July) is the set of changes tied to the Shade Protocol front-end, back-end and smart contracts.

Added support for Leap wallet
Various logo fixes in app
Shade staked balance loading bug
Fixed SHD staking unbond bug (front end issue)
Fixed dSHD unbonding front-end bug
Minimum one decimal to silk basket currency display
Fixed show of infinite price for silk on load
Miscellaneous chart optimizations and bug fixes
2 decimals minimum for USD prices (prices page)
Infrastructure improvements (moved to lavendar.5 nodes)
Back end server upgrades (reduce downtime w/Fargate)
Created SILK price feed & supply endpoint for Coingecko
Fixed broken filters on SILK analytics page
Fixed SILK stability pool 18 decimal reward claiming bug
Onboarded ERIS as an LST provider (ampKUJI + ampWHALE oracle support)
Created method to account for all SILK interest accrued in all vaults

The following is the list of pool fees updated to be in line with the fee structure for all other pools:

SILK <> USDT (native)
SHD <> JKL
SHD <> SAGA
SHD <> dSHD

In conclusion, Syracuse v0.2 gets us one step closer to the launch of Argos — a significant milestone in Shade Protocol’s journey towards revolutionizing decentralized finance. Inspired by the legacy of ancient Argos, Shade Protocol stands as a beacon of innovation and financial autonomy in the decentralized landscape. The Argos update aims to bring symmetry, balance, and harmony to the Shade UX as we continue on the quest to refine user stories & the ease of traversing the Shade Protocol application.

With its commitment to privacy, ease of use, and continuous improvement through Syracuse upgrades, Shade Protocol is not only shaping the future of DeFi but also empowering individuals to reclaim control over their finances and data. As the community eagerly anticipates the release of Argos and beyond, Shade Protocol remains dedicated to its mission of building an unstoppable, decentralized financial ecosystem for all.

Website: app.shadeprotocol.io

Twitter: https://twitter.com/Shade_Protocol


Sequoia

The AI Supply Chain Tug of War

The post The AI Supply Chain Tug of War appeared first on Sequoia Capital.
The AI Supply Chain Tug of War

Big Tech is absorbing demand risk from the AI supply chain. How long can it last?

By David Cahn Published August 12, 2024

Here’s the question now being asked all across the AI ecosystem: Is there a way for someone else to take on the demand risk from AI, while I capture the profits?

Today, Big Tech companies have stepped up to alleviate some of this tension. They are acting as risk-absorbers within the system, taking on as much demand risk as they possibly can, and driving the supply chain toward greater and greater CapEx escalation. 

In part one of this piece, we’ll walk through the tug of war between supply chain players over risk and profit. In part two, we’ll unpack the instability of today’s equilibrium.

Who Should Bear the Demand Risk from AI? Who Should Capture the Profits?

In the supply chain, risks are transferred from suppliers, who need to build CapEx to manufacture products, upstream to their customers, who pay a margin that compensates for this capital expenditure over time. 

Each player wants to maximize profit while minimizing risk. This creates supply chain conflict, which lurks behind the scenes, and exposes itself in pairwise game theoretic interactions between suppliers and their customers. Below, we’ll give one example of how this tension manifests for each layer of the supply chain.

Foundry Layer: TSMC is Nvidia’s manufacturing partner. The more manufacturing capacity TSMC builds for Nvidia, the more it is exposed to future demand fluctuations. The fewer fabs it builds, however, the more problems Nvidia will have with its supply shortage. Thus arises the core tension in this relationship: TSMC’s incentive is to have just enough availability to serve Nvidia, and nothing more. Nvidia’s incentive is for TSMC to build as much CapEx as possible, to maximize availability. In this relationship, TSMC has all the leverage—it is the dominant pure play foundry globally and serves many customers, including Nvidia’s competitors. Thus, as we try to forecast the future of AI, we should expect the stable equilibrium to be that TSMC underbuilds capacity relative to peak demand. An example where this will manifest: TSMC is currently planning out future CapEx for its 2nm node. We might expect that however much capacity TSMC chooses to build, it will be less than the amount that Nvidia and other AI chip companies will have requested.

Semiconductor Layer: One of the great ironies of AI is that while the big cloud companies and Nvidia are all members of the “Magnificent 7”—and their fates tend to be correlated as investors wax and wane on AI’s potential—these companies are actually diametrically opposed on many dimensions. For example, the cloud companies are extremely resistant to Nvidia’s profit capture in AI, and they are all working on their own competitive chips. At the same time, Nvidia has been trying to compete with its biggest customers by directing chip supply to new entrants like Coreweave and by building its own cloud business with DGX cloud. The semiconductor tug of war is primarily about profit margins.

Industrial Supply Layer: The industrial supply chain is another realm where we can see the ripples of risk transfer at work. When we talk to Big Tech companies, one consistent refrain we hear is that they are trying to buy out all the manufacturing capacity they can get for industrial components like diesel generators and cooling systems, and also for commodities like steel and electrical transformers. Their suppliers find these order volumes almost hard to believe and are actually resistant to serving this demand; they are concerned that if they double their manufacturing capacity, they will be left with excess capacity in the future. To resolve this conflict, the cloud companies are making big commitments—promising to buy many years of supply ahead of time—to incentivize industrial CapEx.

Cloud Layer: The cloud layer is the lynchpin holding everything together. We will discuss this in depth in part two of this post.

Model Layer: If you are OpenAI, Anthropic or Gemini, you want to get as much compute as possible for your frontier model, because more compute means a more intelligent model. If you are Azure, AWS or GCP, however, you want to direct GPU or CPU compute to Enterprise customers—this is your main business. Thus, the main conflict at the model layer is over data center capacity allocation. Since the model layer today is not profitable, these allocations are negotiated between cloud executives and research lab leaders. These negotiations are made more complicated by ownership structures, where the research labs are either partially or wholly owned by the clouds. As model sizes grow by 10x, these power struggles will only be exacerbated.

Customer Layer: Hooray for the customer!
At the end of this long and complex chain, there is an application layer AI startup or an Enterprise buyer calling an API and querying a foundation model. What is “demand risk” to everyone else is “the luxury of choice” to the customer. Customers can use AI models on-demand, and they can easily switch between vendors at their discretion. If customers ever decide the AI is not useful enough, they can turn it off. The entire supply chain is in service of this customer, who benefits from competition and supply chain efficiency. 

The entire supply chain hinges on the last link—the customer. The supply chain is to some degree positive sum: Everyone benefits as the total profit dollars in AI increases. However, as the prior examples highlight, it can also be zero sum: My revenue is your cost, in the case of Nvidia and the clouds. My CapEx is my risk and your benefit, in the case of TSMC and Nvidia.

The tug of war dynamic also helps to explain some of the supply shortages we continue to see in AI, such as shortages in the industrial supply chain. No matter how much pressure they get, there’s only so much demand risk suppliers are willing to take on.

A Fragile Equilibrium: Big Tech is Propping up the Supply Chain

Today, the big cloud giants are acting as risk-absorbers in this system. They absorb risk from their downstream partners Nvidia and TSMC through large orders that generate huge short-term profits for these companies. They also absorb risk from upstream partners: The cloud companies are the largest source of funding for frontier model companies and they subsidize end customers in the form of low API prices and bundled credits. 

Here are four concrete examples of this risk-absorption mechanism at work:

GPUs—Now or Later? It’s in Nvidia’s best interest to sell as many H100 GPUs as possible now, and then sell more B100s and next-gen chips in the future. It’s in the hyperscalers’ best interest to fill their data centers with GPUs on an as-needed basis (this means building data center “shells” and then only installing GPUs once demand materializes). What is actually happening in the market? Hyperscalers seem to be competing with one another for GPU supply and placing big orders with Nvidia to make sure they don’t fall behind. Rather than waiting, they are stockpiling GPUs now and paying twice: First, a hefty upfront expense, and second, higher expected future depreciation.

Data Center Construction: The real estate developers who build and assemble data centers are getting a pretty sweet deal. These companies take on almost no demand risk. Developers like CyrusOne, QTS, and Vantage build data centers for big tech companies, but they will only start construction after they’ve signed a 15 or 20-year lease. And they structure these deals so that they can pay back their investment during the lease period alone—they limit their “residual risk” around the long-term value of the data center asset (e.g., even if prices collapse after the lease period, they can still make money). The long-term demand risk squarely sits with the cloud providers.

Off Balance Sheet Arrangements: We’ve all seen headlines of late about GPU financing deals and the debt that’s being issued to finance GPU purchases. What many people don’t realize is that most of this debt is actually backed up by rental guarantees from Big Tech. These agreements seem so robust that many debt investors see themselves as investing in Big Tech corporate debt, not in GPUs. The incentive for Big Tech in these deals is to turn an upfront capital expense into a recurring operating expense. This is a very clear example of how Big Tech companies—even when they are not directly doing the financing—are actually backstopping much of the investment activity happening in AI today.

Research Lab Funding and Exits: The Big Tech companies are the largest source of funding today for AI research labs. The biggest labs—OpenAI and Anthropic—are each backed by one of the large clouds. The recent exits of Inflection, Adept and Character demonstrate that it may be increasingly difficult to operate without such a backstop.

Conclusion

Supply chain players understand AI’s $600B question, and they are working to navigate it—maximizing their profit margins and minimizing their demand risk. The result is a dynamic tug of war between some of the most sophisticated companies in the world.

Today, the tug of war has resulted in a temporary equilibrium. Supply chain players are offloading their demand risk to Big Tech, to the maximum degree possible. Big Tech companies—either due to AI optimism or oligopolistic competition—are stepping in to absorb this risk and keep CapEx cranking.

This equilibrium is fragile: If at any point the tech giants blink, demand all along the supply chain will decline precipitously. Further, the longer the Big Tech companies continue to double down on CapEx, the more they are at risk of finding themselves deeply in the hole should AI progress encounter any stumbles.


The post The AI Supply Chain Tug of War appeared first on Sequoia Capital.

Friday, 09. August 2024

Zcash Foundation

Farewell, Chelsea


Chelsea Komlo joined the Zcash Foundation (ZF) in August 2019, as a core engineer and researcher, and was later appointed to the role of Chief Scientist, leading ZF’s research efforts. Prior to joining ZF, Chelsea worked as an engineer on open source security and privacy projects, most notably Tor and the Off-The-Record (OTR) messaging protocol. 

During her time with ZF, Chelsea’s primary research focus has been threshold signature schemes. She collaborated with Ian Goldberg at the University of Waterloo to create FROST, a Schnorr threshold signature protocol that is compatible with Zcash shielded transactions. Chelsea then worked with others across the industry to standardize the FROST signing protocol, which was recently published as an RFC. Most recently she has been participating in the NIST standardization effort for threshold signature schemes. 

Chelsea is leaving ZF to take up another research role, focusing on post-quantum cryptography. However, she remains an avid supporter of the Zcash mission, and will remain involved as a member of the ZF’s Technical Advisory Board. 

On behalf of the Zcash Foundation, and the Zcash ecosystem as a whole, we want to thank Chelsea for her contributions to Zcash and the broader mission of financial privacy. While we are sad to see her leave, we are thrilled that she has found an opportunity to advance her already-distinguished career. 

The post Farewell, Chelsea appeared first on Zcash Foundation.


Epicenter Podcast

Nansen: AI & Blockchain Analytics - Alex Svanevik


The saying goes that knowledge is power, and this applies perfectly to blockchains due to their innate transparency and immutability. However, raw data can at first seem unusable. This is where analytics companies such as Nansen play a major role in demystifying blockchain data by labelling it. In turn, curated data holds value as it can give an edge to traders by tracking ‘smart money’ wallets. In the age of AI, most of the heavy lifting of data analysis is performed by LLMs, but human input is equally valuable for discerning nuances and fine-tuning the process.

Topics covered in this episode:

Alex’s background, from AI to crypto
Nansen’s values
Querying blockchain data
Supported chains
Ensuring data accuracy
Generating wallet labels
The role of LLMs and AI
Data privacy and monetisation
On-chain transparency, privacy and ethics
Roadmap and further enhancements

Episode links:

Alex Svanevik on Twitter
Nansen on Twitter

Sponsors:

Gnosis: Gnosis has built decentralized infrastructure for the Ethereum ecosystem since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Friederike Ernst.

P.S.: Our friends from @nansen_ai have offered us 10 discount codes for 10% off on their professional and pioneer plans! If you are interested in unlocking Nansen's true power, DM us on Twitter (X - @epicenterbtc) and we'll hook you up with a code (FCFS).


a16z Podcast

The Olympics of Talent: France's Tech Boom


Once criticized for lacking ambition, French founders are now aiming to create the world’s largest companies. With a thriving ecosystem attracting talent from across Europe and the US, France is becoming a major player on the global stage.

In this episode, we cover the unique advantages of building startups in France. Roxanne Varza, Director of Station F; Antoine Martin, co-founder of Amo and Zenly; and Bryan Kim, a16z consumer partner, discuss the key factors driving this transformation, including infrastructure, community, and government support.

Discover how international talent, a supportive community, and robust governmental backing are propelling France’s startup scene. This episode is filled with insights into why France is now an exciting place to build a startup.

Resources:

Find Roxanne on Twitter: https://x.com/roxannevarza

Find Antoine on Twitter: https://x.com/an21m

Find Bryan on Twitter: https://x.com/kirbyman01

Learn more about Station F: https://stationf.co/

Learn more about Amo: https://get.amo.co/en

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Wednesday, 07. August 2024

PIVX

Incoming PIVX Milestone!


Congratulations to the PIVX community!

We have just reached 2,000 PIVX Masternodes, a HUGE milestone for our thriving community!

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

Incoming PIVX Milestone! was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.


It’s Time to Vote, Masternode Owners!


Well done, PIVian: you own a PIVX masternode!
At the current reward rate of 15.8%, that’s a great passive income stream.

Did you know, though, that your role as a Masternode owner extends far beyond just collecting those attractive rewards? That your vote on PIVX’s proposals and upgrades is crucial? By staying engaged and making your voice heard, you have the power to directly shape the future direction of the project. Don’t underestimate the impact your Masternode vote can have.

PIVX’s decentralized governance model relies on active participation from the Masternode community. When proposals are up for a vote, your input helps determine the path forward for the network. It’s your chance to be a driving force behind PIVX’s continued growth and success.

So while enjoying the passive income from your PIVX Masternode, remember to also get involved in the project’s governance. Your votes and feedback are invaluable in guiding PIVX’s evolution.

Together, Masternode owners can ensure PIVX remains aligned with the community’s vision.

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

It’s Time to Vote, Masternode Owners! was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 05. August 2024

Sequoia

Steel, Servers and Power

The post Steel, Servers and Power appeared first on Sequoia Capital.
Steel, Servers and Power: What it Takes to Win the Next Phase of AI

The qualifying round is over—in the new era construction is king.

By David Cahn Published August 5, 2024

The race to model parity has been the defining project of the last 12 months in AI. This phase was characterized by the search for new research techniques, better training data and larger cluster sizes.

The next phase in the AI race is going to look different: It will be defined more by physical construction than by scientific discovery.

Up until now, you could fit your training cluster into an existing data center via colocation or retrofit. If you needed to increase cluster size from 15k GPUs to 25k GPUs, you found a way to plug in more GPUs. This is changing: The “Bitter Lesson”—which most market participants in AI have internalized—says that model size is the number one driver of performance. As a result, the next generation of models is aiming for a 10x increase in model scale to 300k GPUs. To house one of these models, you need to build an entirely new data center.

This changes AI in two fundamental ways: First, it changes the lead time between models. If before you could train your model in 6 to 12 months, now you need to add 18 to 24 months of construction time before you can actually start training. Second, it changes the source of maximum competitive advantage. In the new era, construction efficiency may matter more than research breakthroughs. 

This sea change in how AI works was a major theme of big tech earnings last week. Annualized CapEx for big tech increased from $138B to $229B year-over-year. This incremental $91B in run-rate spending is a good proxy for new AI data center construction—an enormous investment.

Today’s CapEx will likely yield fruit somewhere between late 2025 and early 2026, at which point we’ll find out if these larger models are intelligent enough to unlock new revenue streams and generate a return on investment. 

Source: Earnings transcripts, public filings. “Incremental CapEx” is a proxy for run-rate spend on new AI data centers.

So what exactly is going to happen over the next 1 to 2 years, and how does one “win” in this new phase of AI?

Building data centers is a messy and complex business. We think day-to-day operational execution is going to have the biggest impact on who is most successful. Here’s how it works behind the scenes: [1]

A real estate developer—QTS, Vantage and CyrusOne are three popular ones—goes and buys land and power that they believe a data center can be built on.

The developer approaches the big tech companies and offers them a 15-year or 20-year lease on the data center, for a total cost of $2-10B.

Once the developer has a signed deal, they go to the capital markets and raise debt against the deal, usually from banks or real estate investors. The debt investors are not underwriting the future AI demand in the data center—they are underwriting the credit of the customer, say Microsoft or Amazon, and expect a yield that is a slight premium to corporate debt.

The developer goes and hires a general contractor—for example, DPR, one of the most popular data center builders. The general contractor goes and hires subcontractors.

The subcontractors then go and recruit labor. Labor is a huge component of the cost of data center construction. Labor moves to the location where the data center is getting built—for example, a small town or city—and they are put up in hotels or other accommodations in the area.

Over a two-year period, a massive construction project takes place, starting with steel and concrete for the overall structure, and ending with industrial parts and GPUs being installed.

During this whole process, the end-user of the data center (e.g., Microsoft or Amazon) is negotiating with their own supply chain for diesel generators, liquid cooling systems, and other necessary equipment.

Today, five companies have arrived at the starting line in this new race toward data center scale-up: Microsoft/OpenAI, Amazon/Anthropic, Google, Meta and xAI. Each has a model that has held up against serious benchmarks, and the necessary capital to proceed.

With the market structure now crystallized, we can begin to see how each player will take a unique approach—derived from their own business fundamentals—in order to win:

Meta and xAI are consumer companies, and they will both vertically integrate, hoping to benefit from each having a single founder decision maker who can streamline and tightly couple model building efforts with data center design and construction. Both companies will seek to launch killer consumer applications on the back of more intelligent models.

Microsoft and Amazon have grizzled data center teams and deep pockets, and they’ve leveraged these assets to forge partnerships with the top tier research labs. They hope to monetize through 1) Selling training to other companies, and 2) Selling model inference. They will need to manage resource allocation between their frontier models (GPT 5 and Claude 4) and other data centers being built for Enterprise customer use.

Google has both a consumer business and a cloud business, and also has its own in-house research team. On Friday, the company announced it was bringing Noam Shazeer back into the fold. Google also has vertically integrated all the way down to the chip layer with TPUs. These factors should provide long-term structural advantages.

With CapEx plans now firmly in place and the competitive landscape set, the new AI era begins. In this new phase of AI, steel, servers and power will replace models, compute and data as the “must-wins” for anyone hoping to pull ahead.

[1] In this example, we assume a lease structure. In the case of an internal build, the hyperscaler acts both as the customer and as the real estate developer.

The post Steel, Servers and Power appeared first on Sequoia Capital.


a16z Podcast

Building the World's Most Trusted Driver


Waymo's autonomous vehicles have driven over 20 million miles on public roads and billions more in simulation.

In this episode, a16z General Partner David George sits down with Dmitri Dolgov, CTO at Waymo, to discuss the development of self-driving technology. Dmitri provides technical insights into the evolution of hardware and software, the impact of generative AI, and the safety standards that guide Waymo's innovations.

This footage is from AI Revolution, an event that a16z recently hosted in San Francisco. Watch the full event here:  a16z.com/dmitri-dolgov-waymo-ai

 

Resources: 

Find Dmitri on Twitter: https://x.com/dmitri_dolgov

Find David George on Twitter: https://x.com/DavidGeorge83

Learn more about Waymo: https://waymo.com/

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Empiria

July Development Update: Open Testnet Launch, CosmWasm Integration, and More


TL;DR: In July, we launched Empe Blockchain’s open Empe Testnet, drawing 100+ validators in 48 hours. Key updates include monitoring tools, wallet support, CosmWasm integration for smart contracts, enhanced vesting accounts, and a linear inflation model for stable token release. These steps are vital for a secure and efficient blockchain platform.

Our journey to create the highest value End-to-End verifiable Data Infrastructure (EVDI) on the market is marked by continuous improvement and a deep commitment to our users. As we continue to pour precious hours into research and development, always listening closely to your feedback, we’re excited to share our progress in July.

Empe Blockchain Infrastructure Development

In anticipation of the upcoming Empe Blockchain mainnet launch, we have built and configured our core network infrastructure. At its heart are the main validator nodes, which form the backbone of the system. To safeguard these critical components, we’ve established sentry nodes as a protective shield. These nodes act as the primary interface between the validators and the external network, mitigating potential threats. This robust infrastructure is essential to ensure the network’s stability and security as we approach the highly anticipated mainnet launch.

Open Empe Testnet Launch

July marked the successful launch of our open Empe Testnet, attracting over 100 external validators within the first 48 hours of operation. To facilitate this launch, we integrated essential tools like testnet.ping.pub for monitoring, added Keplr wallet support for EMPE tokens, and published comprehensive technical documentation to ensure a smooth onboarding experience.

The current validator count exceeds 150 and is steadily climbing. This surge in validator participation brings forth a multitude of advantages. Enhanced network security safeguards the system against potential attacks. Decentralization fosters trust and transparency within the Empeiria ecosystem. Moreover, the robust foundation laid by the Testnet paves the way for future scalability. This achievement wouldn’t be possible without the incredible support of our community, highlighting the growing excitement surrounding Empeiria’s innovations.

Integration of CosmWasm for Smart Contract Functionality

Empe Blockchain now supports CosmWasm, enabling smart contract functionality from day one. This integration allows developers to create and deploy smart contracts seamlessly, enhancing the versatility and utility of our platform. The addition of CosmWasm support marks a significant milestone in our journey towards creating a more versatile blockchain ecosystem.

Vesting Accounts Module Enhancements

To provide greater control and security for vested assets, we’ve enhanced the vesting accounts module. Users can now create new vesting accounts based on existing ones. These new accounts adhere to the same or more stringent lock-up periods and parameters as the original accounts. This functionality ensures that funds remain securely locked and transparent for all stakeholders, providing a robust framework for managing vested assets and fostering trust and confidence across the ecosystem.

Linear Inflation Model Adoption

To address the limitations of the dynamic inflation model traditionally used in Cosmos-based blockchains, we have adopted a linear inflation model for Empe Blockchain’s minting module. This new approach releases a fixed number of tokens over specified periods, as defined during the genesis stage. Additionally, the inflation rate will gradually decrease over subsequent periods. This predictable and transparent model aims to provide a more stable economic environment for our blockchain ecosystem.
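
As a purely illustrative sketch (the period lengths, amounts, and decay factor below are hypothetical, not Empe Blockchain’s actual genesis parameters), such a schedule can be expressed as a fixed amount per period that steps down over time:

# Illustrative linear-minting schedule; all parameters are hypothetical.
def linear_mint_schedule(initial_tokens_per_period, periods, decay=0.8):
    """Fixed number of tokens released in each period, decreasing over time."""
    schedule = []
    per_period = initial_tokens_per_period
    for _ in range(periods):
        schedule.append(per_period)
        per_period *= decay            # inflation gradually decreases each period
    return schedule

for i, amount in enumerate(linear_mint_schedule(1_000_000, 4), start=1):
    print(f"period {i}: mint {amount:,.0f} tokens")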

Proof of Attendance with Passwordless Authentication based on the Proof of Purchase

We have successfully integrated a passwordless authentication system, allowing users to securely access their Proof of Attendance (PoAP) using Proof of Purchase (PoP) credentials. This system leverages unique PoP tokens to authenticate users without the need for traditional passwords. The backend services have been enhanced to generate and manage unique PoP tokens for each purchase, ensuring secure and verifiable access to related PoA records. The user interface has been updated to streamline the passwordless login process, providing a simple and intuitive flow for users to authenticate using their PoP credentials.
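
The post does not spell out the token format, but the general shape of such a flow, sketched generically below (this is not Empeiria’s implementation; names and in-memory storage are hypothetical), is that a unique PoP token issued at purchase time later authenticates access to the matching PoA records without a password.

# Generic illustration of passwordless access via purchase-bound tokens.
# Not Empeiria's implementation; names and in-memory storage are hypothetical.
import secrets

POP_TOKENS = {}                                          # token -> purchase_id
POA_RECORDS = {"order-42": ["PoA: Example Event 2024"]}

def issue_pop_token(purchase_id):
    token = secrets.token_urlsafe(32)                    # unguessable, single-purpose credential
    POP_TOKENS[token] = purchase_id
    return token

def get_poa_records(pop_token):
    purchase_id = POP_TOKENS.get(pop_token)
    if purchase_id is None:
        raise PermissionError("unknown or revoked PoP token")
    return POA_RECORDS.get(purchase_id, [])

token = issue_pop_token("order-42")
print(get_poa_records(token))                            # authenticated without a password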

Schema Update for Proof of Attendance & Proof of Purchase

The new data schemas for Proof of Attendance (PoAP) and Proof of Purchase (PoP) have been finalized and approved, ensuring compatibility with industry standards. Backend services have been updated to support the new schemas, including adjustments to the data storage architecture and API endpoints to handle the additional data fields and types.

These advancements are crucial steps toward a more secure, efficient, and developer-friendly verifiable data infrastructure. Your ongoing support and insightful feedback play a vital role in this journey, and we sincerely appreciate both.

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io

Sunday, 04. August 2024

Empiria

How Did Empeiria Achieve Protocol Decentralization in 48 Hours?


Decentralization, the cornerstone of blockchain technology, has long been a formidable challenge. Building a robust, secure, and scalable network requires significant time, resources, and technical expertise. Yet, in just 48 hours, Empeiria defied expectations by establishing a decentralized network with over 100 validators. How did we accomplish such a rapid decentralization?

1. Foundation of Innovation: Prototype Development and Research

The journey began with a deep dive into the market’s needs. Last year, we embarked on a critical phase of our project by developing an initial prototype. This prototype served as a key tool for gathering feedback from over 70 prospective customers. Their insights were essential in refining our approach and aligning our solution with market needs.

To transform these insights into a viable and effective product, we dedicated over 15,000 hours to research and development. This extensive effort was focused on addressing the specific needs and challenges identified through customer feedback. The result was a solution designed to meet market demands effectively and demonstrate a clear product-market fit.

2. Testnet Phase 1: Launch and Testing Frameworks

To validate the prototype’s capabilities and identify potential issues, we launched Empe Testnet Phase 1. A comprehensive testing framework was implemented, encompassing end-to-end testing, simulation testing, and continuous monitoring. This rigorous approach allowed the team to fine-tune the platform and prepare it for the next phase:

End-to-End (E2E) Testing Framework: This framework allowed us to assess the complete functionality of the system from start to finish, ensuring that all components and processes were integrated smoothly and performed as expected.

Simulation Testing: We utilized simulation testing to create various scenarios and stress-test the network under different conditions. This approach helped us identify potential issues and optimize the system for real-world use.

Monitoring Mechanisms: Comprehensive monitoring tools were established to continuously track system performance and stability. These mechanisms provided real-time insights, enabling us to promptly detect and address any issues.

The rigorous testing during Testnet Phase 1 was crucial for refining our solution and preparing it for the next stage, setting the stage for successful external validation.

3. Intensive Testing and Optimization

Building on the insights gained from Testnet Phase 1, we embarked on 12 weeks of intensive testing and optimization. Every aspect of the platform was scrutinized, and necessary adjustments were made to enhance performance and reliability. This meticulous process was crucial in ensuring the platform’s readiness for the next stage and allowed us to:

Identify and Fix Issues: We systematically pinpointed problems and implemented solutions to address them. Each issue was carefully resolved to enhance the system’s stability and functionality.

Optimize Performance: Beyond troubleshooting, we optimized the system to improve efficiency and responsiveness. This involved fine-tuning various components and processes to ensure they operated at peak performance.

The comprehensive testing and optimization during these 12 weeks were critical in refining our system, paving the way for a robust and reliable solution in subsequent phases.

4. Building Community Through Education

Recognizing the importance of community engagement, our strategic approach included reaching out to our community on X with a series of educational posts. By sharing valuable insights and fostering open dialogue, we not only increased awareness but also created a foundation for future growth. This initiative played a crucial role in:

Educating the Community: We provided valuable insights into the benefits and implications of verifiable data, helping our audience understand its importance and impact.

Fostering a Supportive Network: By engaging with users through informative content, we cultivated a dedicated community that supports and values our technology.

Strengthening the Ecosystem: This educational outreach not only informed, but also built a network of advocates who contribute to and enhance the overall strength of our ecosystem.

Through these efforts, we successfully created a knowledgeable and engaged community that reinforces the foundation of our project.

5. Testnet Phase 2: Opening to External Validators

The culmination of the project was the launch of Testnet Phase 2, which invited external validators to join the network. The response was overwhelming, with over 100 validators joining within 48 hours. This rapid expansion demonstrated the platform’s appeal and the strength of the community built through education and engagement. This phase marked a pivotal moment:

Onboarding External Validators: We began by integrating our first external validators into the network. Their involvement was crucial for expanding and testing the system’s capabilities in a real-world environment.

Viral Engagement: The news of our open Testnet quickly spread, generating significant interest. The momentum was remarkable, with additional validators joining the network within hours of the initial launch.

Conclusion

In just 48 hours, Empeiria successfully decentralized its network with over 100 validators. This achievement highlights the effectiveness of our approach, which combined thorough research, extensive testing, and proactive community engagement.

The key takeaway from this success is the importance of aligning product development with market needs and maintaining a rigorous testing and optimization process. This approach, coupled with a well-informed and engaged community, proved instrumental in reaching our decentralization goals swiftly.

Our focus remains on continuing to refine and advance our technology, leveraging the lessons learned to drive further innovation in decentralized networks.

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io


EMPE Network FAQ — Testnet

EMPE Network FAQ — Testnet

EMPE validators group: https://t.me/EmpeValidators

FAQ Version (04/08/2024)

1. Technical Setup & Troubleshooting

2. Rewards & Staking

3. Project Overview & Updates

1. Technical Setup & Troubleshooting

Q: How many tokens do I need to register as a validator?
A: You need 1 EMPE token, equivalent to 1,000,000 uEMPE tokens.

Q: Why can’t I register as a validator?
A: Ensure your node is synchronized and you have enough tokens. Refer to the technical documentation for more details: https://docs.empe.io/validators/overview

Q: How do I use the EMPE faucet?
A: Check the Validators Telegram group for the faucet link: https://t.me/EmpeValidators
1. Use the tokens to register your validator.
2. Opening a validator does not guarantee a position in the active set.
3. Tokens can be claimed only once.

Q: I ran a script and got an RPC error. What happened?
A: Your RPC might not be synchronized, or you may not have enough funds in your wallet.

Q: How can I access the binary code?
A: Binary code access will be available on the mainnet.

Q: What is a moniker?
A: A moniker is the name you give your validator to help identify it easily.

Q: Can I edit my moniker after registering?
A: Yes.

Q: What should I do if I have a technical issue?
A: You can always ask other validators for support in the Empeiria Validators group: https://t.me/EmpeValidators. If it’s a major problem, contact the Empe team.

Q: How do I get my validator unjailed?
A: Use the command:
emped tx slashing unjail --from <<KEY_NAME>> --fees 40uempe

Q: Once my validator is jailed and I used the above command to unjail it, how many EMPE tokens do I need to return to the active set?
A: You need to have more tokens than the last validator in the active set.

Q: How do I set up EMPE GAS FEE?
A: If you are getting a gas error, increase the fee and make sure you have enough tokens to cover the gas cost.

Q: Can I park my validator, close it, and return later? Can I use any VPS for running the Empe testnet validator?
A: Yes, you can park your validator and use any VPS. However, be aware that shutting down a node with bonded tokens might result in jailing. To prevent this, avoid keeping your validator offline for extended periods. You can delegate your tokens to another validator while your node is offline. When restarting, make sure only one instance of your validator is running to avoid issues with trapping your validator and tokens. Always keep your private keys secure, and do not run a second instance with the same private keys, as this will lead to permanent jailing.

Q: Can I use the Empe wallet to register my validator on the Empe network?
A: No, the Empe wallet is designed for managing decentralized identifiers and verifiable credentials, not for crypto tokens. Please use the Keplr wallet for claiming tokens and validator registration.

Q: Do I have to create a DID (Decentralized Identifier) using the EMPE wallet to join the Empe testnet as a validator?
A: No, use the Keplr wallet only.

Q: When I create a fee, which values should I use for Empe tokens?
A: Use uEMPE, where 1 EMPE = 1,000,000 uEMPE, similar to how Satoshi is used for BTC.

Q: How can I provide feedback on the testnet?
A: You can share your thoughts and experiences through our official communication channels. Contact moderators from the Empeiria Validators Telegram group for any feedback you may have.

2. Rewards & Staking

Q: Is the Empe testnet incentivized? Will I get rewards or more delegation for creating validator services?
A: Yes, active testnet participants who maintain their nodes until the mainnet launch and for at least 60 days will receive a bonus in the form of foundation delegation on the mainnet. Additional incentives to increase the delegation allocation will be announced later.

Q: Is it possible to receive delegation from the team?
A: Team delegations are currently not available. Focus on building a strong validator reputation to attract delegations from the community.

Q: How do I join the testnet active set?
A: Be in the top 200 validators.

Q: What is the reward for helping other validators?
A: There is no direct reward for helping other validators, although contributing to the community can enhance your reputation and support the network’s overall health.

Q: Will I get rewards or additional delegation for creating validator services?
A: You may be considered for bonus delegation on the mainnet if your validator services provide additional value to the network.

Q: What is the duration for validators to vote?
A: The voting period for validators is 24 hours, and you must use your wallet with tokens to cast your vote.

3. Project Overview & Updates

Q: How long will the testnet run?
A: The testnet does not have a specified end date. It will continue to serve as an environment for testing new features and applications built on Empe.

Q: Where can I find the updated roadmap for validators and the Empeiria project?
A: The updated roadmap will be announced soon. Stay tuned for updates on our official communication channels.

Q: What are the best resources to learn more about the project?
A: Please check our Medium and X (formerly Twitter) for the latest news and updates.

Version 04/08/2024

Saturday, 03. August 2024

a16z Podcast

From AI to Instant Replay: The Technology Behind the Olympics

The Olympics features over 11,000 athletes competing in 32 sports, attracting an audience of more than 10 million.

In this episode, Charlie Ebersol, co-founder of the Alliance of American Football and Infinite Athlete, explores how new innovations like AI and bespoke broadcasting technologies are shaping the future of sports.

Charlie also reflects on the storytelling legacy of his father, Dick Ebersol, a legendary sports producer who transformed how we experience the Olympics. We discuss the importance of making sports more accessible and engaging through technology that enhances, rather than distracts from, the human stories at the heart of the games.

Whether you're a tech enthusiast or a sports fan, this episode offers a unique look at the convergence of these two worlds.

Resources: 

Find Charlie on Twitter: https://x.com/CharlieEbersol

Learn more about Infinite Athlete: https://infiniteathlete.ai/

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Empiria

Data (in)Security and How to Fix It

How many days have passed since the last major data leak or security breach? Most likely, zero. From the phone records of 110 million AT&T customers exposed in a recent leak to the 815 million passport details, names, phone numbers, and addresses stolen by hackers in India, one thing is clear — our private data is anything but secure.

The repercussions of these breaches are profound. Beyond financial implications — the average global cost of a data breach exceeds 4.45 million U.S. dollars per incident — trust is the main casualty, eroding confidence in digital interactions and services.

Given this ever-present vulnerability, the question becomes: how can we reclaim control of our data and navigate the digital world with confidence? At Empeiria, we understand these challenges and have developed a technological toolbox to help you achieve just that.

Why is Our Data So Vulnerable?

In today’s digital world, our private data forms the cornerstone of countless online interactions. We entrust companies with a vast amount of information, from financial details to browsing habits. However, a critical issue persists: the vulnerability of this data to unauthorized access and data breaches.

But why is that? Why is our data so vulnerable? The answer lies in the inherent weaknesses of the current data frameworks, which leave your information exposed in several ways:

- Centralized Storage: Currently, user data is usually stored on centralized company servers. This creates a single point of failure — if hackers breach these servers, a massive amount of data can be compromised all at once.
- Data Silos: Our information is often scattered across numerous platforms and services. This fragmentation makes it difficult to track and control how our data is used and protected.
- Weak Encryption: While some companies encrypt data, encryption practices can vary. Weak encryption algorithms can leave information vulnerable to attackers.
- Human Error: Accidental data leaks by employees can occur due to mistakes or social engineering attacks. Traditional access controls might not be granular enough, granting access to more data than necessary.
- The Evolving Regulatory Landscape: The legal landscape surrounding data security is constantly evolving. Regulations like the General Data Protection Regulation (GDPR) aim to give individuals more control over their data. However, these regulations can be complex and challenging for companies to navigate, creating additional hurdles in securing user information.

In the end, the current state of data security leaves much to be desired. Our information resides in silos, on centralized servers, vulnerable to breaches and misuse. Traditional frameworks leave us at the mercy of companies, hoping they’ll act responsibly with our data — a trust that’s often broken.

But there’s a solution, and it lies in technological innovation. By leveraging cutting-edge tech advancements, we can build a future where users reclaim control of their data and change how we ensure the security of our data forever.

The Solution: End-to-End Verifiable Data Infrastructure (EVDI)

Here at Empeiria, we believe data ownership is the key to the future of digital interactions. This is why we built our End-to-End Verifiable Data Infrastructure (EVDI) on the principles of Self-Sovereign Identity (SSI). This framework empowers users with unprecedented control over their information. Within it, it’s the users themselves and not third parties who own and control their data.

By expanding upon the concepts of SSI, EVDI offers a new approach to data security by addressing the fundamental weaknesses of traditional data frameworks:

Data Ownership: In EVDI, instead of centralized data silos, data is securely stored in users’ verifiable data wallets. These wallets act as personal vaults where individuals manage their information securely. EVDI gives users full control over their data. No third party, or even Empeiria itself, has access to users’ data. By storing data in users’ wallets, EVDI fixes one of the main vulnerabilities of the current system: a single point of failure.

Built-in trust: EVDI operates on a trustless architecture of Empe Blockchain, where trust is not assumed, or delegated to a third party, but built into the technology itself. Traditional systems typically rely on centralized authorities to manage, protect, and verify data. In contrast, EVDI leverages decentralized networks and cryptographic techniques to ensure data integrity without the need for intermediaries. No sensitive data is stored on Empe Blockchain, which serves as an immutable record of data ownership and a tool for data verification, without compromising data security.

Streamlined Compliance with Regulations: Navigating the evolving regulatory landscape surrounding data security can be challenging. EVDI lets companies forget the compliance headache. It keeps data in the user’s control, automatically adhering to regulations. This simplifies compliance for businesses, saving them time, effort, and money.

In a world where data breaches are increasingly commonplace and traditional security measures fall short, Empeiria’s End-to-End Verifiable Data Infrastructure (EVDI) offers a practical solution. By empowering individuals with control over their data and leveraging advanced decentralized technologies, it paves the way for a more secure digital experience.

Follow Empeiria on Twitter/X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io

Friday, 02. August 2024

Panther Protocol

Testnet Stage 7 is now live with simulated Swaps, Data Escrow and more

Before we announce the exciting new features available for testing in Stage 7, we want to extend our gratitude to our community of over 3,500 testnet users. Your diligent testing and valuable feedback are critical to our mission.

We are pleased to share that testing for Stage 6 is now complete, providing valuable insights and resulting in updates to our codebase. 

Today, we are excited to announce that Stage 7 of our testnet is live. This stage includes simulated Swap functionality, Data Escrow, and major upgrades to the user history interface. 

Note: Panther’s test network is now on Amoy, Polygon’s test network, and can be accessed here

Swap functionality

Swap functionality is now simulated in the testnet. Users can “swap” their shielded assets (zAssets) within Panther Protocol’s Shielded Pool while interacting with test versions of external decentralized Swap protocols like Uniswap V3 and QuickSwap, both of which are on Amoy, Polygon’s test network. These transactions occur inside Panther Protocol’s shielded Zones, ensuring the privacy of the transactions is preserved.

How to test the swap feature

To prepare the transaction, a user (1) selects the token types (Target Pairing) for the swap and (2) sets the amount to send (swap from). The user can see the estimated amount to receive (swap to), and then selects which token is being sent and which is being received.

The following features are included with the release, and must be specified by the user prior to making the swap. 

- Routing: By default, the protocol selects the most efficient route. The user can override this option based on preference.
- Slippage: Panther Protocol will allow users to prevent transactions where slippage exceeds their preferences.
- Transaction Deadline: Automatically sets the time frame after which a pending transaction will expire if not completed. The current setting is fixed at 5 minutes.

Prior to submitting the swap request, users can review details such as any impact that the resulting swap will have on price, using information such as liquidity, order book depth and order size; the fees from using the external swap protocols, guaranteed minimums received and exchange rate/conversion ratios.
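As a purely hypothetical walk-through with illustrative values: a tester might swap 100 units of one test zAsset for another, keep the default routing, set a slippage tolerance of 0.5%, and leave the fixed 5-minute deadline in place; the review screen would then show the estimated price impact, the external protocol fees, the guaranteed minimum received, and the exchange rate before the swap is submitted.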

Data Escrow

Data Escrow is an important Product Feature that supports Compliance Data requirements. Panther Protocol has implemented different types of Data Safe mechanisms based on the roles that the DAO, Institutions (also referred to as the Zone Manager) and the Compliance Provider perform to support this.

In the next stage, we will continue to build the decryption mechanism for those roles to access the data. The whole process will be tested when those roles participate in the testing process; essentially, we see this as the Zone Manager testing the protocol along with the other participants.

History page

The History page UI has been updated with more of the details you will want to track, enabling you to keep records of your testnet transactions (including transaction types, amounts, asset types, values, and date and time), mimicking how it will work once the protocol is live.

Disclaimer

For the avoidance of doubt, tZKP, tzZKP, tPRP, test MATIC, and any other tokens mentioned in this announcement or within the product are for testing purposes only and have no economic value, nor can they be exchanged for value. 

Participation on our incentivized Testnet versions may result in you earning rewards, but such credits are not represented on any blockchain as tokens.


Epicenter Podcast

MultiversX: Blockchain Sharding 101 - Lucian Mincu

In 2017, Vitalik Buterin defined the ‘Scalability Trilemma’ which consisted of 3 attributes that every blockchain had to balance depending on its intended use cases: decentralisation, scalability and security. While Ethereum sacrificed scalability in favour of security and decentralisation, others prioritised throughput over the other two. However, a solution was proposed, inspired from Web2 computer science - sharding. Despite the fact that Ethereum’s ossification and significant progress in zero knowledge research led to a shift in Ethereum’s roadmap away from execution sharding towards L2 rollups, there were other L1s that were designed from the get-go as sharded blockchains. One such example was Elrond, who implemented the beacon chain PoS consensus alongside a sharded execution layer. Their recent rebranding to MultiversX alludes to a multi-chain, interoperable ecosystem, in which sovereign chains can communicate in a similar manner to cross-shard transaction routing.

Topics covered in this episode:

- Lucian’s background and founding Elrond (MultiversX)
- Elrond’s validator & shard architecture
- Cross-shard composability
- VMs, smart contracts and transaction routing
- Self sovereignty and modularity
- MultiversX vision
- Roadmap

Episode links:

- Lucian Mincu on Twitter
- MultiversX on Twitter
- xPortal on Twitter

Sponsors:

- Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io
- Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Felix Lutsch.

Thursday, 01. August 2024

Zcash

Zcash community rallies to find consensus on a new 1-year dev fund

The results of community polling are in, and there will be a new Zcash dev fund in November. But this new model will be radically different from what Zcash has […]

Source


Verus

How Verus Solved MEV (Maximal Extractable Value) in DeFi

Imagine a DeFi world without front-running, back-running, or sandwich attacks. A world where every user gets a fair price, and transaction fees don’t suddenly skyrocket. This isn’t a distant dream — it’s the reality with Verus. MEV (Maximal Extractable Value) is rampant on VM-based blockchains like Ethereum, causing users to lose money and suffer unfair trading practices. But Verus has changed the game with its innovative protocol design, making MEV attacks a thing of the past. Let’s explore how Verus is redefining DeFi with its MEV-resistance at the protocol layer.

MEV and its Consequences

Maximal Extractable Value (MEV) is the maximum value that can be extracted from block production in cryptocurrency networks, beyond the standard block rewards.

A block producer, whether a miner or staker, can profit from their ability to arbitrarily include, exclude or re-order transactions within the blocks they produce. There are also bots that continuously search the pool of pending transactions (the mempool) for profitable opportunities. This behavior by block producers and bots leads to users being front-run, back-run or sandwich-attacked on their transactions.

Front-running involves placing a transaction before a target transaction, while back-running places one immediately after. Sandwich attacks combine both, surrounding a target transaction. This behavior takes advantage of the time between when a transaction is announced (it’s in the mempool) and when it’s processed, allowing manipulators to benefit from price movements caused by the target transaction.
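To put rough, purely illustrative numbers on it: a bot spots a large pending buy in the mempool, pays a higher gas fee so its own buy executes first at $1.00, the victim's order then pushes the price to about $1.05, and the bot immediately sells into that move. The bot pockets roughly the 5% difference, while the victim receives a worse execution price than quoted and may also pay more in gas.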

This malicious behavior leads to worse-than-expected execution prices and high transaction fees for regular users doing swaps or trades on AMMs (automated market makers) or DEXs (decentralized exchanges). A sudden high transaction fee can also lead to failing transactions, costing users gas fees without completing the intended trade. At worst, it can even lead to protocol instability.

Another thing to keep in mind is the possible problematic merger of traditional finance and decentralized finance. Front-running, and back-running are illegal practices in traditional finance but common in DeFi. Regulators are likely to view these tactics as unacceptable market manipulation, especially when applied to RWAs (real-world assets). VM-based blockchains have to balance the open, permissionless nature of the blockchain with the expectations of fairness and protection that come with mainstream financial adoption. The giant elephant in the room asks… Can blockchain protocols where MEV is endemic be used for mainstream adoption?

MEV Resistance in Verus

How Verus achieves MEV resistance is actually quite simple and doesn’t rely on complicated mitigation schemes: thanks to forward-looking protocol design, MEV simply isn’t possible in the protocol in the first place.

You can find Verus DeFi within the consensus layer of the protocol. All currencies (e.g. tokens, liquidity pools) are accounted for by the miners and stakers. This is very different from Ethereum, for example, where only ETH is accounted for by the stakers, and where ERC-20s are not. On VM-based blockchains DeFi is programmed at the smart contract layer, which has many vulnerability surfaces, and what is happening in those smart contracts is not verified by the block producers.

No Transaction Ordering

All VM-based blockchains solve transactions sequentially. This is their fundamental design flaw. Because of this, transactions can be ordered, by either block producers or bots paying higher gas fees to “skip the line.” This is what makes MEV possible.

Unlike VM-based blockchains that solve transactions sequentially, Verus, a UTXO-based blockchain, solves transactions simultaneously. Front-running, back-running and sandwich attacks are impossible since there is no “front” or “back”. Transactions (and thus conversions) are not ordered.

Conversions are solved simultaneously within 1 to 10 blocks, depending on how many conversions are made. The conversions are triggered when there are 10 transactions made, or 10 blocks have passed if it’s not busy. If it’s busy all the time, then conversions are solved every 2 blocks. This leads to a MEV-resistant protocol, but also to fair pricing and enhanced liquidity through offsetting.

Fair Pricing and Enhanced Liquidity

Because all transactions (conversions) are solved simultaneously (1 to 10 blocks), all conversions on the same “trading pair” get the same fair price. Keep in mind that there can be many “trading pairs” that are the same, but when they are in different basket currencies, they are treated as separate.

Important to note is that orders are offset against each other. For example, if one user wants to sell 100 currency X (for Y) and another wants to buy 80 currency X with Y, 80 of that currency is directly matched, leaving only 20 to affect the overall market price. This also means there is reduced price impact, and thus less slippage.

Another benefit is enhanced liquidity. Instead of each conversion drawing from or adding to the liquidity pool sequentially, the net effect is applied, making more efficient use of the available liquidity. The actual liquidity in the pool might not change, but the offsetting effect makes it behave as if there was more liquidity available because offsetting orders are matched directly, reducing the need to draw from or add to the pool.

Then you also have the market arbitrageurs, who not only make a profit for themselves but also stabilize the market overall, giving all participants a fair price. With a very low fee of 0.025% or 0.05% of the conversion (without gas fees), it is a great system for all kinds of market participants.
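For a sense of scale (illustrative amounts): converting 1,000 units of a currency costs 0.5 units at the 0.05% rate, or 0.25 units at 0.025%, before any gas fees.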

Verus DeFi protocol wide volume in July 2024 was close to $14M. Source: on-chain metrics.
Real Fair DeFi

Whatever market participant you are, you are best served with the Verus Protocol. Verus DeFi is fully functional on mainnet for everyone to use.

- No MEV (no front-running, back-running or sandwich attacks)
- Truly decentralized (it’s a credibly neutral protocol — not a business, company or otherwise rent-seeking)
- The same fair price for all market participants within 1–10 block(s)
- Secured at the protocol level
- Low conversion fees of max. 0.05%
- Permissionless launching of liquidity pools
- Bridged to Ethereum (all ERC-20s can be bridged to the Verus Protocol, in a non-custodial and trustless manner, and vice versa)
- Decentralized crowdfund mechanisms for your currency (e.g. token, liquidity pool, native PBaaS coin), such as pre-launch discounts, pre-launch carveouts and pre-allocations

💡 Bridge Your Community to Verus

If you are in a community that has an ERC-20 as a token consider moving over to Verus. Not only do you get fair, MEV-resistant DeFi, you can also tap into the self-sovereign identity protocol (VerusID) and experience unlimited scale (no more high gas fees). You can do it permissionlessly!

Join the Discord; the worldwide community is happy to answer any questions and help you get started.

In the meantime you can try Verus DeFi on your mobile and desktop now. Download Verus Mobile on Google Play and App Store.

🛠️ Build dApps with Verus

Look up the complete command list here. Go to docs.verus.io to get guidance on some API commands (e.g. launching currencies, tokens & liquidity pools).

Join the community. Learn about the protocol. Use Verus & build dApps.

➡️ Join the community on Discord

Follow on Twitter

Go to verus.io

How Verus Solved MEV (Maximal Extractable Value) in DeFi was originally published in Verus Coin on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 31. July 2024

Empiria

Empe Testnet Decentralized in 48h

Empe Testnet has achieved decentralization in just 48 hours, with over 100 validators onboarded. This marks a significant milestone in the development of the Empe Blockchain — the cornerstone of Empeiria’s End-to-End Verifiable Data Infrastructure (EVDI), designed to empower real-world applications with verifiable data.

Backed by over 15,000 hours of R&D and insights from 70+ customer collaborations, EVDI seamlessly bridges on-chain and off-chain worlds, ensuring data privacy, reusability, and compliance while adhering to industry standards like W3C, DIF, and IETF.

With this rapid decentralization, Empeiria has established a secure, transparent, and scalable decentralized network, demonstrating its commitment to delivering a robust and scalable verifiable data solution.

The launch of Phase 2 of Empe Testnet is accompanied by the publication of comprehensive technical documentation, which streamlines the integration process for external validators and ensures a smooth onboarding experience.

Empeiria extends its gratitude to Empe Testnet validators for their critical role in achieving this milestone. Visit the Empe Explorer website to see Empe Testnet in action.

Follow Empeiria on X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io

Tuesday, 30. July 2024

a16z Podcast

When AI Meets Art

On June 27th, the a16z team headed to New York City for the first-ever AI Artist Retreat at their office. This event brought together the builders behind some of the most popular AI creative tools, along with 16 artists, filmmakers, and designers who are exploring the capabilities of AI in their work.

In this episode, we hear from the innovators pushing the boundaries of AI creativity. Joined by Anish Acharya, General Partner, and Justine Moore, Partner on the Consumer team, we feature insights from:

- Ammaar Reshi - Head of Design, ElevenLabs
- Justin Maier - Cofounder & CEO, Civitai
- Maxfield Hulker - Cofounder & COO, Civitai
- Diego Rodriguez - Cofounder & CTO, Krea
- Victor Perez - Cofounder & CEO, Krea
- Mohammad Norouzi - Cofounder & CEO, Ideogram
- Hang Chu - Cofounder & CEO, Viggle
- Conor Durkan - Cofounder, Udio

These leaders highlight the surprising commonalities between founders and artists, and the interdisciplinary nature of their work. The episode covers the origin stories behind these innovative tools, their viral moments, and their future visions. You'll also hear about the exciting potential for AI in various creative modalities, including image, video, music, 3D, and speech.

Keep an eye out for more in our series highlighting the founders building groundbreaking foundation models and AI applications for video, audio, photography, animation, and more.

Learn more and see videos on artists leveraging AI at: 

a16z.com/aiart

 

Find Ammaar on Twitter: https://x.com/ammaar

Learn more about ElevenLabs: https://elevenlabs.io

Find Justin on Twitter: https://x.com/justmaier

Find Max on LinkedIn: https://www.linkedin.com/in/maxfield-hulker-5222aa230/
Learn more about Civitai: https://civitai.com

Find Diego on Twitter: https://x.com/asciidiego?lang=en

Find Victor on Twitter: https://x.com/viccpoes

Learn more about Krea: https://www.krea.ai/home

Find Mohammed on Twitter: https://x.com/mo_norouzi

Learn more about Ideogram: https://ideogram.ai/t/explore

Find Conor on Twitter: https://x.com/conormdurkan

Learn more about Udio: https://www.udio.com/home

Find Hang on Twitter: https://x.com/chuhang1122

Learn more about Viggle: https://viggle.ai/

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

 

 


Sequoia

Partnering with XBOW: The Gold Standard in Offensive Security


Oege and his team are using AI to stay ahead of cyber attackers and deliver security that scales.

By Konstantine Buhler and Lauren Reeder
Published July 30, 2024
XBOW founder Oege de Moor.

Great offensive security requires getting creative—to safeguard against risks, you must imagine what bad actors might do. For decades, human experts have been the gold standard in this critically important work. Penetration testers (or pentesters), bug hunters and security researchers simulate attacks and find the gaps, so developers can close them.

But with the rise of AI, we have seen an exponential increase both in the amount of code that must be protected and in the volume of exploits, which now cost businesses more than three times what they did less than a decade ago. Human experts simply can’t keep up; last year, nearly two-thirds of organizations reported their biggest challenge in maintaining a pentesting program was finding enough skilled personnel to do the work.

Companies must modernize—and thankfully, XBOW founder Oege de Moor and his team are establishing the new state of the art, leveraging large language models to automate pentesting that scales. XBOW solves an impressive 75% of web app security benchmarks, and it does so with zero human intervention. Customers don’t have to choose between keeping their software secure and sacrificing speed and growth.

XBOW is designed to think like a hacker. It simulates real-world attacks and identifies novel exploits other tools might miss. When it spots a vulnerability, it automatically pinpoints the cause, runs tests to demonstrate how the issue could be exploited, and provides actionable guidance on how to fix it.

This is a transformative innovation for offensive security, and as a world expert in code analysis, Oege is uniquely suited to lead the way. A former Oxford professor turned founder, he sold his first company, Semmle, to GitHub and went on to create the company’s Copilot and Advanced Security products. Along the way, he built a reputation as a brilliant but pragmatic and clear-thinking leader—and a talent magnet, as the highly skilled, deeply committed team at XBOW now attests. We at Sequoia were grateful for the chance to lead their seed round and to welcome Oege to Arc, our company-building immersion for pre-seed and seed-stage founders. It has been a privilege to support him and the team as they grow.

Oege takes the mic at the Arc story slam.

No doubt XBOW’s mission is an ambitious one. But Oege and the team understand that the stakes are high not just for businesses but for national security, as well, and they are moving quickly to stay ahead of attackers and develop the best offering in AI offensive security. Their work is helping security professionals and everyone they protect—and setting a new gold standard for the industry in the process.

The post Partnering with XBOW: The Gold Standard in Offensive Security appeared first on Sequoia Capital.


PIVX

PIVX Momentum Continues — Another Summer Listing Joins the Roster.


PIVX is happy to announce that it is now listed on BITStorage Finance.
bitstorage.finance, a “Secure & Efficient Crypto Trading Platform,” offers a PIVX/USDT pair.

PIVX’s Business Development lead Jeffrey has been busy this year gaining new partnerships for PIVX. We are proud of his success and are excited about PIVX growing and getting known in new markets.

Not to brag, but since January 2024 the following new partnerships have occurred:

January:
- ChangeNOW.io ChangeNow integrates $PIVX.
- NOWNodes lists $PIVX.

February:
- Gate.io lists PIVX. PIVX achieved a significant milestone as it received its first-ever fiat trading pair with the Turkish Lira (TRY).
- Coinstore adds PIVX to its discovery page.
- Atani lists PIVX with a USDT pairing.
- CoinRabbit loans adds $PIVX as a collateral.

April:
- BitStack.com integrates $PIVX.
- DefiNation integrates $PIVX into their virtual gift card selection.

May:
- Bitcointry Exchange lists $PIVX.

June:
- BYDFi Crypto Exchange lists $PIVX.
- CoinMarketCap verifies PIVX on CMC.

July:
- MEXC integrates $PIVX with USD pairing.
- Armoursolutionsuk enables PIVX as payment to purchase and host PIVX Masternodes.
- Koinbx, one of India’s largest exchanges, lists $PIVX.
- XeggeX integrates $PIVX with PIVX/USDT and PIVX/BTC pairs.
- Swapter.io adds $PIVX.

With more choices to trade, purchase, sell and use PIVX, it’s clear the ecosystem is continuing to grow and expand, providing increased accessibility and utility for PIVX users.

PIVX. Your Rights. Your Privacy. Your Choice.
To stay on top of PIVX news please visit PIVX.org and Discord.PIVX.org.

PIVX Momentum Continues — Another Summer Listing Joins the Roster. was originally published in PIVX on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 29. July 2024

Zcash Foundation

Results of the ZCAP Dev Fund Runoff poll

The results of the ZCAP Dev Fund runoff poll are in. The community has voted to support the "Hybrid Deferred Dev Fund" proposal, which allocates 20% of the block rewards to a Dev Fund for one year, with 8% earmarked for the Zcash Community Grants program, and 12% to be stored in a protocol lockbox.

On July 21st, we opened a runoff poll of the Zcash Community Advisory Panel (ZCAP), to determine which of two Dev Fund proposals should be implemented when the current Dev Fund ends in November. The poll closed this morning, and the results are as follows:

- Hybrid Deferred Dev Fund: Transitioning to a Non-Direct Funding Model – 91 votes (67%)
- Lockbox For Decentralized Grants Allocation (20% option) – 40 votes (30%)

Four voters (3%) abstained by choosing neither option, and 135 votes were cast in total.

We can therefore conclude that the Hybrid Deferred Dev Fund option enjoys majority support across the Zcash community. This conclusion is reinforced by the results of alternative polls of the ZecHub DAO, Zcash Brazil community, and the Zcash Spanish-speaking community.

This result bodes well for the future of Zcash. It ensures that the Zcash Community Grants program will continue to receive funding over the coming year, supporting its ongoing mission to fund independent teams entering the Zcash ecosystem, and ensuring that the Zcash ecosystem can continue to become more decentralized.

This result is also an important step in the geographic decentralization of the Zcash ecosystem, as it lays the path for the Zcash Community Grants program to migrate from the Zcash Foundation (based in the United States) to the Financial Privacy Foundation (based in the Cayman Islands).

We’d like to thank everyone who participated in this process, including those who drafted proposals; the many community members who took part in the discussions and debate on the Zcash community forum, on other social media, and in community calls; those who conducted alternative community polls; and everyone who voted in those polls.

Onward!

The post Results of the ZCAP Dev Fund Runoff poll appeared first on Zcash Foundation.


Empiria

The Future of Data Privacy

In our hyper-connected world, consumer data is the new gold, highly sought after by companies eager to gather, analyze, and utilize it for various purposes. But with great power comes great responsibility, and companies that fail to safeguard this valuable asset face a double whammy: an exodus of privacy-conscious customers and potentially crippling regulatory fines running into the billions.

Just consider this: according to a recent study by CISCO, a staggering 94% of customers refuse to do business with companies they don’t trust with their information. That’s right, nearly all customers gone, simply because they don’t trust the vendor with their data.

And the pain doesn’t end there, with the consequences of data breaches extending far beyond lost trust. Regulatory bodies are wielding data privacy violations like a financial hammer. Take, for example, the record-breaking €1.2 billion fine imposed on Meta (formerly Facebook) in May 2023 by the Irish Data Protection Commission (DPC) for unlawful data transfers. This wasn’t an anomaly. In fact, the GDPR in Europe alone saw fines surpassing €4 billion in 2023, a significant jump from previous years.

But it’s not all doom and gloom. As consumer attitudes are beginning to shift, companies are starting to see data privacy not as an inconvenient obligation, but as a strategic investment, with massive value waiting to be unlocked. According to the CISCO study, privacy programs have shown an average of 160% ROI, with some respondents declaring even 500% ROI in privacy. This means that by prioritizing data privacy, companies can not only avoid the hefty fines and reputational damage associated with breaches but also unlock significant financial benefits.

This brings us to Empeiria and its groundbreaking privacy-preserving End-to-End Verifiable Data Infrastructure (EVDI). By leveraging advanced Web3 technology, Empeiria offers innovative solutions that simplify the deployment of decentralized data ecosystems, ensuring robust privacy and compliance while driving efficiency and trust. Read on to learn how Empeiria is revolutionizing data privacy and benefiting users and companies alike.

Data Privacy Today: A Breach of Trust

We’ve established that while private data is a highly valuable asset, companies either fail to handle it with the care and responsibility it requires, or, worse yet, actively misuse it for profit.

Let’s delve deeper into the shortcomings of the current data privacy systems that allow this to happen:

- Centralized Storage: Imagine a single, monolithic vault overflowing with every detail of your online activity — purchases, searches, clicks. This vulnerability-laden approach is how many companies store data. These centralized repositories become prime targets for hackers. A single breach can expose millions of users, leading to financial ruin and a collapse of trust.
- Transparency Deficit: Companies often operate with a shroud of secrecy regarding data practices. Data privacy is often little more than a footnote in user agreements, and vague terms and conditions offer little insight into how our information is collected, stored, and used. This lack of transparency fosters a deep sense of distrust.
- Complex Regulations: The global data privacy landscape is a complex and fragmented maze. A patchwork of regulations across different jurisdictions creates a compliance nightmare for businesses. They’re forced to navigate a minefield, risking either overreaction with stifling regulations that hinder innovation, or under-reaction that exposes them to hefty fines and reputational damage.
- Inadequate Encryption: Technology advances at an unrelenting pace, constantly introducing new security threats and vulnerabilities. Keeping systems secure and data properly encrypted is a never-ending battle. Emerging technologies like AI and the Internet of Things raise additional privacy concerns, as they often involve vast amounts of highly sensitive personal data. Companies struggle to keep pace with securing these new frontiers.
- Resource Strain: Building solid data privacy walls requires significant resources. Regular monitoring, audits, and security updates are essential, but they can strain the budgets and manpower of companies. This resource disparity leaves them vulnerable to breaches and non-compliance issues, further eroding trust within the digital ecosystem.

These are just some cracks in the foundation of our current data privacy system. This lack of robust safeguards incentivizes some companies to prioritize short-term profits over long-term trust.

Having established the reasons behind the current data privacy failures, it’s time to pose the most pressing question: what can we do about it? The answer lies in leveraging advanced and innovative technology to rebuild the foundation of data privacy. This is where Empeiria’s End-to-End Verifiable Data Infrastructure (EVDI) comes into play.

EVDI: A Technological Solution to Data Privacy

EVDI represents a paradigm shift in our approach to data privacy, offering a comprehensive solution that addresses the core issues plaguing the current system. By harnessing the power of blockchain technology and Web3 innovations, Empeiria has developed a robust framework designed to enhance data privacy, transparency, and user control.

Data privacy is built into the very fabric of EVDI, ensuring that data protection is not an afterthought but a fundamental feature. Our solution is compliant with data privacy regulations and provides users with privacy-enhancing features that add a new layer of trust to digital interactions:

- Data Ownership: EVDI builds on and expands the principles of Self-Sovereign Identity (SSI), a data framework designed to prioritize data ownership and privacy. SSI uses technologies like blockchain and Verifiable Credentials (VCs) to allow users to privately store their information in verifiable data wallets and prove their identity without revealing unnecessary personal information. Only users have access to and control over the data they hold in their wallets.
- Independent Data Verification: EVDI allows verifiers to check data without informing the issuer of what is being verified. Verifiers can confirm the accuracy, integrity, and validity of credentials stored in users’ data wallets without disclosing the specifics to the issuer. This ensures robust privacy while maintaining full verification capabilities.
- Consent & Selective Disclosure: Consent is a cornerstone of Empeiria’s approach. EVDI empowers users to give explicit consent for data sharing and usage, ensuring transparency and trust in every interaction. Furthermore, EVDI enables selective disclosure, allowing users to share only the necessary aspects of their data while keeping other information private. This means that users can prove their identity or other credentials without exposing unnecessary personal information, for example proving they are over 18 without revealing their exact date of birth.
- Zero-Knowledge Proofs (ZKP): Empeiria employs ZKPs, a cryptographic method that allows one party to prove to another that a statement is true without revealing any additional information. This technique ensures that sensitive information remains private while still verifying its authenticity.
- Inherent Compliance: By placing data ownership in users’ hands, EVDI meets and exceeds global data privacy regulations. This means that companies using our infrastructure can be confident that they are compliant with laws such as GDPR and CCPA. This built-in compliance simplifies the regulatory landscape for businesses, reducing the risk of fines and enhancing their reputation.

Examples of EVDI Implementation

Healthcare Sector: Secure Patient Data Management

Imagine a major hospital struggling to keep patient records safe and secure. This puts them at risk of fines and, more importantly, breaks patients’ trust.

Empeiria offers a new way to handle healthcare data. Instead of a centralized database, patients store their health records privately, in their verifiable data wallets. This keeps patient information secure and makes it easier for the hospital to follow strict privacy rules, like HIPAA in the US.

Doctors can still request the information they need to treat patients, who have more control over their data and feel confident knowing their privacy is protected. This leads to fewer data breaches, better, personalized treatment, happier patients, and a more trusted healthcare system for everyone.

Financial Services: Enhanced Security and Compliance

Financial institutions today face a significant challenge: safeguarding vast amounts of sensitive customer data across multiple jurisdictions, all while adhering to increasingly complex privacy regulations like GDPR and CCPA.

Empeiria’s EVDI offers a solution to this challenge, by empowering customers to store their private data in verifiable data wallets. These wallets act as personal vaults for your financial information, providing enhanced privacy and granular control.

Through zero-knowledge proofs, banks can verify the existence of specific financial information within those wallets, without ever actually seeing the details themselves and only with user consent. This innovative approach fosters trust by ensuring the highest levels of data privacy.

Ultimately, by prioritizing customer privacy, financial institutions can build stronger client relationships and unlock new business opportunities.

E-commerce Platform: Customer Trust and Data Transparency

Imagine an e-commerce platform facing a crisis of confidence. Customers are wary of how their data is stored and used, leading to regulatory scrutiny. Empeiria’s EVDI offers a transformative solution that prioritizes user data privacy.

EVDI empowers customers to take control of their personal information through secure, verifiable data wallets. The platform benefits from increased transparency in data handling practices, with EVDI’s built-in compliance with data privacy regulations ensuring peace of mind.

This proactive approach attracts privacy-conscious customers seeking a secure and transparent shopping experience, ultimately fostering trust, satisfaction, and reduced regulatory risk.

These are just three examples of numerous potential use cases for EVDI. As privacy becomes paramount in an increasing number of our digital interactions, the potential for more practical use cases will only grow.

Conclusion

Empeiria’s easily deployable, cost-effective End-to-End Verifiable Data Infrastructure paves the way for a future where data privacy is no longer an afterthought but a foundational element of digital interactions. This user-centric approach benefits everyone. Users gain complete control over their digital identities, while companies build trust and streamline compliance. It’s a win-win situation.

Looking ahead, EVDI has the potential to create a more resilient digital ecosystem, where trust is earned through technological innovation, not empty promises. By championing data ownership, selective disclosure, and robust security measures, EVDI paves the way for a future of empowered online interactions, with data privacy never compromised.

Follow Empeiria on Twitter/X, or LinkedIn for the latest news & updates. For inquiries or further information, contact Empeiria at media@empe.io

Sunday, 28. July 2024

RadicalxChange(s)

Frank McCourt: Founder of Project Liberty (Part II)

In this episode, Project Liberty Founder Frank McCourt joins Matt for a second round to discuss the challenges and opportunities presented by rapidly developing AI technologies. Building on their previous chat about digital infrastructure, they explore whether AI will exacerbate social media, digital advertising, and data centralization issues, or fundamentally change them. McCourt emphasizes fixing the internet’s design flaws to ensure AI benefits society, advocates for returning data ownership to individuals and stresses the need for political engagement to align AI with democratic values. Tune in for this enlightening conversation and what we can do moving forward.

Links & References: 

References:

- RadicalxChange(s) | Frank McCourt: Founder of Project Liberty (Part I) on Reclaiming the Internet
- Khmer Empire | Wikipedia
- The Decline of the Khmer Empire | National Library of Australia
- Restrictions on TikTok in the United States | Wikipedia
- TikTok sues to block prospective US app ban | CNN Business
- How Silicon Valley gamed the world's toughest privacy rules - POLITICO
- European Union fines Meta $1.3 billion for violating privacy laws : NPR
- The Dangers of the Global Spread of China’s Digital Authoritarianism | Center for a New American Security
- China’s Techno-Authoritarianism Has Gone Global | Human Rights Watch
- China trying to develop world ‘built on censorship and surveillance’ | Privacy News | Al Jazeera
- Project Liberty
- People’s Bid For TikTok - Project Liberty

Bios:

Frank H. McCourt, Jr. is a civic entrepreneur and the executive chairman and former CEO of McCourt Global, a private family company committed to building a better future through its work across the real estate, sports, technology, media, and capital investment industries, as well as its significant philanthropic activities. Frank is proud to extend his family’s 130-year legacy of merging community and social impact with financial results, an approach that started when the original McCourt Company was launched in Boston in 1893.

He is a passionate supporter of multiple academic, civic, and cultural institutions and initiatives. He is the founder and executive chairman of Project Liberty, a far-reaching, $500 million initiative to transform the internet through a new, equitable technology infrastructure and rebuild social media in a way that enables users to own and control their personal data. The project includes the development of a groundbreaking, open-source internet protocol called the Decentralized Social Networking Protocol (DSNP), which will be owned by the public to serve as a new web infrastructure. It also includes the creation of Project Liberty’s Institute (formerly The McCourt Institute), launched with founding partners Georgetown University in Washington, D.C., Stanford University in Palo Alto, CA, and Sciences Po in Paris, to advance research, bring together technologists and social scientists, and develop a governance model for the internet’s next era.

Frank has served on Georgetown University’s Board of Directors for many years and, in 2013, made a $100 million founding investment to create Georgetown University’s McCourt School of Public Policy. He expanded on this in 2021 with a $100 million investment to catalyze an inclusive pipeline of public policy leaders and put the school on a path to becoming tuition-free.

In 2024, Frank released his first book, OUR BIGGEST FIGHT: Reclaiming Liberty, Humanity, and Dignity in the Digital Age.

Frank’s Social Links:

- Project Liberty
- Project Liberty (@pro_jectliberty) / X
- Project Liberty (@pro_jectliberty) • Instagram
- McCourt Institute (@McCourtInst) / X

Matt Prewitt (he/him) is a lawyer, technologist, and writer. He is the President of the RadicalxChange Foundation.

Matt’s Social Links:

ᴍᴀᴛᴛ ᴘʀᴇᴡɪᴛᴛ (@m_t_prewitt) / X

Connect with RadicalxChange Foundation:

- RadicalxChange Website
- @RadxChange | Twitter
- RxC | YouTube
- RxC | Instagram
- RxC | LinkedIn
- Join the conversation on Discord.

Credits:

Produced by G. Angela Corpus. Co-Produced, Edited, Narrated, and Audio Engineered by Aaron Benavides. Executive Produced by G. Angela Corpus and Matt Prewitt. Intro/Outro music by MagnusMoone, “Wind in the Willows,” is licensed under an Attribution-NonCommercial-ShareAlike 3.0 International License (CC BY-NC-SA 3.0)

Friday, 26. July 2024

Epicenter Podcast

Mel Project: Is Web3 Truly Decentralised? - Eric Tung

Web3’s decentralisation is currently limited to smart contracts as they can be verified on-chain. However, until scalability and UX become on par with that of Web2, the only realistic way for crypto to reach mass adoption is through off-chain dApps. This creates the premise for a security bottleneck in the form of centralised APIs used for on-chain querying. Mel Project aims to expand on-chain security and decentralisation (consensus) to off-chain apps, basically achieving off-chain composability through trustless light clients. Earendil, the backbone of their ecosystem, designed to resist ISP-level censorship, is a decentralized anonymous communication and payment network that enables autonomous applications and true P2P protocols.

Topics covered in this episode:

- Eric’s background and founding Mel Project
- Why Ethereum came short
- Is Infura a security bottleneck?
- Liberating markets (and the Internet)
- Light clients and how Mel Project tackles them
- Use cases
- Smart contracts on Mel Project
- Scaling the ‘world computer’
- Mel Project’s consensus: Streamlet
- Why Mel Project chose an UTXO model
- Earendil & ISP-level censorship resistance
- Roadmap
- The Geph VPN
- Mel Project’s low volatility (stable) coin
- $SYM: PoS token
- Misc.

Episode links:

- Eric Tung on Twitter
- Mel Project on Twitter
- Geph VPN on Twitter
- Moxie Marlinspike's 'My first impressions on web3' article
- Streamlet BFT consensus model

Sponsors:

- Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io
- Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Brian Fabian Crain.

Thursday, 25. July 2024

Circle Press

Circle Announces Unlocking Impact Pitch Competition

Upcoming competitions in DC and New York present projects and startups a chance to win $100K USDC



a16z Podcast

Founders Playbook: Lessons from Riot, Discord, & More

Gaming is not just entertainment—it's a revolution reshaping our culture, technology, and economy.  a16z’s Jonathan Lai and Andrew Chen dive into the current gaming renaissance and its future impact. Joining them are Michael Chow, CEO and Steven Snow, CPO of The Believer Company, and Eros Resmini, Founder and Managing Partner of The Mini Fund. They explore the intersection of tech, art, psy

Gaming is not just entertainment—it's a revolution reshaping our culture, technology, and economy. 

a16z’s Jonathan Lai and Andrew Chen dive into the current gaming renaissance and its future impact. Joining them are Michael Chow, CEO and Steven Snow, CPO of The Believer Company, and Eros Resmini, Founder and Managing Partner of The Mini Fund.

They explore the intersection of tech, art, psychology, and design in gaming, discussing how startups can navigate intense competition, distribution challenges, and high production costs. With insights from these industry leaders, this episode covers the transformative potential of AI, the importance of player feedback, and strategies to stand out in a crowded market.

Recorded during Speedrun, a16z’s extensive games accelerator, this episode offers a glimpse into the strategies and innovations driving the gaming industry forward.

 

Resources: 

Find Steven on Twitter: https://twitter.com/StevenSnow

Find Michael on LinkedIn: https://www.linkedin.com/in/believer-paladin/

Find Eros on Twitter: https://twitter.com/erosresmini

Find Jonathan on Twitter: https://twitter.com/Tocelot

Find Andrew on Twitter: https://twitter.com/andrewchen

Learn more about Speedrun: https://a16z.com/games/speedrun/

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Empiria

Empe Testnet Validators Guide

Empeiria is moving into the next, open phase of its Empe Testnet program. Do you want to join as a node validator and help us build a robust and secure network? You’ve come to the right place! This guide will equip you with the knowledge and steps needed to set up your validator on the Empe Testnet. Before you jump in, here are some of the participation benefits: Network Security & Stab

Empeiria is moving into the next, open phase of its Empe Testnet program. Do you want to join as a node validator and help us build a robust and secure network? You’ve come to the right place!

This guide will equip you with the knowledge and steps needed to set up your validator on the Empe Testnet. Before you jump in, here are some of the participation benefits:

Network Security & Stability: Your validator node will contribute to the overall health and security of the Empe network. Hands-on Experience: You will gain valuable practical experience with our End-to-End Verifiable Data Infrastructure (EVDI).

In this guide, you will find all the essential information necessary to become an Empe Testnet validator, including hardware requirements and detailed installation and configuration instructions.

For any additional questions or suggestions, please contact us at validators@empe.io.

You can find further information in the Empeiria Technical Documentation.

Ready to dive in? Let’s get started!

Participation Who can participate in the testnet as a validator?

To participate, read the Empeiria Technical Documentation and follow the node setup instructions.

Follow Empeiria on X and join the Empe Testnet Validators Telegram channel to stay updated on the latest developments, including the upcoming launch of Empe Mainnet.

What are the hardware and software requirements for running a validator node?

These are the recommended hardware configurations that can be used to create a new Empe Testnet validator machine. Please note: the more effort you put into creating a stable and robust machine, the lower the chances of getting slashed due to downtime.

Validator Operations How do I configure my node to connect to the testnet?

Please see the validators instructions below for complete configuration instructions.

Troubleshooting & Support Where can I find support if I encounter issues with my validator node?

For support, join the Empe Testnet Validators Telegram channel or contact the Empeiria tech team at validators@empe.io

How do I report a bug or security vulnerability?

Please report bugs or security vulnerabilities via email at validators@empe.io

What should I do if I suspect my node has been compromised?

If you suspect that your node has been compromised, please turn it off and report it to us via email at validators@empe.io

Community & Communication How can I stay updated with the latest news and updates about the testnet?

Follow Empeiria on X and join the Empe Testnet Validators Telegram channel for the latest announcements, including updates on the upcoming transition to the next open phase.

Validators Instructions Hardware requirements

Below, you will find the recommended hardware configurations that can be used to create a new Empe Testnet validator machine. Please note: the more effort you put into creating a stable and robust machine, the lower the chances of getting slashed due to downtime.

Operating System: Ubuntu 18.04 LTS or later
Number of CPUs: 6
RAM: 32 GB
SSD: 240 GB

In addition, make sure that the following requirements are met:

Allow incoming connections on port 26656
Have a static IP address
Have access to the root user

Required software installation

Install the required software:

sudo apt update && sudo apt upgrade -y
sudo apt install -y git gcc make unzip jq
# The Go toolchain is also needed for 'make install' and the cosmovisor setup below; the version here is only an example
curl -LO https://go.dev/dl/go1.21.6.linux-amd64.tar.gz && sudo tar -C /usr/local -xzf go1.21.6.linux-amd64.tar.gz

Add Go to the path:

echo 'export PATH=$PATH:/usr/local/go/bin' >> ~/.profile
echo 'export PATH=$PATH:/home/$USER/go/bin' >> ~/.profile

source ~/.profile

Install binary from source code (option A)

As our chain is not yet publicly open-sourced, please contact us (at validators@empe.io) if you want to get access to the repository.

Clone repository and checkout to proper tag:

git clone --depth 1 --branch release/v0.1.0 https://github.com/empe-io/empe-chain

Go to dir and build project:

cd empe-chain/
make install

Check that the binary version matches:

emped version

output: v0.1.0

Install prebuilt binary (option B)

The emped binary serves as the node client and the application client. In other words, the emped binary can be used to both run a node and interact with it.

For Linux Distributions

Download the tar.gz file:

curl -LO https://github.com/empe-io/empe-chain-releases/raw/master/v0.1.0/emped_linux_amd64.tar.gz

Verify the checksum:

sha256sum emped_linux_amd64.tar.gz

You should see the following:

5353de7004bbacc516d6fc89d7bbcbde251fbba8c4bdccb2a58f8376e47ab753 emped_linux_amd64.tar.gz

Unpack the tar.gz file:

tar -xvf emped_linux_amd64.tar.gz

Move the binary to your local bin directory and make it executable:

mkdir -p ~/go/bin
sudo mv emped ~/go/bin
chmod u+x ~/go/bin/emped

Open a new terminal window and check if the installation was successful:

emped version

You should see the following:

v0.1.0

Configure a node

Select a chain:

export CHAINID=empe-testnet-2
export MONIKER=<your moniker>

Init chain and delete generated genesis:

emped init $MONIKER --chain-id $CHAINID
rm -rf ~/.empe-chain/config/genesis.json

Clone repository with chains:

git clone https://github.com/empe-io/empe-chains.git
cd empe-chains/testnet-2/

Copy genesis file from repo:

cp genesis.json ~/.empe-chain/config/

Change the persistent peers inside config.toml file:

sed -e "s|persistent_peers = \".*\"|persistent_peers = \"$(cat .data | grep -oP 'Persistent peers\s+\K\S+')\"|g" ~/.empe-chain/config/config.toml > ~/.empe-chain/config/config.toml.tmp
mv ~/.empe-chain/config/config.toml.tmp ~/.empe-chain/config/config.toml

Set minimum gas price in app.toml file:

sed -e "s|minimum-gas-prices = \".*\"|minimum-gas-prices = \"$(cat .data | grep -oP 'Minimum Gas Price\s+\K\S+')\"|g" ~/.empe-chain/config/app.toml > ~/.empe-chain/config/app.toml.tmp
mv ~/.empe-chain/config/app.toml.tmp ~/.empe-chain/config/app.toml

Change external_address value to contact your node using the public IP of your node:

PUB_IP=`curl -s -4 icanhazip.com`
sed -e "s|external_address = \".*\"|external_address = \"$PUB_IP:26656\"|g" ~/.empe-chain/config/config.toml > ~/.empe-chain/config/config.toml.tmp
mv ~/.empe-chain/config/config.toml.tmp ~/.empe-chain/config/config.toml

Cosmovisor setup

Install cosmovisor

Run go install to download cosmovisor:

go install cosmossdk.io/tools/cosmovisor/cmd/cosmovisor@latest

Create dir structure for cosmovisor:

export DAEMON_NAME=emped
export DAEMON_HOME=$HOME/.empe-chain/
mkdir -p $DAEMON_HOME/cosmovisor/genesis/bin
mkdir -p $DAEMON_HOME/cosmovisor/upgrades

Copy emped binary to cosmovisor genesis bin:

cp ~/go/bin/emped $DAEMON_HOME/cosmovisor/genesis/bin
$DAEMON_HOME/cosmovisor/genesis/bin/emped version

Setup systemd:

sudo tee /etc/systemd/system/cosmovisor.service> /dev/null <<EOF
[Unit]
Description=cosmovisor
After=network-online.target

[Service]
User=$USER
ExecStart=/home/$USER/go/bin/cosmovisor start
Restart=always
RestartSec=3
LimitNOFILE=4096
Environment="DAEMON_NAME=emped"
Environment="DAEMON_HOME=/home/$USER/.empe-chain"
Environment="DAEMON_ALLOW_DOWNLOAD_BINARIES=false"
Environment="DAEMON_RESTART_AFTER_UPGRADE=true"
[Install]
WantedBy=multi-user.target

EOF

Run a node:

sudo systemctl enable cosmovisor
sudo systemctl start cosmovisor

Check status:

sudo systemctl status cosmovisor

Logs from cosmovisor:

sudo journalctl -f -u cosmovisor

Sync with state-sync

State-sync is a module built into the Cosmos SDK that allows validators to rapidly join the network by syncing their node from a snapshot-enabled RPC at a trusted block height.

This greatly reduces the time required for a validator or sentry to sync with the network from days to minutes. The limitation is that there is no full transaction history, just the most recent state that the state-sync RPC has stored. An advantage of state-sync is that the database is very small in comparison to a fully synced node, so using state-sync to resync your node to the network can help keep running costs lower by minimizing storage usage.

By syncing to the network with state-sync, a node can avoid having to go through all the upgrade procedures and can sync with the most recent binary only.

For nodes that are intended to serve data for dapps, explorers, or any other RPC requiring full history, state-syncing to the network would not be appropriate.

Testnet state-sync

Snapshots are served by rpc1 and rpc2.

WARNING: This documentation assumes you have followed all previous instructions.

The state-sync configuration (in app.toml) is as follows (no need to update it):

# snapshot-interval specifies the block interval at which local state sync snapshots are
# taken (0 to disable). Must be a multiple of pruning-keep-every.
snapshot-interval = 1000

# snapshot-keep-recent specifies the number of recent snapshots to keep and serve (0 to keep all).
snapshot-keep-recent = 10

Set the SNAP_RPC1 and SNAP_RPC2 variables:

SNAP_RPC1="https://rpc-archive-testnet.empe.io:443"
SNAP_RPC2="https://rpc-archive-testnet.empe.io:443"

Fetch the LATEST_HEIGHT from the snapshot RPC, set the state-sync BLOCK_HEIGHT and fetch the TRUST_HASH from the snapshot RPC. The BLOCK_HEIGHT to sync is determined by subtracting the snapshot-interval from the LATEST_HEIGHT.

LATEST_HEIGHT=$(curl -s https://rpc-archive-testnet.empe.io:443/block | jq -r .result.block.header.height); \
BLOCK_HEIGHT=$((LATEST_HEIGHT - 1000)); \
TRUST_HASH=$(curl -s "https://rpc-archive-testnet.empe.io:443/block?height=$BLOCK_HEIGHT" | jq -r .result.block_id.hash)

Check variables to ensure they have been set:

echo $LATEST_HEIGHT $BLOCK_HEIGHT $TRUST_HASH

# output should be something similar to:
# 29604 28604 2BB3A74046C625CB67D477550D99F2439D48191FD0E840FA42A324B0629A612A

Stop cosmovisor service:

sudo systemctl stop cosmovisor

Set the required variables in ~/.empe-chain/config/config.toml

sed -i.bak -E "s|^(enable[[:space:]]+=[[:space:]]+).*$|\1true| ; \
s|^(rpc_servers[[:space:]]+=[[:space:]]+).*$|\1\"$SNAP_RPC1,$SNAP_RPC2\"| ; \
s|^(trust_height[[:space:]]+=[[:space:]]+).*$|\1$BLOCK_HEIGHT| ; \
s|^(trust_hash[[:space:]]+=[[:space:]]+).*$|\1\"$TRUST_HASH\"|" $HOME/.empe-chain/config/config.toml

Reset the node database

WARNING: This will erase your node database. If you are already running a validator, be sure you have backed up your `config/priv_validator_key.json` and `config/node_key.json` before running `unsafe-reset-all`.

It is recommended to copy data/priv_validator_state.json to a backup and restore it after unsafe-reset-all to avoid potential double signing.

emped tendermint unsafe-reset-all --home $HOME/.empe-chain

Restart the node and check the logs:

sudo systemctl restart cosmovisor && journalctl -u cosmovisor -f

Full state sync from archive snapshot

If a node needs full state history but wants to synchronize faster, it can start the chain from a history snapshot (updated daily).

Find the most recent state snapshot on the list at https://archive-testnet.empe.io/ (e.g. https://archive-testnet.empe.io/empe-chain-1_2024-06-06.tar)


# Set ARCHIVE to the snapshot filename chosen above, e.g.:
ARCHIVE=empe-chain-1_2024-06-06.tar
curl -O https://archive-testnet.empe.io/$ARCHIVE
tar -xvf $ARCHIVE -C ~/.empe-chain/data
rm $ARCHIVE

Start chain using cosmovisor

Run a Validator

Becoming a validator

Once you have properly set up a full node, you can become a validator node and start earning rewards by validating transactions on the chain.

Requirements

If you want to become an Empe validator you need to:

1. Run a full node with Cosmovisor. If you have not set one up yet, please follow the full node configuration guide and Cosmovisor setup above.
2. Make sure the node is synchronized: emped status | jq .SyncInfo.catching_up

The command above should return:

false

3. Own enough tokens. To become a validator you need at least 2 EMPE tokens to create the validator and cover transaction fees. You can obtain your tokens from the faucet: https://testnet.ping.pub/empe/faucet

Add wallet key

On the testnet you can use a Ledger device, or the software wallet built into emped. If you wish to use Ledger, please add the --ledger flag to every command.

Please remember to store the mnemonic seed phrase in a secure place. If you lose it, you lose all your tokens and access to your validator.

Create the first wallet with the following command:

emped keys add <KEY_NAME>
# Enter a password that you can remember

The output of the command will provide the 24 words that are the mnemonic.

Create two wallets, one for the validator and the second for the vesting account. For example:

If you are using the Ledger device you must first connect it to your computer, start the Cosmos application (on the device), and run the command

emped keys add <KEY_NAME> --ledger
# Enter a password that you can remember

In this case, the mnemonic words are not shown because they were already configured during the Ledger initialization.

emped keys add validator
Enter keyring passphrase:
Re-enter keyring passphrase:

- name: validator
type: local
address: empe1atqq8lmeptgn2jlx2q8r42p572yhh6lzle7vng
pubkey: '{"@type":"/cosmos.crypto.secp256k1.PubKey","key":"A8D47crCW+YkFGduj6brpuzectp3D61xRIx/qbEGGTif"}'
mnemonic: ""

emped keys add vesting
Enter keyring passphrase:

If you don’t have tokens, get some from the faucet at https://testnet.ping.pub/empe/faucet or contact us and send your account address.

What is a Validator?

Validators are responsible for committing new blocks to the blockchain through voting. A validator’s stake is slashed if they become unavailable or sign two blocks at the same height. Please read about Sentry Node Architecture to protect your node from DDoS attacks and to ensure high availability.

Create Your Validator

Your empevalconspub consensus public key from tendermint can be used to create a new validator by staking tokens. You can find your validator pubkey by running:

emped tendermint show-validator

To create your validator use the following command:

emped tx staking create-validator \
--amount=10000000uempe \
--pubkey=$(emped tendermint show-validator) \
--moniker=<YOUR_MONIKER> \
--chain-id=empe-testnet-2 \
--commission-rate="0.10" \
--commission-max-rate="0.20" \
--commission-max-change-rate="0.01" \
--gas="auto" \
--min-self-delegation="1000000" \
--fees=20uempe \
--from=<KEY_NAME>
When specifying commission parameters, the commission-max-change-rate is used to measure % point change over the commission-rate. E.g. 1% to 2% is a 100% rate increase, but only 1 percentage point.

You can confirm that you are in the validator set by using an explorer.

Confirm Your Validator is Running

Your validator is active if the following command returns anything:

emped query tendermint-validator-set | grep "$(emped tendermint show-validator | jq .key | tr -d \")"

You should now see your validator in one of the block explorers. You are looking for the bech32 encoded address in the ~/.empe-chain/config/priv_validator_key.json file.

To be in the validator set, you need to have more total voting power than the 50th validator.

For support, join the Empe Testnet Validators Telegram channel or contact the Empeiria tech team at validators@empe.io. Follow Empeiria on X for the latest announcements.


Brave Browser

MELTing Point: Mobile Evaluation of Language Transformers

We explore the feasibility of deploying LLMs on device, a model in which user prompts and LLM outputs never leave the device premises.

This post describes work done by Stefanos Laskaridis, Kleomenis Katevas, Lorenzo Minto and Hamed Haddadi. This post was written by Machine Learning Researcher Stefanos Laskaridis.

TL;DR: As we are entering the new era of hyper-scale models, it is indispensable to maintain the ability to host AI locally, for maintaining privacy and sustainability. This is the first study that measures the deployability of Large Language Models (LLMs) at the consumer edge, exploring the potential of running differently-sized models on smartphones and edge devices 1 instead of the cloud.

Introduction

Large Language Models, such as Llama-3, Mixtral, or ChatGPT, have recently revolutionized the machine learning landscape, enabling use-cases that were previously unfathomable, including intelligent assistants (including our own Brave Leo), creative writing, as well as agent-based automations (Brave Search Integration in Leo, for instance). At the same time, devices in our pockets have been getting increasingly capable 2, integrating evermore powerful System-on-Chips (SoCs). Based on this trend, and maintaining our true commitment to preserving users’ privacy, we explore the feasibility of deploying LLMs on device, a model in which user prompts and LLM outputs never leave the device premises.

To do this, the Brave Research team has built their very own LLM benchmarking infrastructure, named BLaDE, for measuring the latency, accuracy, memory, and energy impact of running LLMs on-device. At the same time, acknowledging that these models are oftentimes too large, we leverage edge devices to accelerate execution locally, which can be co-located with smart devices at the consumer side 3. This can also be deployed with our most recent BYOM (Bring Your Own Model) self-hosting option.

Our experience has shown that while the GenAI ecosystem is growing increasingly large, local on-device deployment is still in its infancy and remains very heterogeneous across devices. Deploying LLMs on device is possible, but with a noticeable impact on latency, comfort and accuracy, especially on mid-tier devices. However, hardware and algorithmic breakthroughs can significantly change the cost of execution and user Quality of Experience (QoE). At the same time, SLMs (Small Language Models) 4 are gradually making their appearance, tailored for specific downstream tasks.

Brave Research Device Lab

BLaDE (BatteryLab Device Evaluation) is a state-of-the-art benchmarking infrastructure that is capable of automating the interaction with mobile devices for performance and energy measurements. It can be used for neural or more generic browser tasks. MELT is the component responsible for the benchmarking of neural workloads on various devices.

MELT adopts a server-client architecture, with the central coordinating process being responsible for the following:

Organizing the execution of the benchmarking suite;

Scheduling and dispatching jobs to connected devices;

Controlling the downstream interaction with the application;

Monitoring their runtime, temperature and energy consumption;

Tracing the events of interest in the downstream task and capturing the associated device behavior.

To this end, it integrates the following components:

a Raspberry Pi 4 8GB, which adopts the role of the coordinator;

a Mac Mini, for building packages;

a Monsoon power monitor connected over a Raspberry Pi GPIO addressable relay to control power to individual mobile devices;

a programmable YKUSH USB Switchable Hub for communicating and selectively disabling USB power to devices;

a Flir One edge wireless camera along with a custom-built IR thermometer (based on the MLX90614) for monitoring the temperature of the device; and last,

a set of mobile devices, shown on Table 1, which have undergone a battery bypass procedure.

In parallel, the coordinator is connected over Ethernet to the same network as our Nvidia Jetson boards, with SSH access to them. The Jetsons provide power and temperature metrics through available SysFS probes.

MELT Workflow

The measurement workflow we follow is depicted in Figure 1b. MELT’s infrastructure consists of the following components:

Model Zoo, responsible for the download and compilation/quantization of models. We used the models of Table 2.

Evaluator, responsible for the evaluation of the accuracy degradation of models due to their conversion/quantization. We used the datasets of Table 4.

Builder, responsible for the compilation of the respective benchmarking suite backend, shown in Table 3, namely llama.cpp and MLC-LLM.

Runner, responsible for the deployment, automation and runtime of the LLM on the respective device. Integrated devices are shown on Table 1.

Monitor, responsible for the fine-grained monitoring of resource and energy consumption during execution.

Result Highlights

Below, we provide the most interesting results of our analysis and their consequences for on-device deployment and future product research.

Throughput and Energy per Device

Figure 3 illustrates the prefill and generation throughput of various models on different devices when used in a conversational setting. Prefill refers to the preparation and processing of input tokens before actual generation begins (e.g., tokenization, embedding, KV cache population), while generation refers to the autoregressive generation (i.e., decoding) of output tokens. Throughput expresses the rate of token ingestion/production, measured in tokens/sec. 

Figure 4 depicts the discharge rate per token generated, for different model, device and framework combinations. This is expressed in mAh/token.
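To make these two metrics concrete, here is a rough sketch in Python of how throughput (tokens/sec) and discharge per token (mAh/token) can be derived from raw per-run measurements. The numbers, field names and battery capacity below are hypothetical, not MELT's actual output schema.

# Illustrative only: how the tokens/sec and mAh/token metrics can be computed.
# All numbers and field names below are hypothetical, not MELT's real schema.
from dataclasses import dataclass

@dataclass
class InferenceRun:
    prompt_tokens: int         # tokens ingested during prefill
    generated_tokens: int      # tokens produced during generation (decoding)
    prefill_seconds: float     # wall-clock time of the prefill phase
    generation_seconds: float  # wall-clock time of the generation phase
    discharge_mah: float       # battery charge drawn over the whole run

def metrics(run: InferenceRun) -> dict:
    return {
        "prefill_tok_per_s": run.prompt_tokens / run.prefill_seconds,
        "generation_tok_per_s": run.generated_tokens / run.generation_seconds,
        "mah_per_token": run.discharge_mah / run.generated_tokens,
    }

run = InferenceRun(prompt_tokens=128, generated_tokens=256,
                   prefill_seconds=0.9, generation_seconds=22.0,
                   discharge_mah=9.5)
print(metrics(run))
# A per-conversation discharge figure also gives a crude "prompts per charge"
# estimate: battery_capacity_mAh / discharge_mAh_per_prompt.
print(4000 / run.discharge_mah)  # hypothetical 4,000 mAh battery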

Insights: We witnessed quite a heterogeneous landscape in terms of device performance across models. Prefill throughput, which is typically compute bound, is much higher than generation throughput, which is typically memory bound. MLC-LLM generally offered higher performance compared to llama.cpp, but at the cost of model portability. Surprisingly, 4-bit models ran faster than their 3-bit counterparts, but at the expense of higher memory consumption, which caused certain models to run out of memory during runtime. Last, the Metal-accelerated iPhones showed higher throughput rates than the OpenCL-accelerated Android phones in the case of MLC-LLM.

Energy-wise, larger networks exhibit higher discharge rates, as traffic between on-chip and off-chip memory consumes a significant amount of energy. Indicatively, deploying Zephyr-3B (4-bit quantized) on the S23 with MLC-LLM, the iPhone Pro with MLC-LLM and the iPhone 14 Pro with llama.cpp would allow 542.78, 490.05 and 590.93 prompts on average before the battery was depleted. Last, CPU execution offered lower energy efficiency, attributed to the latency of running inference compared to accelerated execution.

Quality Of Experience

In real-world settings, running a large model on device can adversely affect the user experience and render the device unstable or unusable. There are largely three dimensions to consider: 

Device responsiveness refers to the general stability and reliability of the device during LLM inference. Factors that affected device responsiveness included long model loading times, out-of-memory errors that killed the application, and device restarts, effectively causing a denial of service by rebooting the device.

Sustained performance refers to the device’s ability to offer the same performance throughout the runtime of multiple inference requests. We noticed in our experiments that performance under sustained load was not stable, but fluctuated. Reasons for this behavior include DVFS, thermal throttling and different power profiles, along with potential simultaneous workloads.

Device temperature does not only affect device performance, but also user comfort. Devices nowadays come in various forms, but mostly remain passively cooled. Therefore, heat dissipation is mainly facilitated by the use of specific materials, and heat management is governed by the OS. The power draw did cause temperatures to rise to uncomfortable levels, reaching 47.9°C after one full conversation with the Zephyr-3B (4-bit) model on the iPhone 14 Pro.

Insights: Tractability of the LLM inference workload does not imply deployability.

Accuracy Impact of Quantization

Today’s LLMs are quite large in size. At the same time, the memory that most devices come equipped with is in the region of 6 - 12 GB. This means that deploying such models on device is usually only possible through compression. Quantization is a compression technique which lowers the precision with which weights and activations are represented. However, this comes at the expense of accuracy. We measured the impact of different model architectures and sizes, quantization schemes and precisions on the accuracy of four natural language tasks (HellaSwag, Winogrande, TruthfulQA, ARC-{E,C}).
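As a side note, the following toy Python sketch uses plain symmetric round-to-nearest quantization, deliberately simpler than the grouped, AWQ and GPTQ schemes evaluated in the study, only to illustrate why shrinking the bit width trades memory for accuracy: reconstruction error grows as bits are removed.

# Toy illustration of precision loss from quantization; NOT the schemes used in the study.
import numpy as np

def quantize_dequantize(weights: np.ndarray, bits: int) -> np.ndarray:
    """Symmetric per-tensor round-to-nearest quantization, then back to float."""
    qmax = 2 ** (bits - 1) - 1              # e.g. 7 for 4-bit, 3 for 3-bit
    scale = np.abs(weights).max() / qmax    # map the largest weight onto qmax
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax)
    return q * scale

rng = np.random.default_rng(0)
w = rng.normal(size=10_000).astype(np.float32)
for bits in (8, 4, 3):
    err = float(np.mean((w - quantize_dequantize(w, bits)) ** 2))
    print(f"{bits}-bit mean squared error: {err:.6f}")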

Insights: The most evident performance difference comes from the model architecture and parameter size, and this difference persists across datasets. In terms of quantization schemes, bit width is clearly correlated with model size, but also with accuracy, i.e., lower bit width means a higher error rate. On the other hand, no single quantization scheme performed uniformly better across the board. For larger models (≥7B parameters), AWQ 5 and GPTQ 6 performed slightly better, at the expense of elevated model sizes.

Offloading to Edge Devices

Since the QoE and accuracy of LLMs are impacted by running on device, as evidenced earlier, we explore an alternative model of offloading computation to nearby devices at the consumer edge. Such a device may be a dedicated accelerator (e.g., an Edge-AI Hub) or another edge device (e.g., a Smart TV or a high-end router). For this reason, we employ two Jetson devices, namely the Nano (mid-tier) and the AGX (high-tier), to check the viability of this paradigm.

Insights: Overall, generation throughput is significantly higher than the equivalent mobile runtime, and this throughput can also be sustained for longer periods. Indicatively, for Zephyr-7B (4-bit), the average throughput is 3.3× and 1.78× higher for prefill and generation, respectively. At the same time, we witnessed that energy efficiency moves in the same direction as the device’s TDP.

Key Takeaways

This work highlights the potential and challenges of deploying Large Language Models on consumer mobile devices for both iOS and Android ecosystems. While advancements in hardware and algorithmic breakthroughs such as quantization show promise, on-device deployment currently impacts latency, device stability, and energy consumption. Offloading computation to local edge devices offers a viable alternative, providing improved performance and energy efficiency. Continued innovation in both hardware and software will be crucial for making local AI deployment practical, preserving user privacy, and enhancing sustainability.

Read More

We are glad to announce that the associated paper has been accepted for publication at the 30th Annual International Conference on Mobile Computing and Networking (ACM MobiCom'24). 

You can find more information in our pre-print: https://arxiv.org/abs/2403.12844 

The codebase of MELT can be found here: https://github.com/brave-experiments/MELT-public.

References

Laskaridis, S., Katevas, K., Minto, L., & Haddadi, H. (2024). MELTing point: Mobile Evaluation of Language Transformers. To appear in the 30th Annual International Conference on Mobile Computing and Networking (MobiCom). ↩︎

Introducing Apple Intelligence for iPhone, iPad, and Mac  ↩︎

Laskaridis, S., Venieris, S. I., Kouris, A., Li, R., & Lane, N. D. (2024). The future of consumer edge-ai computing. IEEE Pervasive Computing. ↩︎

Liu, Z., Zhao, C., Iandola, F., Lai, C., Tian, Y., Fedorov, I., … & Chandra, V. (2024). MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases. arXiv preprint arXiv:2402.14905. ↩︎

Frantar, E., Ashkboos, S., Hoefler, T., & Alistarh, D. (2023). Gptq: Accurate post-training quantization for generative pre-trained transformers. International Conference on Learning Representations (ICLR). ↩︎

Lin, J., Tang, J., Tang, H., Yang, S., Chen, W. M., Wang, W. C., … & Han, S. (2024). Awq: Activation-aware weight quantization for llm compression and acceleration. Proceedings of Machine Learning and Systems (MLSys). ↩︎

Wednesday, 24. July 2024

Zcash

ECC roadmap: July 2024

ECC is building infrastructure that supports the Zcash ecosystem and makes possible a decentralized future that values individual privacy and consent, security, and human dignity. To that end, we are […] Source
ECC is building infrastructure that supports the Zcash ecosystem and makes possible a decentralized future that values individual privacy and consent, security, and human dignity. To that end, we are […]

Source


Sequoia

Partnering with rift: Sales, simplified.

The post Partnering with rift: Sales, simplified. appeared first on Sequoia Capital.
Partnering with rift: Sales, simplified.

If you’re tired of manually stitching together multiple tools, this is the platform for you.

By Stephanie Zhan and Charlie Curnin Published July 24, 2024 rift co-founders Fil Twarowski and Eddie Eriksson.

The Greatest Salesman in the World, Og Mandino’s bestseller on the philosophy of success, tells the story of a young entrepreneur who rises from tending camels to running a massive trade empire after studying ten scrolls about the importance of persistence, love and other virtues. It’s one of our favorite books here at Sequoia, in part because persistence and love are foundational in sales.

But today, persistence and love alone aren’t enough—sales is hard and probably getting harder. Modern B2B sales teams juggle research, prospecting and thousands of Zoom calls. They pull off heroics at the finish line. There are all kinds of distractions in the way, from updating the CRM to cleaning contact data. And making things more frustrating, the average sales team works across 10 different tools, with no single home base where they can focus on doing their jobs.

Fil Twarowski and Eddie Eriksson became obsessed with this problem when they ran sales and growth at the equity-management company Pulley. In the process of integrating the dozens of tools they used to make Pulley the new default for thousands of companies, Fil and Eddie recognized that running exceptional campaigns required exceptional amounts of manual work. It’s one thing to dump generic copy into mass emails. It’s another to craft personalized outbound to exactly the right audience—and to deal with the endless technical details involved in getting those messages seen, such as caps on how many emails you can send per day.

Now, as co-founders of rift, Fil and Eddie are freeing up salespeople to shift their focus back to the core of their work. The platform combines the tools modern sales teams need, in one place. It’s an email solution for perfect sequencing to prospects. It’s a dialer to make calling leads more efficient, and a scheduler for automatically routing them and booking meetings. It’s the infrastructure for managing mechanics like email deliverability. It’s applying LLMs not to replace salespeople, but to lighten their load with features such as custom copy. And it’s all in one—no more passing data back and forth between multiple tools.

In short, rift does more than just give sales teams new capabilities. It creates a delightful, focused home base where they can do their best work.

Since Fil and Eddie started rift in late 2022, leading companies including Gusto, Capchase and Rhombus have come to rely on what the team has built—and they’re getting results. One customer, for example, replaced an outbound email program paralyzed by spam issues with a more personalized approach, and generated $300,000 in pipeline in just one sequence.

And this is just the beginning. We at Sequoia are beyond excited to lead rift’s seed round and partner with Fil, Eddie and their team as they scale their vision to empower exceptional sellers.


The post Partnering with rift: Sales, simplified. appeared first on Sequoia Capital.

Tuesday, 23. July 2024

Sequoia

Better Agents Need Better Documentation

The post Better Agents Need Better Documentation appeared first on Sequoia Capital.
Better Agents Need Better Documentation

Using AI effectively requires a new kind of structured data that AI itself cannot yet produce.

By Sonya Huang and Pat Grady Published July 23, 2024

In February, Klarna boldly announced that its new OpenAI-powered assistant handled two thirds of the Swedish fintech’s customer service chats in its first month. All kinds of customer satisfaction metrics were better, but what got everyone’s attention was the $40M boost to its bottom line from replacing 700 full-time contract agents.

Since then, every company we talk to wants to know, “how do we get the Klarna customer support thing?” Against the backdrop of flashy demoware and eye-popping pre-revenue valuations, Klarna’s practical application of GPT-4 was a welcome relief. But as Klarna founder and CEO Sebastian Siemiatkowski explains in this week’s Training Data podcast, the reality of this accomplishment is both more prosaic and more profound than it first appears.

To explain the insight that fueled this breakthrough, Siemiatkowski invokes the classic computer science concept of “garbage in, garbage out” (GIGO). All the way back to Babbage, scientists have understood that “if you put into the machine wrong figures,” the right answers will not come out. For Klarna, this meant that the team needed to pay special attention to the language they gave the model as context for specific customer service tasks—in other words the documentation and training manuals they made for the agents themselves. “That helped us a lot to think about it that way, that we just needed to make sure that the documentation and the manuals were clear enough and of quality enough, and then it can actually execute,” says Siemiatkowski.

Klarna has not disclosed “the secret sauce” for how they built their assistant beyond using a form of RAG (retrieval augmented generation), so it’s not clear how much of its successful performance is due to data quality alone or to “custom cognitive architectures”1 adapted to specific tasks. Either way, this is a prime example of what we call “Goldilocks agents” that benefit both from guardrails (in this case, the specific ways that Klarna handles customer service requests) and from the ability of LLMs to fluidly merge information in a coherent manner.

The needs of large language models bring what has been a niche concern front and center. Since it was formalized in 2020, RAG has become the industry standard to wrap the expressive elasticity of LLMs around the key facts needed to reliably execute a task, particularly in mission-critical business settings. Curating the quality of examples to retrieve from is key to success.
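For readers who have not seen the pattern spelled out, it is simple: retrieve the most relevant snippets of (human-written) documentation, then hand them to the model as context. A deliberately minimal sketch in Python, with toy keyword-overlap retrieval standing in for the embeddings and vector stores real systems use, looks roughly like this:

# Minimal retrieve-then-generate sketch; retrieval here is toy keyword overlap,
# and the final prompt would be sent to whatever LLM a team actually uses.
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    q_words = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(q_words & set(d.lower().split())))[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this documentation:\n{context}\n\nQuestion: {query}"

docs = [
    "Refund requests are processed within 14 days of purchase.",
    "Chargebacks must be disputed in writing within 60 days.",
]
print(build_prompt("How long does a refund take?", docs))

The quality of what ends up in docs, the documentation itself, is exactly the garbage-in, garbage-out point at stake.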

There are people in the world who love writing documentation, but not many. For most of us, we bemoan the bad but don’t contribute to making it better. However, high-performance companies like Stripe have pioneered “writing culture” where social norms encourage participation. Gitlab, one of the first remote-only companies, has developed a handbook-first approach to communication as a pillar of their operational excellence2.

More generally, in the case of Klarna, the company has built an elaborate knowledge graph3 to power not only their external-facing customer service reps (human and agentic) but also their internal chatbot, Kiki. This focus on the quality of documentation has dual benefits, Siemiatkowski explains: “So actually our agents have better tools today to be successful in helping the customers as does the AI. So both experiences are improving as a consequence of that.”

Siemiatkowski believes that Klarna has derived a lot of value through experimentation and building internally with AI. But in both cases he discussed—LLM-powered internal knowledge management systems and external-facing customer services agents—buying rather than building is now a very viable option with solutions like Glean and Sierra. However, buyer beware, GIGO still applies.

This focus on high-quality data is a large theme in contemporary AI. Microsoft released a paper last year, Textbooks Are All You Need, that showed how a significantly smaller model trained on better data could outperform far larger models on coding exercises. On the large model side, Microsoft CTO Kevin Scott told us in Episode 4 that it’s “a good thing that quality of data matters more than quantity of data, because it gives you an economic framework to go do the do the partnerships that you need to go do to make sure that you are feeding your AI training algorithm a curriculum that is going to result in smarter models. And, honestly, not wasting a whole bunch of compute feeding it a bunch of things that are not.” When it comes to enterprise use cases, particularly consumer-facing ones like Klarna’s, the quality bar is high because customers are comparing against human agents.

What Klarna’s experience suggests is that AI will work best when paired with human effort to understand other humans (something that LLMs seem a long way from achieving). This is actually the same customer-centric approach that is at the heart of all great businesses. 

1. See Training Data Ep 1 with Harrison Chase of LangChain and Ep 2 with Matan Grinberg and Eno Reyes of Factory, as well as Andrew Ng’s talk at AI Ascent 2024. 2. Also see Open Org. 3. Siemiatkowski calls out Klarna’s use of Neo4j, a graph database powered by vector search, for the Kiki project.

The post Better Agents Need Better Documentation appeared first on Sequoia Capital.


Global Digital Finance

Navigating the EU regulatory Landscape: A guide to CASP Compliance

As the Markets in Crypto Assets (MiCA) Regulation begins its implementation journey across the EU Elise Soucie, Executive Director, GDF and Lucia della Ventura, Zodia Markets collaborated on a short guide to CASP compliance. The guide discusses how to navigate the EU regulatory landscape with a particular focus on how MiCA interacts with MiFID II […] The post Navigating the EU regulatory Landsca

As the Markets in Crypto Assets (MiCA) Regulation begins its implementation journey across the EU, Elise Soucie, Executive Director, GDF, and Lucia della Ventura, Zodia Markets, collaborated on a short guide to CASP compliance. The guide discusses how to navigate the EU regulatory landscape with a particular focus on how MiCA interacts with MiFID II, including:

The Objectives of MiCA Who & What Kinds of Activities are Regulated by MiCA How MiCA & MiFID II Interact What License is Required for Which Assets Practical Considerations for Compliance

Publish date: 23.07.2024

The post Navigating the EU regulatory Landscape: A guide to CASP Compliance appeared first on GDF.

Monday, 22. July 2024

Shade Protocol

Shade Protocol: Road To Argos September 24th Upgrade

Greetings community, June 17th, 2024 ushered in the successful Alexandria upgrade of Shade Protocol — bringing advanced protocol & SILK analytics, the prices page, liquid $SHD staking, Spartan NFTs, and more. https://app.shadeprotocol.io/analytics Now that this upgrade was complete, the time has now come for Shade Protocol to prepare its next upgrade on September 24th: Argos. Argos was

Greetings community,

June 17th, 2024 ushered in the successful Alexandria upgrade of Shade Protocol — bringing advanced protocol & SILK analytics, the prices page, liquid $SHD staking, Spartan NFTs, and more.

https://app.shadeprotocol.io/analytics

Now that this upgrade was complete, the time has now come for Shade Protocol to prepare its next upgrade on September 24th:

Argos.

Argos was an ancient Greek city spanning from 1600 BC to 146 BC, a natural cultural hub during its time. Argos produced notable poets, sculptors, and architects. Poets such as Telesilla and Ario, and sculptors such as Polykleitos and Ageladas, left their mark with a focus on proper proportions in their sculptures, emphasizing symmetry and balance. The Canon (from the Greek “Kanōn,” meaning “measure” or “rule”) was a theoretical work by Polykleitos that outlined his mathematical approach to achieving perfect proportions in sculpture. Polykleitos sought to create a formula that could be applied to sculpt the ideal human figure.

Principles of the Kanōn

Symmetria (Proportion): Polykleitos emphasized the harmonious relationship between different parts of the body. This concept is akin to symmetry, where each part of the body is in balanced proportion to the others. Mathematical Ratios: He believed that beauty in the human form could be achieved through specific numerical ratios. For instance, the head was considered one-eighth of the total height, and the length of the foot was one-sixth of the height. Contrapposto: This stance, where the weight is shifted onto one leg, creating an asymmetry in the hips and shoulders, was crucial for achieving a naturalistic and dynamic pose. It demonstrated a balance of tension and relaxation in the body, reflecting Polykleitos’ focus on harmony.

Polykleitos’ Canon was a significant contribution to the field of art, emphasizing a scientific and mathematical approach to beauty and aesthetics that resonated throughout the history of Western art.

In many ways, Shade has long carried an unnatural contrapposto where privacy has placed undue tension on certain parts of user interactions, creating an unnatural disharmony.

The Argos update aims to bring symmetry, balance, and harmony to the Shade UX as we continue on the quest to refine user stories & the ease of traversing the Shade Protocol application.

Streamlined Permit Implementation: We’re reducing the token balance viewing and storing process from 60+ paid transactions to handle all of the encrypted tokens on the Shade app to instead just 3–4 clicks, all of which will be free txs. This will significantly enhance the Shade Protocol user experience and multi-device management.

Balance Stores V2: Say goodbye to latency issues. We’re revolutionizing how balances are stored on local devices. Real-time updates mean you’ll always see accurate balances as you navigate the app.

Eliminate wrap UX / improve to Automatic Wrap Functionality: We’re eliminating the need for manual wrapping in most cases. Don’t worry, power users can still access this feature on the utility page. And of course, we’re keeping everything private and secure.

Portfolio V2: Get ready for a visual treat! We’re beautifying the historical view of your assets and their performance. Expect cleaner, more intuitive charting that makes tracking your investments a breeze.

Money Market Sneak Peek: We’re working on a comprehensive whitepaper for our upcoming Money Market feature. Keep an eye out for sleek mockups & a potential testnet.

We believe Shade Protocol through its relentless drive towards ease of use can help unlock & reveal the power of private DeFi for the whole world to use.

Argos: Classical Ideals and Heroic Legacy

As we prepare for the upcoming Argos upgrade on September 24th, we are inspired by the historical and mythological significance of Argos. In Greek mythology, Argos was not only a cultural hub but also the realm of Diomedes, a legendary hero renowned for his strength and valor. Diomedes played a pivotal role in the Trojan War, leading 80 ships into battle and famously participating in the strategic deception of the Trojan Horse, which led to the fall of Troy. His contributions and heroic feats are emblematic of the balance and precision we aim to bring to the Shade Protocol through Argos.

Just as Diomedes’ leadership and prowess shaped the course of the Trojan War, the Argos upgrade represents a transformative moment for Shade Protocol. We are dedicated to bringing a new level of harmony and efficiency to our platform, inspired by the principles of symmetry and balance embodied by Diomedes’ legacy. The upgrade will enhance user experience with a powerful transaction engine, refined portfolio management, and streamlined interactions, mirroring the ideals of precision and coordination that Diomedes exemplified. Join us as we honor this rich legacy and strive to advance private DeFi with the elegance and effectiveness of classical ideals.

Conclusion

As we move forward, inspired by the historical significance and the enduring principles of Argos, Diomedes, and Polykleitos’ Canon, Shade Protocol remains committed to evolving and enhancing our platform. By fostering an environment where balance and symmetry guide our development, we aim to create a seamless and harmonious user experience. The Argos upgrade represents not just a step forward in technology, but a stride towards a more refined, accessible, and powerful DeFi ecosystem. Join us on this journey to bring the elegance of classical ideals into the future of private decentralized finance.

Twitter → https://x.com/Shade_Protocol

App → https://app.shadeprotocol.io/analytics

Telegram → https://t.me/ShadeProtocol

Sunday, 21. July 2024

Zcash Foundation

Zcash Dev Fund Runoff Poll

Earlier this month, the Zcash Foundation (ZF), Electric Coin Company (ECC) and various other members and groups across the Zcash community conducted a series of polls to determine whether there was clear consensus in favour of introducing a new Dev Fund when the current one ends later this year. The results of the polls varied. […] The post Zcash Dev Fund Runoff Poll appeared first on Zcash Founda

Earlier this month, the Zcash Foundation (ZF), Electric Coin Company (ECC) and various other members and groups across the Zcash community conducted a series of polls to determine whether there was clear consensus in favour of introducing a new Dev Fund when the current one ends later this year.

The results of the polls varied. While there is clearly strong support for a new Dev Fund, there was no clear consensus regarding which proposal should be adopted. However, two proposals garnered strong support across all the polls (and were the top two proposals in the ZCAP poll).

The Hybrid Deferred Dev Fund: Transitioning to a Non-Direct Funding Model proposal would last for one year, and would allocate

8% to the Financial Privacy Foundation (FPF) for the Zcash Community Grants program, and 12% to a protocol lockbox.

The Lockbox For Decentralized Grants Allocation (20% option) proposal would allocate 20% of the block rewards to a protocol lockbox for two years (with no funding for Zcash Community Grants).

Under both proposals, the remaining 80% of the block rewards would be allocated to the miners. Funds that are directed into the protocol lockbox will remain there until a distribution mechanism is introduced in a future network upgrade. 

ZF (along with ECC and others across the community) is conducting a runoff poll to determine which of these proposals should be adopted. Today, we opened a Helios poll of the Zcash Community Advisory Panel (ZCAP), which will close at 09:00 UTC on 29th July 2024.

As always, we welcome the community’s feedback. If you have comments or suggestions, please join the conversation on the Zcash community forum.

The post Zcash Dev Fund Runoff Poll appeared first on Zcash Foundation.

Saturday, 20. July 2024

Epicenter Podcast

How Gnosis 3.0 Paves the Way for Mass Adoption - Friederike Ernst

One of the OG flag bearers for decentralisation, Gnosis initially set out to build a prediction market on Ethereum. However, unlike the vibrant ecosystem of today, the early Ethereum days were barren. As a result, Gnosis team decided to build out their own infrastructure and tooling to suit their needs. Before long, tools such as Gnosis Safe, Zodiac, etc. were quickly adopted by the entire industr

One of the OG flag bearers for decentralisation, Gnosis initially set out to build a prediction market on Ethereum. However, unlike the vibrant ecosystem of today, the early Ethereum days were barren. As a result, the Gnosis team decided to build out their own infrastructure and tooling to suit their needs. Before long, tools such as Gnosis Safe, Zodiac, etc. were quickly adopted by the entire industry. Nowadays, since the foundation has been tried and tested, Gnosis 3.0 aims to focus on improving consumer dApp UX in order to facilitate widespread adoption.

Topics covered in this episode:

Gnosis’ background and vision Gnosis 3.0 Gnosis DAO Capital allocation management How Gnosis Pay revolutionizes payment rails The ‘Baguette Conundrum’: privacy & regulatory compliance for on-chain payments The future of the EVM vs. modularity Would Gnosis shift to an L2 architecture?

Episode links:

Friederike Ernst on Twitter Gnosis DAO on Twitter Gnosis Chain on Twitter Gnosis Pay on Twitter Safe on Twitter CoW Swap on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay— the world's first Decentralized Payment Network. Get started today at - gnosis.io Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at - chorus.one

This episode is hosted by Sebastien Couture.

Friday, 19. July 2024

Horizen - Blog

Horizen Ecosystem H1 2024 Report

TL;DRHorizen 2.0 Upgrade is Full Steam Ahead: Community voted in favor of the upgrade to a more advanced blockchain architecture.Binance Monitoring Tag Removed: $ZEN demonstrates stability and meets Binance’s listing criteria.ZenIPs: Six ZenIPs were voted on, including the removal of shielded pools and the technical roadmap for Horizen 2.0.Technical Upgrades: Significant upgrades include new no

TL;DR Horizen 2.0 Upgrade is Full Steam Ahead: Community voted in favor of the upgrade to a more advanced blockchain architecture. Binance Monitoring Tag Removed: $ZEN demonstrates stability and meets Binance’s listing criteria. ZenIPs: Six ZenIPs were voted on, including the removal of shielded pools and the technical roadmap for Horizen 2.0. Technical Upgrades: Significant upgrades include new node software, EON 1.4, and EON Forger Nodes. New Ecosystem Partnerships: New integration with partners such as Magic Link, Rabby Wallet, and Math Wallet. Media & Events: Highlights include the Horizen Buidler House at ETHDenver and appearances at major Web3 conferences.

2024 has been quite the year so far for the Horizen Ecosystem and for $ZEN: the project completed its transition away from being a privacy coin and is currently undergoing the Horizen 2.0 upgrade to become the next-gen EVM optimized for ZK apps. From ZenIPs to new partners, product upgrades, Horizen turning 7, and more, let’s dive in and see what happened in the first half of 2024.

Horizen 2.0 Upgrade is Full Steam Ahead

In June of 2024, the community voted overwhelmingly in favor of transitioning $ZEN from the legacy Horizen mainchain and Horizen EON EVM to a more advanced blockchain architecture; we call this upgrade Horizen 2.0.

At a high level, Horizen ($ZEN) and EON are currently built on older technology stacks. The Horizen Mainchain is a fork of the Bitcoin C++ codebase with a block time of 2.5 minutes, while EON is written in Scala based on the Scorex SDK with an 18-second block time. To address the limitations of these legacy systems and align with the vision of Horizen as a home for ZK and an enduring utility for $ZEN, a ZenIP was proposed to upgrade to Horizen 2.0, which introduces more advanced features and optimizations to enhance the network performance, security, and utility of $ZEN.

To learn more about ZenIP 42406 and Horizen 2.0, check out this high-level overview that breaks everything down. 

Binance Removes the Monitoring Tag for $ZEN

On July 1st, Binance announced the removal of the monitoring tag for $ZEN recognizing its stability and performance. 

The monitoring tag was placed in Jan 2024 along with several other privacy-featured coins. In the same month, the Horizen community voted in favor of removing the privacy feature from the Horizen mainchain, removing regulatory and delisting risks for $ZEN and allowing the Horizen network to maintain its versatility and compliance in the competitive landscape of web3. The shielded pool was removed from the Horizen mainchain in February 2024.

What else does this mean for $ZEN?  Consistent Stability and Confidence: Indicates that $ZEN has demonstrated consistent stability and security, reducing concerns over volatility and risk. Improved Listing Criteria: Indicates that $ZEN meets Binance’s stringent listing criteria, including regulatory-compliance, strong development activity, team commitment, and a secure network. Positive Market Perception: Reflects a positive outlook on $ZEN, increasing user confidence and encouraging more participation in the Horizen ecosystem.

In the same month of July, $ZEN was ranked a top 10 ZK Project by Market Cap on CoinGecko.

ZenIPs 

The Horizen community voted on 6 ZenIPs in the first half of this year: 

ZenIP 42207: Removal of the Shielded Pools from the Horizen Mainchain – 99.97% in Favor

This ZenIP represents a necessary evolution of the Horizen network to maintain its competitiveness, versatility, and compliance in the changing landscape of blockchain regulation and technology. The passing of this proposal speaks to the community’s desire to strategically pivot away from privacy-focused features towards a more scalable and regulatory-compliant framework. 

ZenIP 42400: Modernizing Horizen Technology via a Modular Proof Verification Layer – 100% in Favor

This ZenIP details a comprehensive understanding of the new modular system proposed by Horizen Labs. It outlines the strategic approach and delves into the technical aspects of a modular proof verification layer. Horizen Labs later announced that it will independently undertake the buildout of the proof verification chain in order to bring it to market more quickly. 

ZenIP 42401: Updating Proposal and Quorum Thresholds – 77.34% in Favor

The purpose of this ZenIP is to seek the community’s decision on lowering the thresholds for formally submitting improvement proposals and achieving a quorum of votes for them. The goal of this ZenIP is to make the governance of Horizen DAO more accessible for community members, increase the likelihood of successful votes (valid due to reaching a quorum), and bring quorum requirements in line with industry standards.

ZenIP 42404: Delegated Staking Reward Mechanism on EON – 99.87% in Favor

This ZenIP proposed to improve the usability of the EON staking system at the protocol level by automating the reward distribution between the forger hosting provider and the $ZEN holders who stake their ZEN to the forger node. The goal of the upgrade is for the staking system to calculate and pay the stake reward directly to the staker’s wallet address without first going through the forger node itself.
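For illustration only, the sketch below models how such an automated pro-rata reward split could be computed. The commission rate, address labels, and helper name are hypothetical assumptions for the example; they are not the actual EON protocol implementation, which may differ in rounding, timing, and fields.

```python
from decimal import Decimal

def distribute_forger_rewards(block_reward, forger_commission, stakes):
    """Split a block reward between a forger (hosting provider) and its stakers.

    block_reward      -- total ZEN reward earned by the forger node for the period
    forger_commission -- fraction kept by the forger operator (e.g. Decimal("0.10"))
    stakes            -- dict mapping staker address -> ZEN delegated to this forger

    Returns a dict of address -> payout, including the forger's commission.
    Illustrative model only; the real mechanism is defined at the protocol level.
    """
    total_stake = sum(stakes.values())
    commission = block_reward * forger_commission
    distributable = block_reward - commission

    payouts = {"forger_operator": commission}
    for address, stake in stakes.items():
        # Each staker is paid pro rata to their share of the delegated stake.
        payouts[address] = distributable * stake / total_stake
    return payouts

# Example: 6.25 ZEN reward, 10% commission, two stakers.
print(distribute_forger_rewards(
    Decimal("6.25"), Decimal("0.10"),
    {"znStaker1...": Decimal("700"), "znStaker2...": Decimal("300")},
))
```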

ZenIP 42405: Directing the Horizen Foundation to Request a Migration Proposal for $ZEN and EON – 99.99% in Favor

This ZenIP seeks the community’s approval to direct the Horizen Foundation to formally request a proposal from Horizen Labs regarding a technical roadmap for the $ZEN and EON migration.

ZenIP 42406: Technical Roadmap for the Migration of $ZEN and EON – 99.87% in Favor

This ZenIP seeks the community’s approval on a detailed strategic technical plan for transitioning the $ZEN cryptocurrency from the current Horizen Mainchain and Horizen EON EVM to a more advanced blockchain architecture. The goal of this technical upgrade is to improve Horizen’s blockchain performance, incorporate the latest advancements in zero-knowledge proof systems into Horizen, and enhance the utility of $ZEN.

Technical Upgrades and Updates

Throughout 2024 so far, we’ve seen a number of major upgrades and updates take place:

Horizen Node Software Upgrade: ZEN 5.0.3 and ZEN 5.0.2
EON 1.4 Upgrade
New Horizen EON Explorer is live
The Removal of Shielded Pool from Horizen mainchain
The deprecation of Secure Nodes
EON Forger Node becomes available to the public
Deprecation of TokenMint sidechain

New Ecosystem Partners

So far in 2024, we’ve welcomed a number of new partnerships and integrations to the Horizen Ecosystem. From new wallet integrations like Magic Link, Rabby Wallet, and Math Wallet, to a Safe integration via OnchainDen, we’re continuously growing the Horizen Ecosystem! 

Magic Link
Rabby Wallet
1RPC
KuSwap NFT Marketplace
MathWallet
OnchainDen
Polytrade

We look forward to welcoming our current network partners and integrators onto the new Horizen 2.0 ecosystem!

Media & Thought Leadership

CEO of Horizen Labs & Co-Founder of Horizen, Rob Viglione, talks ZK and Horizen on the Absolutely Zero Knowledge Podcast
StealthEX AMA on Horizen, Horizen Labs, & zkVerify
zkTalk with Gevulot on zkVerify, zk Tech, and Horizen
Co-founder, Rob Viglione, authors “Modular Verification Proof” for CoinMarketCap Glossary
Horizen Co-founder on Absolutely Zero Knowledge

Events

Horizen Buidler House at ETHDenver

During ETHDenver earlier this year, we hosted a Web3 Builders Day in collaboration with Supermoon Camp. It was a great day full of panels, discussions, networking, and connecting!

Rolf Versluis, Co-Founder of Horizen, gave a keynote presentation on Horizen. We also heard talks from ecosystem partners like Yuzu, Covalent, Tatum, and Stably, discussed modular vs. monolithic blockchains, reviewed investing in the next Web3 unicorns, and more!

For those who may have missed our recap, we have a thread to review it here: https://x.com/horizenglobal/status/1763687745694957569

In addition to ETHDenver, the team was present at some of the largest Web3 conferences and events of 2024, including Paris Blockchain Week, Token2049 Dubai, Consensus, and ETHCC!

Consensus 2024

Later this year, we are looking to have a presence at ETHMilan, zkHack, Korea Blockchain Week, Token2049, and Devcon!

The post Horizen Ecosystem H1 2024 Report appeared first on Horizen Blog.


Zcash Foundation

The Zcash Foundation bids farewell to Teor


We are sad to announce that Teor is leaving the Zcash Foundation. 

Teor joined ZF in June 2020, having previously worked with the Tor Project. They quickly became a key member of the engineering team, acting as engineering lead for extended periods of time, and helping onboard and mentor new engineering hires as the organisation grew. They were a leading contributor to the successful development and launch of Zebra, and played an important role in responding to the Halborn security disclosure. They contributed to several ZIPs, regularly attended ZIP editors meetings, and formally became a ZIP editor in September 2023. 

Teor went on sabbatical in early 2024, and has now decided to explore new challenges in non-Proof of Work Web3 technologies. 

Teor remains a firm supporter of the Zcash mission, and will continue to support ZF as a member of our Technical Advisory Board. 

On behalf of their colleagues, both with ZF and across the broader Zcash ecosystem, I want to thank Teor for their contributions and service to the Zcash community, and wish them the very best as they embark on the next phase of their career!

The post The Zcash Foundation bids farewell to Teor appeared first on Zcash Foundation.

Wednesday, 17. July 2024

BlueYard Capital

BlueYard Crypto 3: backing the ultimate capital x compute coordination machine with our latest $75m…

BlueYard Crypto 3: backing the ultimate capital x compute coordination machine with our latest $75m early-stage crypto fund

As we began to deploy our 3rd dedicated crypto fund, BlueYard Crypto 3* (raised last year), we took a step back to reflect on our thesis and approach behind crypto investing. What are the core belief pillars underpinning our strategy?

In the history of technology, economics and organizations — never has there been a technology and incentive system that can autonomously coordinate machines (compute, storage, etc), algorithms, data and capital at such an incredible scale, so quickly — all while providing independent, robust and trustless infrastructure for economic activity and data. Although still early in its development, crypto might be the ultimate solution to the collective action problem. Also, if one believes in a future where the autonomous coordination between compute, data and capital will play an increasing role, crypto networks could be the “under the hood” operating system for large economic networks.

Societies and economies will highly value the censorship-resistant money and information flows enabled by decentralization, and the ability to cut out intermediaries in many large-scale markets — thereby unlocking higher degrees of innovation and efficiency. The creation of portable internet-native assets with the ability to create mass- & user-centric ownership of traditionally centralized businesses will enable a new economic paradigm.

If large networks of coordinated machines and capital can be transformative to many industries and markets relying on computation, data and economic flows in the future — one might reasonably conclude that crypto networks, with their unique properties, have the potential to form a key backbone of such future markets — ranging from software / data / compute markets to physical infrastructure. If crypto can be the future of many markets and financial systems, today’s crypto infrastructure and applications are a call option on that future. This is what BlueYard has always believed — and is reflected in our investment hypothesis and applied strategy of investing at the earliest stages in:

the core functionality of the capital x compute coordination machine, such as the creation of consensus and the economic and technological enhancement of chains (e.g. Protocol Labs / IPFS, Filecoin, Flashbots, Kiln, Ingonyama)
the augmentation of core chains to make their consensus and compute useful & usable for developers and users (e.g. Privy, Tableland, Cryptio)
an emerging application layer that leverages crypto’s unfair advantages (e.g. 3NUM, Nudge, Radicle / Drips) in web services and financial products (e.g. Centrifuge)

We further augment this by selectively investing in liquid tokens that are a match for our thesis (e.g. Render, Helium, Dimo) and some large-cap L1s for purposes of working capital (e.g. Solana, ETH, Cosmos), with a view to re-deploy the asset base into their native ecosystems.

So if you are building along our thesis or want to open our minds to something completely new — we would like to hear from you. Or if you are looking to work on some open-ended research without committing to scaling a startup, take a look at our DYOR program. BlueYard is a firm that invests in foundational technologies across compute, engineering, software and biology, offering founders a community of like-minded, ambitious and long-term oriented builders that can help each other beyond the boundaries of their respective domains.

Disclaimer: This post is for general information purposes only. It does not constitute investment advice or a recommendation or solicitation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. It should not be relied upon for accounting, legal or tax advice or investment recommendations. This post reflects the current opinions of the authors and is not made on behalf of BlueYard or its affiliates and does not necessarily reflect the opinions of BlueYard, its affiliates or individuals associated with BlueYard. The opinions reflected herein are subject to change without being updated.

*If you are wondering why we are announcing BlueYard Crypto 3 when the last crypto fund we announced was BlueYard Crypto 1 (vs 2) — it is because we confusingly called our first crypto fund BY Crypto 0 (a spin-out of BY 1), whereas our first general fund generation was named BlueYard 1. We are just aligning our naming policy to be less confusing. Right now we are investing BlueYard 3 ($175m) and BlueYard Crypto 3 ($75m).


a16z Podcast

Transitioning From Gymnast to Investor with Aly Raisman


Former gymnast and current investor Aly Raisman joins general partner Julie Yoo and investment partner Daisy Wolf of a16z Bio + Health.

In this episode, Aly Raisman shares her quest for healthier living—physically, mentally, and financially—on her journey from gymnast to business investor. Having transitioned from an intensely structured routine, Aly emphasizes the need for more open conversations about mental health and financial literacy. She speaks passionately about the gap in women’s health solutions and hopes to inspire entrepreneurs to create impactful businesses. Aly’s experiences as a patient, survivor, and global figure add a unique dimension to her perspective as an investor. This candid conversation with Aly and Julie Yoo sheds light on Aly’s passion for more education within the investment space, offering invaluable insights for entrepreneurs, particularly in biotech and healthcare.

 

Resources: 

Find Aly on Twitter: https://x.com/aly_raisman

Find Julie on Twitter: https://x.com/julesyoo

Find Daisy on Twitter: https://x.com/daisydwolf

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Monday, 15. July 2024

Zcash

Paul Brigner transitions from ECC, joins Bootstrap Org Board of Directors

Paul Brigner has resigned from his role at Electric Coin Co. (ECC) as Vice President of Strategic Alliances to join Coinbase and lead the Coinbase Institute. While he leaves a […]

Source


Horizen - Blog

Keep Your ZEN Safe and Avoid Scams During Horizen 2.0 Upgrade


Dear community members,

Our transformative upgrade to Horizen 2.0 is full steam ahead since the community voted in favor of the ZenIP 42406, which proposes a roadmap for the upgrade of $ZEN and EON!

Security Best Practices

Please follow the steps below to ensure a smooth and safe experience during the 2.0 upgrade and keep your ZEN safe from scammers:

Do NOT send funds to anyone claiming to be from Horizen. We would never ask the community to send us ZEN or any other type of funds.
Do NOT share your private keys with anyone. We would never ask for private keys. If your private keys are not safe and secure, neither are your crypto assets.
Do NOT install anything on your device that is not from the official sources.
Do NOT give anybody remote access to your device. We’d never ask you to install any software for remote access.
Do NOT trust any DMs that give instructions for any type of Horizen airdrop. Official team members would never DM you. We only communicate the procedure on our official public channels.
The best way to get support is to open a ticket on the “open-a-ticket” channel on the Horizen Discord.
Always double-check the social media handle and URL to make sure you are visiting the official channels.

Official Channels

The following channels are official and safe accounts to follow:

Horizen X: https://x.com/horizenglobal
Horizen Labs X: https://x.com/HorizenLabs
Telegram: https://t.me/horizencommunity
Discord: https://horizen.io/invite/discord
Blog: https://blog.horizen.io
Discourse: https://horizen.discourse.group
Website: www.horizen.io
Github: https://github.com/HorizenOfficial
Newsletter Signup: https://www.horizen.io/developer-newsletter/

What should you do if you find a scam or have a question about an announcement? Please report the group or account to our teams on Discord.

Here is the ONLY official invitation link to the Discord: https://horizen.io/invite/discord 

You can also contact us on Telegram “horizen community”: https://t.me/horizencommunity 

Example of a scam attempt

As you can see in this scam attempt, users are being misled. They may think they’re on the official channel, but they’re not.

The only Horizen domain you should trust is horizen.io and blog.horizen.io.

Any domain with additional words is one bought by scammers. In this example: migrate-horizen.io.

The name of the scam Telegram group here is https://t.me/horizencommunitty. Can you spot the scam?

In this scam example, there are two “t’s” in the word “community”.

You can also check when the group or user providing the information was created. Scammer accounts are usually very new, with little or no message history.
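As an illustrative aid only (the helper name and allowlist below are hypothetical, not an official Horizen tool), a simple check like the following can flag lookalike domains and misspelled handles of the kind shown above:

```python
from urllib.parse import urlparse

# Official domains and the official Telegram group, taken from the list above.
OFFICIAL_HOSTS = {"horizen.io", "www.horizen.io", "blog.horizen.io", "docs.horizen.io"}
OFFICIAL_TELEGRAM = "t.me/horizencommunity"

def looks_official(url: str) -> bool:
    """Return True only if the URL points at an official Horizen domain or Telegram group.

    Hypothetical helper for illustration only; always confirm against the
    official channels list rather than relying on a script.
    """
    parsed = urlparse(url if "://" in url else "https://" + url)
    host = parsed.netloc.lower()
    if host == "t.me":
        # Telegram groups are distinguished by their path, not just the host.
        return (host + parsed.path.rstrip("/")).lower() == OFFICIAL_TELEGRAM
    return host in OFFICIAL_HOSTS

print(looks_official("https://blog.horizen.io/some-post"))   # True
print(looks_official("https://migrate-horizen.io"))          # False: lookalike domain
print(looks_official("https://t.me/horizencommunitty"))      # False: two "t"s
```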

The post Keep Your ZEN Safe and Avoid Scams During Horizen 2.0 Upgrade appeared first on Horizen Blog.


Zcash Foundation

Results of the ZCAP Dev Fund poll


The Zcash Dev Fund poll has now closed and the results are in!

The purpose of this poll was to determine whether there is clear consensus amongst the Zcash community that a new Dev Fund should be established when the current Dev Fund expires in November, and, if so, which Dev Fund proposal should be adopted. 

Members of the Zcash Community Advisory Panel (ZCAP) were asked to indicate their approval for each of the six proposals for a new Dev Fund, and for allowing the current Dev Fund to end without establishing a new Dev Fund.

Here are the results:

Under the Majority Choice Approval (MCA-M or Modified Bucklin) voting method, the winning proposal is the Lockbox for Decentralized Grants Allocation (20% option), which directs 20% of the block rewards to a lockbox, for future disbursement in a manner that is consistent with results of community polling.

Note that cumulative percentages may not total to 100% due to abstentions.

The poll also asked voters to express their approval of direct and non-direct funding models, with an additional option of “I wish to reserve judgment until more information about a Non-Direct Funding Model is available”.

Here are the results:

Thank you to everyone who voted!

The post Results of the ZCAP Dev Fund poll appeared first on Zcash Foundation.

Sunday, 14. July 2024

Epicenter Podcast

Ethereum at a Crossroads - Vitalik Buterin (Live from EthCC 7)


We couldn’t miss EthCC 7, nor the chance to chat with Vitalik Buterin about Ethereum’s status quo as an L1 amidst the plethora of L2s competing for market share, and what challenges will most likely arise along the way (e.g. staking decentralisation).

Tune in for a captivating discussion on the importance of decentralisation as a last stand of human empowerment in the current geopolitical context and how it could positively impact AI's trajectory. 

Topics covered in this episode:

Types of Ethereum ‘hardnesses’ (sic)
The L1 status quo
Staking decentralisation & block building
Non-financial crypto applications
Account abstraction & interoperability
Solana & Bitcoin ecosystems
DeSci, biotech & longevity
Geopolitics
Ethereum nation state?
The impact of AI
Do not go gentle into that good night

Episode links:

Vitalik Buterin on Twitter
EthCC on Twitter
Ethereum on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Sebastien Couture & Brian Fabian Crain.

Friday, 12. July 2024

a16z Podcast

Live at Tech Week: Delivering AI Products to Millions


Less than two years since the breakthrough of text-based AI, we now see incredible developments in multimodal AI models and their impact on millions of users.

As part of New York Tech Week, we brought together a live audience and three leaders from standout companies delivering AI-driven products to millions. Gaurav Misra, Cofounder and CEO of Captions, Carles Reina, Chief Revenue Officer of ElevenLabs, and Laura Burkhauser, VP of Product at Descript, discuss the challenges and opportunities of designing AI-driven products, solving real customer problems, and effective marketing.

From the critical need for preventing AI misuse to ensuring international accessibility, they cover essential insights for the future of AI technology.

 

Resources: 

Find Laura on Twitter: https://x.com/burkenstocks

Find Carles on Twitter: https://twitter.com/carles_reina

Find Gaurav on Twitter: https://twitter.com/gmharhar

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

 

Wednesday, 10. July 2024

Zcash Foundation

Polling ZCAP on a New Dev Fund


In 2020, the Zcash community decided to establish a Dev Fund, consisting of 20% of block rewards, allocated between the Electric Coin Company, the Zcash Foundation, and a major grants program, now known as Zcash Community Grants. 

Since the Dev Fund was established, the Zcash ecosystem has evolved and changed significantly, becoming more decentralized as new teams have joined the ecosystem. However, that Dev Fund will end later this year, and the question we now face is how best to support this continued evolution, growth and progress towards greater decentralization. 

Last year, the community began discussing whether there should be a new Dev Fund, and, if so, what it should look like. After months of discussion, debate, community calls, Twitter spaces, and exploratory polls, we now have a slate of six proposals for the community to consider:

Manufacturing Consent; Re-Establishing a Dev Fund for ECC, ZF, ZCG, Qedit, FPF, and ZecHub by Noamchom
Lockbox For Decentralized Grants Allocation (perpetual 50% option) by Skylar Saveland
Establishing a Hybrid Dev Fund for ZF, ZCG and a Dev Fund Reserve by Jack Gavigan
Hybrid Deferred Dev Fund: Transitioning to a Non-Direct Funding Model by Jason McGee, Peacemonger & GGuy
Lockbox For Decentralized Grants Allocation (20% option) by Kris Nuttycombe
Masters Of The Universe? by Noamchom

If the community does not reach clear consensus in favour of adopting a particular proposal, the Dev Fund will expire in November as planned.

A Note on Impartiality

One of the proposals was drafted by the Zcash Foundation (ZF), based on the results of exploratory polling. However, ZF is not endorsing or advocating for any specific Dev Fund proposal. This is a decision for the community to make, and our primary objective is to help the community reach consensus, even if the end result is no funding for ZF.

We at ZF believe that the work we have carried out over the past few years has been of benefit to Zcash and the Zcash community, and we hope that the Zcash community agrees, and is willing to continue funding that work. 

However, for the avoidance of any doubt, if the Zcash community decides that the Dev Fund should end and not be replaced (or that it should be replaced but that ZF should not receive any funding), ZF will support that decision, and do what is necessary to enact it (including implementing the new consensus rules in Zebra). 

Polling ZCAP 

Today, we opened a Helios poll of the Zcash Community Advisory Panel (ZCAP), which will close at 09:00 UTC on 15th July 2024.

The purpose of this poll is to determine whether there is clear consensus amongst the Zcash community that a new Dev Fund should be established when the current Dev Fund expires in November, and, if so, which Dev Fund proposal should be adopted. 

The first question asks respondents to indicate their approval for allowing the current Dev Fund to end without establishing a new Dev Fund, and sending 100% of the block rewards to miners.

The next six questions ask respondents to indicate their approval for each of the six proposals for a new Dev Fund. A summary of the various Dev Fund proposals can be found here, and the poll includes links to the relevant ZIPs, which we strongly encourage you to review in detail.

The poll asks respondents to indicate their approval for each proposal on a five-point scale: 

Strongly Approve
Approve
Neutral
Disapprove
Strongly Disapprove

The outcome of the poll will be determined using a Majority Choice Approval (MCA-M or Modified Bucklin) voting method, which works as follows:

Respondents rate each proposal on the 5-point scale.
First, only the “Strongly Approve” ratings are counted. If one or more proposals is rated “Strongly Approve” by more than 50% of respondents, the proposal with the highest count is selected for implementation in NU6.
If not, the “Approve” ratings are added to the “Strongly Approve” counts. If a proposal is now approved by more than 50% of respondents, it is regarded as the preferred proposal.
If there is still no clear consensus, “Neutral” ratings are added, and so on.
At each step, the proposal with the highest combined total is regarded as the preferred proposal if it reaches more than 50%.

In the event that two or more proposals garner majority support, but there is no clear community consensus, there will be a runoff poll.
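To make the counting procedure concrete, here is a minimal sketch of Modified Bucklin tallying under the rules described above. The ballot format, tie handling, and function name are illustrative assumptions; the actual Helios poll tooling may differ.

```python
LEVELS = ["Strongly Approve", "Approve", "Neutral", "Disapprove", "Strongly Disapprove"]

def modified_bucklin(ballots):
    """Pick a preferred proposal from Modified Bucklin (MCA-M) ballots.

    ballots -- list of dicts mapping proposal name -> one of LEVELS.
    Returns (winner, level_index), or (None, level_index) when the leaders
    tie (deferring to a runoff poll), or (None, None) if no proposal ever
    reaches majority support.
    """
    n_voters = len(ballots)
    proposals = set().union(*(b.keys() for b in ballots))
    counts = {p: 0 for p in proposals}

    for level_index, level in enumerate(LEVELS):
        # Add this level's ratings to each proposal's running total.
        for ballot in ballots:
            for proposal, rating in ballot.items():
                if rating == level:
                    counts[proposal] += 1
        majority = [p for p in proposals if counts[p] * 2 > n_voters]
        if majority:
            majority.sort(key=lambda p: counts[p], reverse=True)
            if len(majority) > 1 and counts[majority[1]] == counts[majority[0]]:
                return None, level_index  # tie between leaders -> runoff poll
            return majority[0], level_index
    return None, None

# Tiny example with three voters and two proposals.
example = [
    {"Lockbox 20%": "Strongly Approve", "Lockbox 50%": "Neutral"},
    {"Lockbox 20%": "Approve", "Lockbox 50%": "Disapprove"},
    {"Lockbox 20%": "Neutral", "Lockbox 50%": "Strongly Approve"},
]
print(modified_bucklin(example))  # ('Lockbox 20%', 1): majority once Approves are added
```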

Future Evolution of Development Funding Models

The final two poll questions solicit respondents’ input regarding the future evolution of the Dev Fund, specifically regarding direct versus non-direct funding models. Electric Coin Company CEO Josh Swihart has proposed that the current Direct Funding Model (which allocates block rewards directly to specific entities or organizations by embedding their wallet addresses into the protocol) be replaced by a Non-Direct Funding Model, which instead allocates a portion of the block rewards to a multi-sig wallet. Those who seek funding to contribute to Zcash would then be required to submit proposals and compete for funding.

While the design, governance and implementation details have not yet been fleshed out, several community polls have indicated strong community support for a Non-Direct Funding Model, and several of the current Dev Fund proposals direct a slice of the Dev Fund to a “lockbox”, in anticipation that a Non-Direct Funding Model may be established in the future. Therefore, we want to ensure that ZCAP members have the opportunity to express their views on this topic. 

The Helios poll will close at 09:00 UTC on 15th July 2024. 

As always, we welcome the community’s feedback. If you have comments or suggestions, please join the conversation on the Zcash community forum.

The post Polling ZCAP on a New Dev Fund appeared first on Zcash Foundation.


Global Digital Finance

Digital assets – on the road to mass adoption

Madeleine Boys, Director of Programmes and Innovation, GDF, reports on policy and consultation updates from around the world. The post Digital assets – on the road to mass adoption appeared first on GDF.

Madeleine Boys, Director of Programmes and Innovation, GDF, reports on policy and consultation updates from around the world.

The post Digital assets – on the road to mass adoption appeared first on GDF.


For whom the crypto bell polls

Elise Soucie, Executive Director, GDF, discusses what the UK’s political shift means for the crypto and digital finance industry, how it affects ongoing initiatives as well as the timelines for certain pieces of the digital asset regulatory framework. The post For whom the crypto bell polls appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses what the UK’s political shift means for the crypto and digital finance industry, how it affects ongoing initiatives as well as the timelines for certain pieces of the digital asset regulatory framework.

The post For whom the crypto bell polls appeared first on GDF.


Raising the Stakes – The UK’s Incoming Rules for DeFi & Staking

Elise Soucie, Executive Director, GDF, discusses staking amidst the UK’s commitment to building staking regulation. The post Raising the Stakes – The UK’s Incoming Rules for DeFi & Staking appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses staking amidst the UK’s commitment to building staking regulation.

The post Raising the Stakes – The UK’s Incoming Rules for DeFi & Staking appeared first on GDF.


FATF & Furious – Preparing for Travel Rule Implementation

Elise Soucie, Executive Director, GDF, discusses the travel rule and compliance. The post FATF & Furious – Preparing for Travel Rule Implementation appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses the travel rule and compliance.

The post FATF & Furious – Preparing for Travel Rule Implementation appeared first on GDF.


The Sandbox: Building a Tech Friendly Regime for a Growing Crypto Industry

Elise Soucie, Executive Director, GDF, discusses the Digital Securities Sandbox (DSS) in the UK. The post The Sandbox: Building a Tech Friendly Regime for a Growing Crypto Industry appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses the Digital Securities Sandbox (DSS) in the UK.

The post The Sandbox: Building a Tech Friendly Regime for a Growing Crypto Industry appeared first on GDF.


E.T.?

Elise Soucie, Executive Director, GDF, discusses the recent regulatory news involving bitcoin ETPs, cETNs, and BlackRock’s bitcoin ETF – and what this all means. The post E.T.? appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses the recent regulatory news involving bitcoin ETPs, cETNs, and BlackRock’s bitcoin ETF – and what this all means.

The post E.T.? appeared first on GDF.


Dial R for Regulation

Elise Soucie, Executive Director, GDF, discusses policy and consultation updates from around the world, as well as the latest digital innovations in institutional markets. The post Dial R for Regulation appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses policy and consultation updates from around the world, as well as the latest digital innovations in institutional markets.

The post Dial R for Regulation appeared first on GDF.


The Wild West of money: Preserving singleness in a digital age

Elise Soucie, Executive Director, GDF, discusses conserving the “singleness” of money, amidst the Bank of England’s priorities of preserving financial stability as well as the singleness of money as new forms of digital money continue to evolve. The post The Wild West of money: Preserving singleness in a digital age appeared first on GDF.

Elise Soucie, Executive Director, GDF, discusses conserving the “singleness” of money, amidst the Bank of England’s priorities of preserving financial stability as well as the singleness of money as new forms of digital money continue to evolve.

The post The Wild West of money: Preserving singleness in a digital age appeared first on GDF.

Monday, 08. July 2024

a16z Podcast

Marc Andreessen on Building Netscape & the Birth of the Browser

"The Ben & Marc Show," featuring a16z co-founders Marc Andreessen and Ben Horowitz.  In this special episode, Marc and Ben dive deep into the REAL story behind the creation of Netscape—a web browser co-created by Marc that revolutionized the internet and changed the world. As Ben notes at the top, until today, this story has never been fully told either in its entirety or accurately.&nbs

"The Ben & Marc Show," featuring a16z co-founders Marc Andreessen and Ben Horowitz. 

In this special episode, Marc and Ben dive deep into the REAL story behind the creation of Netscape—a web browser co-created by Marc that revolutionized the internet and changed the world. As Ben notes at the top, until today, this story has never been fully told either in its entirety or accurately. 

In this one-on-one conversation, Marc and Ben discuss Marc's early life and how it shaped his journey into technology, the pivotal moments at the University of Illinois that led to the development of Mosaic (a renegade browser that Marc developed as an undergrad), and the fierce competition and legal battles that ensued as Netscape rose to prominence. 

Ben and Marc also reflect on the broader implications of Netscape's success, the importance of an open internet, and the lessons learned that still resonate in today's tech landscape (especially with AI). That and much more. Enjoy!

Watch the FULL Episode on YouTube: https://youtu.be/8aTjA_bGZO4

 

Resources: 

Marc on X: https://twitter.com/pmarca 

Marc’s Substack: https://pmarca.substack.com/ 

Ben on X: https://twitter.com/bhorowitz 

Book mentioned on this episode: 

- “Expert Political Judgment” by Philip E. Tetlock https://bit.ly/45KzP6M 

TV Series mentioned on this episode: 

- “The Mandalorian” (Disney+) https://bit.ly/3W0Zyoq

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 05. July 2024

Zcash

Zooko and a new focus for Zcash resilience

  From the Bootstrap Board of Directors: Zcash is at an inflection point. With a more decentralized Zcash contributor model taking shape and a new development fund hanging in the […]

Source


Epicenter Podcast

‘Aave v4 Will Unify Cross-Chain Liquidity’ - Stani Kulechov


One of the bluechip DeFi projects and backbone of Ethereum liquidity markets, Aave has recently announced their V4 upgrade proposal, which aims (among other things) to unify crosschain liquidity.

We were joined by Stani Kulechov, founder & CEO of Aave, to discuss the DeFi (r)evolution since ETH Lend to Aave V4, expanding to non-EVM chains and potentially even building a self-sovereign chain. 

Topics covered in this episode:

The evolution of Aave
Aave V3 risk management & MakerDAO feud
Aave V4
Unified liquidity layer & risk pricing
GHO stablecoin
How the unified liquidity layer improves UX
RWA & liquidation strategies in Aave V4
Expanding to non-EVM networks
Building a self-sovereign chain
Aave’s role in the future of DeFi
How stablecoins will evolve
DeFi institutional adoption
Future roadmap

Episode links:

Stani Kulechov on Twitter
Aave on Twitter
Lens Protocol on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay — the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators, across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional grade security at chorus.one

This episode is hosted by Sebastien Couture & Friederike Ernst.

Wednesday, 03. July 2024

Zcash

ECC Transparency Report for Q4 2023

Why release a transparency report? ECC is committed to openness and transparency — as we help evolve and support the Zcash digital currency, and in support of our mission to […]

Source


a16z Podcast

The Art of Technology, The Technology of Art


We know that technology has changed art, and that artists have evolved with every new technology — it’s a tale as old as humanity, moving from cave paintings to computers. Underlying these movements are endless debates around inventing versus remixing; between commercialism and art; between mainstream canon and fringe art; whether we’re living in an artistic monoculture now (the answer may surprise you); and much much more. 

So in this new episode featuring Berlin-based contemporary artist Simon Denny -- in conversation with a16z crypto editor in chief Sonal Chokshi -- we discuss all of the above debates. We also cover how artists experimented with the emergence of new technology platforms like the web browser, the iPhone, Instagram and social media; to how generative art found its “native” medium on blockchains, why NFTs; and other art movements. 

Denny also thinks of entrepreneurial ideas -- from Peter Thiel's to Chris Dixon's Read Write Own -- as an "aesthetic"; and thinks of technology artifacts (like NSA sketches!) as art -- reflecting all of these in his works across various mediums and contexts. How has technology changed art, and more importantly, how have artists changed with technology? How does art change our place in the world, or span beyond space? It's about optimism, and seeing things anew... all this and more in this episode.

 

Resources: 

Find Denny on Twitter: https://x.com/dennnnnnnnny

Find Sonal on Twitter: https://x.com/smc90

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Monday, 01. July 2024

Horizen - Blog

StealthEX AMA Recap


On June 26th, 2024, Bryan Nowlan, Marketing Project Manager from Horizen Labs, participated in a text AMA with the StealthEX community to discuss all things Horizen Ecosystem, Horizen Labs, zkVerify, and more! 

In case you didn’t get a chance to catch it live, we put together this recap blog of the questions and answers. 

Question: What technical support and resources are available for developers and users of the ZKVProtocol? How does HorizenLabs ensure the long-term sustainability and maintenance of the protocol?

Answer: zkVerify has in-depth technical documentation available for developers and builders that are looking to interact with the project. The documentation can be reached here: https://docs.zkverify.io/

Ranging from an overview of zkVerify, to the zkVerify Hub with important links and recommendations, to a tutorials section, users can find all they need to get started with the project. 

Our tech teams are also very responsive on the zkVerify Discord. If any users run into issues or have questions, members of the team can jump in there to provide support: discord.gg/zkverify

Horizen Labs is the builder of zkVerify, so we are deeply invested in the long-term sustainability and maintenance of the protocol. We have extensive experience with zero-knowledge technologies and an entire team dedicated to ensuring zkVerify’s success!

Question: How does the ZKVProtocol integrate with other Horizen technologies, such as the Horizen EON blockchain and the Horizen Wallet? What are the benefits of this integration?

Answer: The Horizen Community recently voted to pass ZenIP 42406, the technical proposal for the migration of EON and $ZEN. Included in that proposal is an overview of how zkVerify will integrate with other Horizen technologies.

Horizen 2.0 will be a fully-compatible EVM, which means that existing smart contracts on EON will be transitioned over, including $ZEN in current liquidity pools. Horizen 2.0 will also be built using the Substrate framework (written in Rust), which will allow for a tight integration with the zkVerify protocol (which is also built on Substrate) for fast and cost-effective verification of zk proofs. Specifically, EON 2.0 will be a parachain connected to the zkVerify Relay Chain.

This integration allows Horizen to become optimized for ZK-based apps, and the seamless connection with zkVerify ensures the most efficient process for all ZK projects. This sets the groundwork for innovative ZK apps that will make Horizen the home for ZK and also provide more utility for $ZEN.

Question: Given Spencer’s remarks on the necessity of evolving with blockchain technologies in the previous StealthEX AMA, how is the marketing department planning educational initiatives or community outreach programs to address the complexities of ZKVerify?

Answer: Great question. This is something that the Horizen Labs Marketing Team is extensively focused on and planning for. We have established a developer relations team whose responsibility is to ensure sufficient educational and community outreach programs are created for zkVerify, some of which have already kicked off or will be starting soon.

The zkVerify documentation hub has tutorials already available to developers and builders: https://docs.zkverify.io/tutorials/tutorials/

This includes which wallets to use to get started, how to connect them to zkVerify, how to request test tokens ($ACME), and many guides on how to submit proofs with various verifiers such as Groth16, Fflonk, Risc0, and more! I’d encourage anyone interested in exploring this to check out our documentation linked above. The team is always available on Discord for any questions or support needed.

We will also be launching an educational program with Metaschool soon for zkVerify. This will be an interactive course in which users can learn more about zero knowledge technology, the modular blockchain stack, and how zkVerify fits into all of it. There will be some user actions to take as part of the course, like following some of the tutorials above, additional information to come soon on that! 

The zkVerify Twitter & YouTube accounts are also a great source of educational content for users, helping to make zkVerify as builder-friendly as possible.

zkVerify Twitter: https://x.com/ZKVProtocol

zkVerify Youtube: https://www.youtube.com/@zkVerify

Question: How does HorizenLabs approach governance and decision-making within the project? Are there any specific mechanisms or processes in place to ensure community involvement and transparency?

Answer: The Horizen Ecosystem follows a DAO structure and is an open-source, public protocol with a broad and diverse ecosystem, including miners, developers, node operators, and $ZEN holders. 

The Horizen DAO, governed by the Horizen Foundation, follows various processes regarding community proposals, called ZenIPs (ZEN Improvement Proposals), starting with a proposal idea and research, idea modification, community feedback, draft creation, a voting period, and more.

The Horizen DAO Discourse is the community hub; this is where many of the ZenIPs are discussed among community members as the proposals go from the idea phase to the draft phase and on to the formal voting phase. Community members are encouraged to express their thoughts and provide feedback on each proposal or discussion topic that comes in.

Voting on these ZenIP proposals is done via Snapshot. The Horizen Foundation Snapshot is a great place to check out previous community votes, what the participation levels were like, what was being voted on, and what the community sentiment was for each vote: https://snapshot.org/#/horizenfoundation.eth

I’ll add a few resources that might be of value to learn more about this: 

Introducing Horizen DAO Blog: https://blog.horizen.io/introducing-horizen-dao/
Horizen DAO Discourse: https://horizen.discourse.group/
Horizen Discord: https://horizen.io/invite/discord
Horizen Governance Documentation: https://docs.horizen.io/governance/overview/about

Question: As horizenglobal expands its network of partners and collaborations, how do you plan to maintain a cohesive and unified ecosystem that benefits all stakeholders, and what challenges do you foresee in managing such a diverse portfolio of projects?

Answer: Horizen Labs prides itself on being one of the major collaborators in the crypto space. This means that our Business Development, Marketing, and Tech teams have extensive experience working with many different partners on many different initiatives, ensuring that the ecosystem is functioning efficiently. 

One of the primary ways that we ensure alignment across many different partners within the ecosystem is through our communication. When it comes time for a network upgrade, update, new integration, new tool, etc., we are always sure to communicate that to both our community and all of our ecosystem partners. This helps ensure that all of our partners are aligned.

At Horizen Labs we have worked with the teams at Offchain Labs, Yuga Labs, Animoca Brands, Darewise, ApeCoin DAO, and more on various projects and initiatives. 

Also, with the Horizen ecosystem running as a DAO structure, there is extensive community involvement in all that goes on within the ecosystem. From Discourse, to Discord, to Twitter and Telegram, the Horizen community is constantly in discussion and this is where we share updates, to ensure that everyone is aligned!

Question: How does HorizenLabs leverage its expertise in zero-knowledge cryptography and modular blockchain to address the scalability and efficiency challenges faced by the Web3 ecosystem?

Answer: Horizen Labs has built zkVerify, the Modular Blockchain for ZK Proof Verification. zkVerify is dedicated to verifying ZK proofs across diverse blockchains efficiently, reducing proof verification costs by 90%+, which would have saved ZK Rollups $42.8 Million in 2023 alone. 

By drastically reducing cost, zkVerify is empowering organizations and dApps to achieve economies of scale, bringing in the next generation of use cases that achieve blockchain’s ultimate potential.

zkVerify enhances network performance by decoupling the computationally heavy proof verification process, allowing for faster and more precise operations free from the constraints of multi-purpose chains. It seamlessly integrates with existing blockchain ecosystems and multiple zero-knowledge proving schemes, minimizing technical overhead with a developer-friendly environment.

I’d encourage anyone who is looking to learn more about zkVerify to check out their website: https://zkverify.io/

Question: How could the successful implementation of ZenIP 42406 impact the value and adoption of the ZEN token? What are the potential economic benefits for ZEN holders and the broader Horizen ecosystem as a result of the migration and the new EON 2.0 parachain?

Answer: I won’t go into details on any price predictions with this answer, but I will say that there is a lot to unpack with the passing of ZenIP 42406. 

At a high level, Horizen ($ZEN) and EON are currently built on older technology stacks. The Horizen Mainchain is a fork of the Bitcoin C++ codebase with a block time of 2.5 minutes, while EON is written in Scala based on the Scorex SDK with an 18-second block time. To address the limitations of these legacy systems and align with the vision of Horizen as a home for ZK and an enduring utility for $ZEN, Horizen Labs proposed to migrate to Horizen EON 2.0 with ZenIP 42406. 

Essentially, Horizen/EON 2.0 will have a new target block time of ~6 seconds, be built on the Substrate framework written in Rust, be ZK-optimized, and maintain backward compatibility.

On ZK-Optimization: 

Leverage fast, native zk proof verification from zkVerify Protocol
Modular and upgradable architecture for future zk advancements, such as new proving systems

On Backward Compatibility: 

Preserve max $ZEN supply
Full Solidity and EVM support
$ZEN, EON snapshots
Prolonged Claim Window
Enable Incentives for Ecosystem Partners

In Addition: 

-Enhanced Performance: Incorporating the latest advancements in zero-knowledge proof systems, improving transaction speeds, and increasing security

-Expanded Ecosystem with Full EVM Compatibility: Enhances interoperability with other blockchains, opening up a world of possibilities for cross-chain collaborations and integrations. This means more opportunities and partnerships for the Horizen community

-Community Growth: With these improvements, we are fostering a better environment for developers and web3 users to join our network, making Horizen stronger and more vibrant than ever before!

I’d encourage those who are interested to learn more about ZenIP 42406 and everything it means for the Horizen Ecosystem to check out our high-level blog post which provides an overview to the proposal: https://blog.horizen.io/zenip-42406-eon-and-zen-migration-high-level-overview/

Question: How do @HorizenLabs and @ZKVProtocol plan to stay ahead of the curve in the rapidly evolving crypto landscape?

Answer: Horizen Labs has an incredible research and product marketing team that works extensively to stay up to date with the latest and greatest in the industry. Many of our Cryptographers and Engineers actively contribute to various research projects and initiatives, both within and outside of Horizen Labs. 

We also leverage our vast network of connections and industry partners for this, which helps us to stay ahead of the curve and to learn of the latest developments as they are emerging in the space. Our Business Development team is constantly out in the field discussing with prospective partners and projects to see what their needs are, and how we can help to address them, providing a valuable feedback loop for our team.

Question: In what ways can developers take advantage of Horizen’s fully EVM-compatible blockchain to seamlessly migrate or build their Ethereum-based dApps, and what tools and resources are available to support this process?

Answer: The Horizen Ecosystem is quite vibrant. I’d encourage everyone to go check out the full list of our ecosystem: https://eon.horizen.io/

There are many developer tools available on Horizen EON currently, such as Thirdweb, Remix, Truffle, and Hardhat. The Ecosystem includes infrastructure, bridges, an NFT Marketplace, many different Wallet Integrations, Oracles, DeFi integrations such as SpookySwap, Interport, Yuzu, and more!

In addition, there is extensive documentation available for Horizen EON that explains how to get and connect a wallet, receive test tokens, conduct a forward or backward transfer and learn more about how to set up forger nodes. Be sure to check out our documentation hub: https://docs.horizen.io/

To answer your specific question, we have a tutorial available that explains how to deploy a dApp using the Remix IDE here as well: https://docs.horizen.io/horizen_eon/tutorials/todolist

Question: A strong community not only brings interesting ideas to the project but also attracts larger partners. So how is the initiative planning to build its community? And is there a plan to recruit people with blockchain experience to the team?

Answer: I figured this would be a great way to end this text AMA, and I wanted to provide as many relevant links as possible for this community to learn more about Horizen, zkVerify, and Horizen Labs, and to join our communities. 

Horizen: 

Horizen Website: https://www.horizen.io/
Horizen Twitter: https://x.com/horizenglobal
Horizen Discord: https://horizen.io/invite/discord
Horizen Discourse: https://horizen.discourse.group/
Horizen Telegram: https://t.me/horizencommunity
Horizen Documentation: https://docs.horizen.io/

Horizen Labs: 

Horizen Labs Website: https://horizenlabs.io/
Horizen Labs Twitter: https://x.com/HorizenLabs

zkVerify: 

zkVerify Website: https://zkverify.io/
zkVerify Twitter: https://x.com/ZKVProtocol
zkVerify Discord: https://discord.gg/zkverify
zkVerify Telegram: @zkverify
zkVerify Documentation: https://docs.zkverify.io/

On both the Horizen and zkVerify Discord servers there are many different channels tailored to various languages and communities around the globe, such as French, Korean, Chinese, Turkish, Spanish, Portuguese, and more! 

The Horizen Community has many dedicated and passionate Ambassadors who help to moderate the Discord Server and other community channels, provide community support, answer questions, and are amazing resources for our community. The Horizen Labs team is constantly working to build vibrant communities across the Horizen ecosystem, the zkVerify ecosystem, and more! Our teams are always available to provide community support, answer questions, and provide guidance.

Question: How can we use your platform? Do you have any guide for beginners?

Answer: Great Question! 

Breaking it down for both zkVerify and Horizen. 

For zkVerify, I’d encourage anyone interested in learning more about the platform and starting to test to check out our documentation hub: https://docs.zkverify.io/

We have lots of good tutorials and documentation there for users to get started. The tutorials are all step by step to ensure seamless onboarding!

For Horizen, I’d also encourage users to check out our Get Started page; we have useful links there for users to get started with Horizen EON!

https://eon.horizen.io/app/start-here

Question: Where can I get the latest updates or more information about the project?

Answer: This is a great way to wrap up this text AMA! 

Twitter (or X, as it’s now called) is the best resource for information about zkVerify, Horizen, and Horizen Labs.

I would encourage this community to follow our accounts, to be sure to stay up to date on the latest and greatest!

Horizen: https://x.com/horizenglobal

Horizen Labs: https://x.com/HorizenLabs

zkVerify: https://x.com/ZKVProtocol

The post StealthEX AMA Recap appeared first on Horizen Blog.


Circle Press

Circle is First Global Stablecoin Issuer to Comply with MiCA

Circle’s French entity is launching USDC and EURC issuance in the EU in compliance with one of the world’s most comprehensive regulatory regimes for digital assets

Friday, 28. June 2024

Panther Protocol

Dark Pools for Institutional Crypto Users: Challenges and Innovations

Privacy is critical for institutional crypto traders. Trades on public blockchains expose institutions to risks, including strategy theft, front-running, and MEV bots. Dark pools offer institutions a promising way to execute large trades privately. However, their adoption has been slow. One major issue is the association between Privacy Enhancing Technology (PET) and illegal activities like money laundering and terrorist financing, as seen with Tornado Cash. Other problems include concerns about compliance with regulations, inefficiencies in handling large transactions, poor integration with existing financial systems, limited scalability, and the risk of censorship.

This blog will discuss the reasons behind the slow adoption of dark pools by institutional users such as asset managers, market makers, and broker-dealers, their benefits, and how Panther Protocol will help resolve these issues. Panther is building modular, compliance-supportive DeFi access infrastructure for regulated financial market operators. The protocol will include customizable private trading Zones, strong compliance support, direct access to major decentralized exchanges, and privacy-preserving transaction methods to improve digital asset management security, efficiency, and regulatory compliance.

*For brevity, organizations with users such as asset managers, market makers, and broker-dealers are all referred to as institutions throughout this blog. 

Reasons behind the slow adoption of dark pools by institutions

Lack of regulatory compliance

Inadvertently facilitating illegal activities can result in severe penalties, reputational damage, and operational disruptions for institutions. AML and KYC regulations are therefore of utmost importance, ensuring that bad actors, such as sanctioned individuals and entities, are kept off their platforms.

Many dark pools in DeFi, such as Tornado Cash, do not align with regulatory standards. This non-compliance poses significant risks, as seen with Tornado Cash being sanctioned by the U.S. Office of Foreign Assets Control (OFAC) and other regulators around the world due to its use by malicious actors like the Lazarus Group. This has led to many institutions being cautious about using dark pools due to potential legal repercussions and the risk of funds being associated with illicit activities.

Need for liquidity and handling of large transactions

Institutions need solutions that ensure privacy without compromising transaction efficiency. Large trades can see significant slippage or delay if a Dark Pool faces liquidity issues. Any perceived transaction inefficiency naturally leads to challenges with adoption. 

Poor integration with existing financial systems

Institutions prefer solutions that easily integrate with their existing systems, allowing for seamless interoperability and efficient workflows. Poor integration capabilities hinder adoption and can lead to operational bottlenecks. Currently, most of the dark pools available in the market lack integration with existing financial systems and tools. This creates a barrier for institutions that need to incorporate these solutions into their broader financial operations.

Vulnerability to censorship 

For institutions, the risk of transaction censorship is unacceptable. These investors need robust mechanisms to ensure their transactions are processed fairly and securely without being subject to censorship. A number of high-profile regulators, including the U.S. Treasury, have proposed rules that identify virtual currency mixing as a class of transactions of primary money laundering concern. Tornado Cash is a prime example of how transactions can be excluded by OFAC-compliant block builders.

Why institutions should care about dark pools

Privacy and confidentiality

One reason the adoption of Web3 dark pools has been slow is a set of misconceptions related to the use of CEXs. While CEXs can provide partial privacy by breaking transaction links, they also custody their users’ assets, introducing exposure to risks ranging from fraud to security failures. Crypto exchanges have a history of high-profile custody issues, such as FTX, whose litany of deceptive practices, including lending its customers’ assets to Alameda Research, led to one of the worst crises in crypto’s history. Other high-profile examples include Mt. Gox, which lost approximately 850,000 BTC to hacking incidents; Bitfinex’s infamous security breach, which resulted in the theft of roughly 120,000 BTC; and QuadrigaCX, where the sudden death of the CEO, the sole person with access to the exchange’s private keys, led to the loss of $190M in customer funds.

Many institutional users find the risks associated with centralized custody unacceptable, particularly given CEXs only offer partial privacy. Dark pools provide essential confidentiality for institutions. In both traditional finance and decentralized finance (DeFi), large trades can significantly impact market prices and reveal trading strategies if executed publicly. Dark pools prevent market manipulation and front-running by hiding the details of your trades. This protection is essential for institutional investors to execute large transactions without other market participants exploiting this information. The ability to trade anonymously ensures that institutions can protect their trading strategies and investment returns from manipulative practices.

MEV Protection

MEV bots exploit the transparency and structure of blockchain transactions to extract additional value, often at the expense of other users. Their activities can lead to higher transaction costs, market manipulation, network congestion, and financial losses for other users of the platform. Dark pools such as Panther are using multiple approaches that can significantly reduce MEV risks. While no solution is entirely foolproof, the following strategies, currently under consideration for Panther Protocol’s designs, will help create a more secure trading environment:

Approach 1: Offchain Order Book with P2P Library

Using this design choice, users communicate and negotiate trade details offchain using a peer-to-peer (P2P) library. For example, User A submits an offer, and User B handles the matching. They generate cryptographic proofs of the transaction offchain, and only one user submits a single transaction onchain for settlement via a smart contract. This approach minimizes onchain data exposure, reducing the information available to MEV bots and thereby lowering the chances of exploitation.

Approach 2: Onchain Order Book with Onchain Interaction

This design choice involves maintaining the order book and interactions entirely onchain. Users generate proofs and submit transactions onchain with a smart contract performing the settlement. Although this approach benefits from transparency and simpler implementation, it exposes trade details on the blockchain, making it more susceptible to MEV exploits.

Approach 3: Hybrid Model

A hybrid approach combines the strengths of both offchain and onchain methods. Sensitive data like user identities and trade amounts are exchanged offchain, while only cryptographic proofs are submitted onchain. This limits the onchain exposure to essential verification elements, enhancing privacy and reducing the likelihood of MEV attacks.
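To make the hybrid model more concrete, here is a minimal, illustrative Python sketch, not Panther’s actual design: trade details are agreed offchain, and only an opaque commitment is published for settlement. A plain hash commitment stands in for the zero-knowledge proof a real system would generate, and every field and value shown is hypothetical.

import hashlib
import json
import secrets

def commit_trade(trade: dict, salt: bytes) -> str:
    """Return a hash commitment to the trade details.

    A real dark pool would publish a zero-knowledge proof instead of a bare
    hash, but the privacy intuition is the same: the chain sees only the
    commitment, never the amounts or counterparties.
    """
    payload = json.dumps(trade, sort_keys=True).encode() + salt
    return hashlib.sha256(payload).hexdigest()

# Offchain: the two parties negotiate and keep the details private.
trade = {"sell": "ETH", "buy": "USDC", "amount": 250, "price": 3400}
salt = secrets.token_bytes(16)

commitment = commit_trade(trade, salt)
print("Published onchain:", commitment)        # opaque commitment only
print("Kept offchain:", trade, salt.hex())     # full details stay private

The random salt prevents anyone from brute-forcing the commitment by guessing common trade parameters; in practice, the proof system and settlement contract would enforce far richer constraints than this toy example.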

Dark pools inherently offer resistance to MEV bots because they typically involve peer-to-peer transactions and minimize onchain exposure, making it difficult for MEV bots to gather the information they need to exploit trades. In summary, by leveraging offchain communication, cryptographic proofs, and strategic onchain interactions, dark pools provide a robust framework for reducing MEV risks and enhancing transaction privacy.

Market efficiency and liquidity management

TradFi dark pools face significant reputational issues due to their history of exploiting information advantages to the detriment of their clients. Unlike traditional dark pools, on-chain dark pools such as Panther are working to contribute to market efficiency by allowing large trades to be executed without causing significant price movements. This stability benefits institutions that need to manage liquidity effectively without impacting market conditions. By enabling large, discrete trades, dark pools help institutions maintain optimal portfolio balances and manage risk efficiently. Efficient liquidity management is critical for institutions, and dark pools provide a platform where large orders can be matched without slippage or adverse market impacts. Mechanisms such as peer-to-peer order matching and liquidity aggregation from various sources ensure that institutions can access necessary liquidity while maintaining trade secrecy. This capability is vital for managing large transactions seamlessly.

Enhanced security and trust

Security is paramount for institutional investors, and dark pools offer a secure environment for executing trades. Privacy-enhancing technologies like zero-knowledge proofs ensure that trade data remains confidential and secure from external threats. Robust security protocols in dark pools build trust among institutional investors. Features like remote attestation and secure multi-party computation enhance the security of dark pool transactions, ensuring that the system remains trustworthy and reliable for institutional use. This trust is essential for institutions to engage in large-scale trading activities confidently.

How Panther will enable private, compliant DeFi access for institutions

Panther Protocol is set to enhance privacy, security, and efficiency in digital asset management. Panther is building compliance-enabling DeFi access infrastructure, complete with dark pool functionality for regulated financial entities. Panther Zones will enable institutions to create private trading Zones with customized asset lists, user lists, transaction limits, and access to DeFi applications. This modular approach will allow institutions to tailor their trading environments according to specific regulatory and operational needs. 

Want to learn more about Panther Zones? Reach out to us at contact@pantherprotocol.io


Epicenter Podcast

Fabric Ventures' Investment Thesis Since the Dawn of Bitcoin to AI - Richard Muirhead

It is one thing to invest in early-stage companies, but to do so in a nascent industry that constantly reinvents and rediscovers itself is a whole different venture. Richard Muirhead co-founded Fabric Ventures in 2012 around the thesis of supporting the Open Economy. While at first this meant investing in Bitcoin-related projects, once Ethereum was announced, a new horizon was unveiled, filled with tremendous potential. Since then, the Open Economy thesis has been adapted to match technological breakthroughs, and it is now centered around artificial intelligence, distributed computing, and self-sovereignty (over both fungible and non-fungible assets).

Topics covered in this episode:

Richard’s background and how he founded a crypto VC
Fabric Ventures’ thesis
How the AI thesis evolved
User-generated AI
On-chain autonomous agents
Upcoming DeFi innovations
Web3 trends that took Richard by surprise
Missed opportunities
The impact of crypto liquidity & market cycles on VC investing
How to vet projects in early investment phase

Episode links:

Richard Muirhead on Twitter
Fabric Ventures on Twitter
Fabric Ventures

Sponsors:

Gnosis: Gnosis has been building decentralized infrastructure for the Ethereum ecosystem since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Sebastien Couture & Friederike Ernst.

Thursday, 27. June 2024

Brave Browser

Bring Your Own Model (BYOM): using Brave Leo with your own LLMs

Brave Nightly sets a new standard for AI privacy and customization. Connect your preferred AI model, local or remote, directly to Leo in your browser.
Note: As of August 22, 2024, BYOM is now in general release and available to all desktop users (with browser update version 1.69 or higher).

AI assistants have become a common part of modern Web browsers. Brave has Leo, Edge has Copilot, Opera has Aria, Arc has Max… This new wave of AI integration has raised many questions among developers and users alike. What’s the best way to interact with an AI assistant within the browser? What unique opportunities arise when AI is brought directly into the browsing experience? And, perhaps most importantly: How can we effectively safeguard a user’s data, and allow them to configure their own AI models?

In our Leo roadmap–where we outlined how Brave is thinking about Leo and the future of AI integrations within the browser (aka Browser AI)—we pointed out the need to allow users to run models locally, and to configure their own models. Running models locally (i.e. directly on device) ensures that data like conversations, webpages, and writing prompts exchanged with the AI assistant never leave the user’s device. This approach will also open more opportunities for assisting users with their local data while safeguarding end user privacy. Additionally, by allowing users to configure their own AI models, whether running locally or hosted remotely, Brave empowers users to tailor the AI’s behavior, output, and capabilities to their specific needs and preferences.

Your model, your rules

Our first step towards that promise is called “bring your own model” (or “BYOM” for short). This optional new way of using Leo, Brave’s native browser AI, allows users to link it directly with their own AI models. BYOM allows users to run Leo with local models running safely and privately on their own machines. Users will be able to chat or ask questions on webpages (and docs and PDFs) to Leo without their content ever having to leave the device.

BYOM isn’t limited to just local models, either–BYOM users will also have the flexibility to connect Leo to remote models running on their own servers, or that are run by third-parties (e.g. ChatGPT). This opens new possibilities for users and organizations to leverage proprietary or custom models while still benefiting from the convenience of querying the AI directly within the browser.

With BYOM, the requests are made directly from the user’s device to the specified model endpoint, bypassing Brave entirely. Brave does not act as an intermediary and has absolutely no access to—or visibility into—the traffic between the user and the model. This direct connection ensures end-to-end control for the user.

BYOM is currently available on the Brave Nightly channel for developers and testers, and is targeted to launch in full release later this summer. Note that BYOM is initially for Desktop users only.

Getting started

Using Local Models with BYOM in Leo

No special technical knowledge or hardware is required to try BYOM. The local LLM landscape has evolved so quickly that it is now possible to run performant local models on-device in just a few simple steps.

For example, consider a hobbyist-favorite framework for running local LLMs: Ollama. The Ollama framework supports an extensive list of local models that range from 829MB to 40GB in size.

You can get started using a local LLM in Brave Leo in two easy steps:

Step 1: Download and install Ollama

First, visit https://ollama.com/download to download Ollama. Select your platform and continue with the download. Once you download and unzip the file, make sure to move the application somewhere you can easily access, such as your computer’s desktop or Application folder. You will have to start the serving framework whenever you want to use the local model.

Next, click the application and wait for it to open, and complete the install steps.

After you complete these steps and click Finish, you’ll be able to run Meta’s Llama 3 model on your machine just by opening your terminal and typing:

ollama pull llama3

You should see that the model manifest and the model are being downloaded (note this step may take a while to complete, depending on the file size and your connection speed). Once the model has been pulled successfully, you can close your terminal.

Llama 3, at 4.7GB for an 8B parameter model, strikes a good balance between quality and performance, and can be easily run on a laptop with at least 8GB of RAM. Other models we found suitable to be run locally are Mistral 7B (by Mistral AI) and Phi 3 Mini (by Microsoft). See the full list of models supported by Ollama.

Now that the model is running locally on your device, no data is being transmitted to any third party. Your data remains yours. 

Step 2: Plug your model into Leo

To add your own local model to Leo, open the Brave browser and visit Settings, and then Leo. Scroll to the Bring your own model section and click Add new model.

You’ll then be brought to a new interface where you can add the details of your model. The required fields are the following:

Label: The name of the model as it will appear in the model selection menu.

Model request name: The name of the model as it should appear in the request to the serving framework, e.g. llama-3 (note that if this name doesn’t match exactly what is expected by the serving framework, the integration will not work).

Server endpoint: The URL where your serving framework is “listening” for requests. If you’re not sure, check the serving framework’s documentation (for Ollama, this is always http://localhost:11434/v1/chat/completions).

API Key: Third-party frameworks may require authentication credentials, such as an API key or an access token. These will be added to the request header.

Click Add model; your local model should now appear in the Leo model menu. Simply select it to use it to power your browser AI.

Note that while we’ve used Ollama in this section, as we think it’s one of the most user-friendly frameworks to set up and run local models, the BYOM feature can be used with any local serving framework with an exposed endpoint and that conforms to the OpenAI chat protocol.
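If you want to confirm that your serving framework is listening and speaking the OpenAI chat protocol before adding it to Leo, a small script can help. Here is a rough Python sketch, assuming Ollama is running locally with the llama3 model pulled as described above (adjust the model name to whatever you entered in the Model request name field):

import json
import urllib.request

# The same endpoint you would enter in Leo's "Server endpoint" field for Ollama.
url = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # must match what the serving framework expects
    "messages": [{"role": "user", "content": "Say hello in five words."}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The OpenAI chat protocol returns the reply under choices[0].message.content.
print(body["choices"][0]["message"]["content"])

If this prints a short reply, Leo should be able to reach the same endpoint with the same model name.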

Using Remote Models (ChatGPT) with BYOM in Leo

To connect Leo to a remote model or a third party API, such as OpenAI API, follow these steps:

Enter https://api.openai.com/v1/chat/completions as the endpoint, and add the name of your desired model (e.g. gpt-4o; see the model list) in the Model request name field.

Enter your private API key in the Authentication Credentials field to authenticate your requests to the API. The key is then stored safely in your browser.

Note that—at the time of writing—Brave Leo only supports textual inputs and outputs; outputs of any other modality will not be processed. You can now use Brave Leo with ChatGPT through your OpenAI API account.
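For reference, a remote endpoint configured this way is queried over the same OpenAI chat protocol, with your credentials added to the request header. Here is a small sketch you could use to verify your key and model name before entering them in Leo (the environment variable name is just an example):

import json
import os
import urllib.request

url = "https://api.openai.com/v1/chat/completions"
api_key = os.environ["OPENAI_API_KEY"]  # your private key; avoid hard-coding it

payload = {
    "model": "gpt-4o",  # same value as Leo's "Model request name" field
    "messages": [{"role": "user", "content": "Reply with a one-sentence test."}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",  # credentials travel in the request header
    },
)

with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["choices"][0]["message"]["content"])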

If you’re a Brave Nightly user, please give BYOM a try, and let us know what you think on X @BraveNightly or at https://community.brave.com/!


What Manifest V3 means for Brave Shields and the use of extensions in the Brave browser

Brave is committed to maintaining Manifest V2 support, allowing users to keep their favorite extensions functioning beyond the industry-wide deprecation.

In late 2021, Google first announced plans to deprecate Manifest V2 (MV2), the longstanding Chrome extension manifest file format, and force extensions to be built using Manifest V3 (MV3) going forward. What does this mean for Brave users? In short:

Manifest V3 will not weaken Brave Shields in any way

For as long as we’re able (and assuming the cooperation of the extension authors), Brave will continue to support some privacy-relevant MV2 extensions—specifically AdGuard, NoScript, uBlock Origin, and uMatrix

Brave Shields block ads and trackers by default, and they’re built natively in the Brave browser—no extensions required. Since Shields are patched directly onto the open-source Chromium codebase, they don’t rely on MV2 or MV3.

Thanks to this independence, Google’s forced removal of MV2 will not weaken Brave Shields. The filter lists (such as EasyList and EasyPrivacy) we rely on to protect users from invasive ads and trackers are open for community contribution, and we expect the privacy community at large to continue maintaining these lists. Brave’s privacy research and engineering teams will do so as well.

No matter what happens with the deprecation of MV2 and the shift to MV3, Shields will continue to offer better, more stable protection than extensions.

Will MV2 extensions still work in Brave?

Yes, for now. We recognize the importance of supporting existing Manifest V2 extensions. We have force-enabled Manifest V2 support in the Brave browser, ensuring that you can continue to use your favorite extensions without interruption. In June 2025, Google plans to remove all remaining Manifest V2 items from the Chrome Web Store. While Brave has no extension store, we have a robust process for customizing (or “patching”) atop the open-source Chromium engine. This will allow us to offer limited MV2 support even after it’s fully removed from the upstream Chromium codebase.

Which MV2 extensions will work in Brave?

As of now, the MV2 extensions we plan to explicitly support are AdGuard AdBlocker, NoScript, uBlock Origin, and uMatrix. This feature will be best-effort: we might have to modify support based on either Google’s plans or what extension authors ultimately decide to do. If extensions become stale or obsolete, we may remove support for them rather than offer our users an out-of-date (potentially even unsafe) experience.

We’re gradually rolling out a new page in Settings that lists these extensions. Once you have the update, you will see it in brave://settings/extensions.

What’s the issue with Manifest V3?

As a recap, Manifest V3 restricts the blocking capabilities of Web extensions, making it harder for privacy-enhancing extensions such as uBlock Origin to protect users. The privacy community has already begun releasing experimental versions of ad block extensions that are rebuilt from the ground up to meet the new constraints of MV3: AdGuard AdBlocker MV3 Experimental and uBlock Origin Lite are just two examples. And we’ve seen that these experimental versions do block ads and other unwanted Web content. However, MV3 imposes new limitations, such as a cap on blocking rules, the removal of background scripts, and changes around cosmetic filtering. Overall, these new MV3 extensions lean on workarounds to solve problems MV3 itself introduces.

The increased importance of user-first browsers such as Brave

Brave provides best-in-class privacy, no extensions required. We’re also actively working on new features such as procedural filtering, which will give us more flexibility in blocking invasive ads and trackers, ensuring a cleaner and safer browsing experience for our users. If you’re not able to use Brave, browser extensions can be helpful to achieve some (though not all) of Brave’s protections in other, less private browsers. Most ad block developers put a lot of thought and care into the performance and safety of their extensions, and they can provide some additional privacy.

Brave also builds a host of other privacy protections adjacent to ad blocking, such as fingerprint randomization, ephemeral storage partitions, and more; such protections would be impossible in an extension-based solution. By building natively in the Brave browser, we’re saying these features are table stakes, not something that should be offered as an optional add-on.

While Brave will continue to offer limited support for MV2 extensions, the real solution is to use Brave’s industry-leading, native features. All are available by simply downloading the Brave browser.

For more info, you can read Google’s updated timeline for deprecation of MV2.

Wednesday, 26. June 2024

a16z Podcast

Cybersecurity's Past, Present, and AI-Driven Future

Is it time to hand over cybersecurity to machines amidst the exponential rise in cyber threats and breaches?

We trace the evolution of cybersecurity from minimal measures in 1995 to today's overwhelmed DevSecOps. Travis McPeak, CEO and Co-founder of Resourcely, kicks off our discussion by discussing the historical shifts in the industry. Kevin Tian, CEO and Founder of Doppel, highlights the rise of AI-driven threats and deepfake campaigns. Feross Aboukhadijeh, CEO and Founder of Socket, provides insights into sophisticated attacks like the XZ Utils incident. Andrej Safundzic, CEO and Founder of Lumos, discusses the future of autonomous security systems and their impact on startups.

Recorded at a16z's Campfire Sessions, these top security experts share the real challenges they face and emphasize the need for a new approach. 

Resources: 

Find Travis McPeak on Twitter: https://x.com/travismcpeak

Find Kevin Tian on Twitter: https://twitter.com/kevintian00

Find Feross Aboukhadijeh on Twitter: https://x.com/feross

Find Andrej Safundzic on Twitter: https://x.com/andrejsafundzic

 

Stay Updated: 

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

 


Circle Press

Circle Supports Responsible French & European Innovation

PARIS – June 26, 2024 – Circle, a global financial technology firm and the issuer of USDC and EURC, has today announced that it has joined Adan. Adan, a leading association bringing together professionals in the digital asset and blockchain sector throughout France and Europe, has over 200 member companies and is the largest grouping of Web3 players in France and Europe.

Tuesday, 25. June 2024

Circle Press

Circle’s New USDC.com Website Provides New Resource for USDC Information

Boston, MA – June 25, 2024 – Circle, a global financial technology firm and issuer of USDC, today launched USDC.com, a dedicated website to educate USDC users and those interested in learning more about USDC and where to access it. The website also serves as a resource to access USDC reserve transparency and circulation information.

Friday, 21. June 2024

Epicenter Podcast

Plurality: How Taiwan Managed to Unite Its People Through Tech - Audrey Tang & Glen Weyl

In a world constantly torn by social division amplified by polarizing scissor statements throughout social media, Taiwan conducted a social experiment aimed at strengthening social unity while also embracing diversity. Plurality details how Taiwan’s Digital Minister Audrey Tang and her collaborators achieved inclusive, technology-fueled growth that harnessed digital tools to provide an antidote to information chaos and warfare. The open-source book is living proof that present global challenges can be solved through democratic solutions that embody a decentralised ethos.

We were joined by Audrey Tang and Glen Weyl, co-authors of Plurality, to discuss the social dynamics they studied and how technology can be used to unite rather than divide.

Topics covered in this episode:

How Audrey & Glen met and Plurality’s genesis
Audrey’s journey from civic hacker to Taiwan’s Digital Affairs Minister
How democracy is perceived around the world
Establishing a co-creating mentality
Scissor statements and how to avoid division
How Polis works
Leveraging Web3 to strengthen democracy & social collaboration
Decentralised co-ownership
Web3 governance
Human facilitators

Episode links:

Audrey Tang on Twitter
Glen Weyl on Twitter
Plurality Book on Twitter
Plurality Institute on Twitter
Radical xChange

Sponsors:

Gnosis: Gnosis has been building decentralized infrastructure for the Ethereum ecosystem since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Friederike Ernst.

Wednesday, 19. June 2024

a16z Podcast

The Science and Supply of GLP-1s

Brooke Boyarsky Pratt, founder and CEO of knownwell, joins Vineeta Agarwala, general partner at a16z Bio + Health.

Together, they talk about the value of obesity medicine practitioners, patient-centric medical homes, and how Brooke believes the metabolic health space will evolve over time.

This is the second episode in Raising Health’s series on the science and supply of GLP-1s. Listen to last week's episode to hear from Carolyn Jasik, Chief Medical Officer at Omada Health, on GLP-1s from a clinical perspective.

 

Listen to more from Raising Health’s series on GLP-1s:

The science of satiety: https://raisinghealth.simplecast.com/episodes/the-science-and-supply-of-glp-1s-with-carolyn-jasik

Payers, providers and pricing: https://raisinghealth.simplecast.com/episodes/the-science-and-supply-of-glp-1s-with-chronis-manolis

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 14. June 2024

Epicenter Podcast

'Ethereum Needs Polygon's Aggregation Layer to Scale' - Brendan Farmer & Sandeep Nailwal

As more and more L2s launch promising to scale Ethereum, they end up competing for the same market share, userbase and liquidity. Apart from this harsh reality, crosschain interactions should be as seamless as possible in order to bridge different L2 ecosystems. Polygon envisions a scalable future in which various zero knowledge rollups post their proofs on an aggregation layer before settling on Ethereum, thus lowering latency and transaction costs as crosschain interactions take place expeditiously, without involving the L1 mainnet.

We were joined by Sandeep Nailwal & Brendan Farmer, to discuss Polygon’s aggregation layer and how it aims to solve the current fragmentation of Ethereum L2 scaling solutions.

Topics covered in this episode:

Polygon’s ZK expansion and the acquisition of Mir Protocol
Polygon’s aggregation layer
Block building between different L2s
Shared sequencing & asynchronous sequencing
Security guarantees of the aggregation layer
Sequencer decentralisation & censorship resistance
Chains using Polygon’s aggregation layer
Pessimistic proofs
Can optimistic rollups be included in the aggregation layer?
Type 1 prover and Plonky3
The evolution of ZKP systems
Ensuring ZK rollup integrity
The future of scalability

Episode links:

Sandeep Nailwal on Twitter
Brendan Farmer on Twitter
Polygon on Twitter

Sponsors:

Gnosis: Gnosis has been building decentralized infrastructure for the Ethereum ecosystem since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io
Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Friederike Ernst.


Panther Protocol

Why asset managers should care about privacy

Proprietary investment strategies have always been among the most closely guarded secrets in finance. Traditionally, these strategies were protected by the nature of how trades are transacted, giving brokers and financial institutions the ability to leverage their proprietary strategies to their advantage. The rise of blockchain and decentralized finance (DeFi) has introduced a new level of transparency. This openness, while fostering trust, poses significant risks as it exposes sensitive trading patterns and strategies to public scrutiny.

Exposure of investment strategies

In traditional finance, investment strategies are often the real value of what is being provided to the client or fund. Hedge funds, asset managers, and brokers invest significant resources into developing proprietary trading strategies that provide them a competitive edge. However, in the realm of blockchain and decentralized finance (DeFi), the inherent transparency of public ledgers poses a significant threat to these proprietary strategies.

Every transaction on public blockchains like Ethereum and Bitcoin is recorded in a publicly accessible ledger. While this transparency promotes trust and verification, it also means that anyone can analyze transaction data to uncover trading patterns and strategies. This exposure can be detrimental to asset managers,  brokers and financial institutions that rely on the confidentiality of their trading methods.

Blockchain analytics tools such as Nansen have made it increasingly easy to scrutinize and interpret blockchain data. Nansen, for instance, labels and analyzes millions of wallet addresses, providing insights into the activities of various market participants. These tools can track large transactions, identify trends, and even link transactions to specific entities. For asset managers, this means that their trades can be monitored virtually in real time, and their strategies could be reverse-engineered by competitors or malicious actors.

When proprietary trading strategies are exposed, the risk of financial loss increases substantially. Competitors can replicate successful strategies, diluting competitive advantage. Worse still, malicious actors can engage in counteracting strategies such as front-running, where they anticipate trades and execute their own trades in advance to profit from the price movements triggered by the asset manager’s transactions.

For example, if an asset manager plans to purchase a large amount of a specific cryptocurrency, front-runners can detect the initial buying activity and buy the cryptocurrency themselves, driving up the price. By the time the purchase is complete, the price has increased, causing the asset manager to pay more than intended. Once the asset manager’s purchase is complete, the front-runners can sell their holdings at a profit, leaving the asset manager at a disadvantage.

In addition to financial losses, the exposure of investment strategies undermines the trust and credibility of asset managers and other financial professionals. Clients expect their asset managers, brokers and other advisors to maintain the confidentiality of their strategies and transactions. If asset managers cannot guarantee this privacy, they risk losing clients to non-crypto or private alternative investments. 

Front-running

One of the most significant issues that comes with transparent blockchain ledgers is front-running. Front-running occurs when malicious actors observe patterns suggesting that a transaction is imminent and execute their own trades ahead of it to capitalize on the expected price movement. For example, if an asset manager intends to buy a large amount of a particular token, front-runners might detect the client’s typical precursory activity and purchase the cryptocurrency first. This buying activity drives up the price, and by the time the manager’s transaction is completed, the portfolio ends up paying a higher price than anticipated. The front-runners then sell their holdings at the increased price, profiting from the trade while the portfolio suffers from the inflated costs.
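To put rough numbers on that effect, here is a toy constant-product AMM model in Python. The pool sizes and trade amounts are purely illustrative, not drawn from any real market, but they show how a front-runner's buy pushes up the price the manager ultimately pays:

def buy(pool_token, pool_usd, usd_in):
    """Swap usd_in into the token on a constant-product (x * y = k) pool.

    Returns (tokens_received, new_pool_token, new_pool_usd). Fees are ignored.
    """
    k = pool_token * pool_usd
    new_pool_usd = pool_usd + usd_in
    new_pool_token = k / new_pool_usd
    return pool_token - new_pool_token, new_pool_token, new_pool_usd

pool_token, pool_usd = 10_000.0, 10_000_000.0   # token starts near $1,000

# Case 1: the manager's $2M buy executes untouched.
got_clean, _, _ = buy(pool_token, pool_usd, 2_000_000)

# Case 2: a front-runner buys $500k first, then the manager's $2M executes.
_, pt, pu = buy(pool_token, pool_usd, 500_000)
got_frontrun, _, _ = buy(pt, pu, 2_000_000)

print(f"Tokens received without front-running: {got_clean:,.1f}")
print(f"Tokens received after being front-run: {got_frontrun:,.1f}")
print(f"Extra cost per token: {2_000_000 / got_frontrun - 2_000_000 / got_clean:,.2f} USD")

In this toy example the manager's average execution price rises from about $1,200 to roughly $1,312 per token, and the front-runner can then sell into that inflated price.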

This lack of privacy not only results in financial losses but also erodes an asset manager’s ability to execute effective trading strategies. With open blockchains, competitors can analyze trading patterns, replicate successful strategies, or even devise counter-strategies that exploit clients’ positions. This constant surveillance can lead to a loss of competitive edge, as asset managers, brokers and others are unable to maintain the secrecy of their clients’ strategies.

MEV Bots

MEV (Maximal Extractable Value) bots underscore the importance of privacy in the trading strategies used by asset managers. These bots are sophisticated algorithms that monitor blockchain networks for pending transactions and exploit opportunities to extract additional value. By analyzing transaction data in real time, MEV bots can execute trades such as arbitrage, front-running, and sandwich attacks, allowing them to profit from price discrepancies and transaction sequencing. The transparency of blockchain networks, while beneficial for trust and verification, makes it easier for these bots to operate, emphasizing the necessity of maintaining transaction confidentiality for all users; asset managers who engage in high-volume or high-frequency trades on behalf of clients are at particular risk. According to Flashbots, MEV bots extract around $500M per year from Ethereum alone.

Options are limited

As of this writing, your options as an asset manager are limited if you wish to preserve your privacy and that of your clients. Many of today’s best-known privacy-enhancing technologies can create regulatory issues. For example, the virtual currency mixer Tornado Cash was infamously sanctioned by the U.S. Treasury and faced investigations and actions from other regulators around the world, due to its involvement in illicit activities. Regulators recognize that there is an opportunity for privacy-enhancement for DeFi trading, and are calling for innovations that can simultaneously preserve the privacy of users while also enabling AML/CFT obligations. It is worth noting that Panther Protocol is being built to address this challenge.  

Panther Protocol

Panther’s cutting-edge decentralized privacy protocol will empower enterprises with customizable, private operating environments to manage portfolios and assets. Leveraging Zero-Knowledge proofs and a proprietary Multi-Asset Shielded Pool (MASP) design, Panther Protocol will deliver a suite of features that provide unparalleled privacy, security, and compliance support for public on-chain transactions. 

Benefits For Asset Managers (In development)

Private Portfolio Management: Supports confidential client portfolio management, ensuring transaction details and holdings remain private, maintaining investor confidentiality.

Custom Investment Strategies: Tailored investment access with specific DeFi applications and asset lists, optimizing returns while adhering to client risk profiles.

Compliance and Reporting: Encrypted data storage for transactional details, simplifying compliance and reporting processes with secure transaction detail exports.

Conclusion

The transparency of blockchain technology presents both opportunities and challenges for asset managers and financial institutions. While it promotes trust and accountability, it also exposes proprietary strategies to potential competitors and malicious actors. This exposure necessitates that asset managers take steps to enhance their privacy measures and stay vigilant against threats like front-running. Navigating this landscape requires balancing the benefits of blockchain transparency with the need to protect confidential trading strategies.


a16z Podcast

The State of AI with Marc & Ben

In this latest episode on the State of AI, Ben and Marc discuss how small AI startups can compete with Big Tech’s massive compute and data scale advantages, reveal why data is overrated as a sellable asset, and unpack all the ways the AI boom compares to the internet boom.

 

Subscribe to the Ben & Marc podcast: https://link.chtbl.com/benandmarc

 

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Thursday, 13. June 2024

Horizen - Blog

ZenIP 42406: EON and $ZEN Migration – High Level Overview

On June 6th, the official ZenIP for the technical roadmap of the migration of $ZEN and EON went live on Discourse, there’s a lot to cover on it, so we wanted to provide a high level overview of the proposal and what it means for the Horizen Ecosystem. 

At a high level, Horizen ($ZEN) and EON are currently built on older technology stacks. The Horizen Mainchain is a fork of the Bitcoin C++ codebase with a block time of 2.5 minutes, while EON is written in Scala based on the Scorex SDK with an 18-second block time. To address the limitations of these legacy systems and align with the vision of Horizen as a home for ZK and an enduring utility for $ZEN, Horizen Labs is proposing to migrate to EON 2.0.

Key Points

EON 2.0 will be a fully EVM-compatible chain. This includes backward compatibility to provide a smooth transition for existing $ZEN holders and EON developers, covering incentives, emissions, staking, partner integrations, exchange continuity, and EVM compatibility. Existing smart contracts on EON will be transitioned over, including $ZEN in current liquidity pools.

This is a technical ZenIP, laying out the technical roadmap for the migration mechanics for $ZEN and EON. Tokenomics are not in scope: topics such as allocations and emissions are explicitly excluded since they will be covered in greater detail in a separate proposal.

It will be a snapshot-based migration, where comprehensive state snapshots of both EON 1.0 and Horizen will be taken to ensure a seamless transition to EON 2.0. We will provide the ability for anyone to independently verify the integrity of both snapshots.

For Mainchain $ZEN, there will be a claim process over time, where eligible users can claim their old Mainchain $ZEN by proving ownership of their address. $ZEN on EON will not require any claim, since balances will be migrated with the state snapshot.

EON 2.0 will be built using the Substrate framework (written in Rust) with tight integration with the zkVerify protocol (which is also built on Substrate) for fast and cost-efficient verification of zk proofs. Specifically, EON 2.0 will be a parachain connected to the zkVerify Relay Chain. It is not associated with the Polkadot ecosystem.

Visual Representation of the Migration Proposal

Voting Timeline

Voting for ZenIP 42406 will start on Tuesday, June 18th, 2024 at 12pm EST and end on Friday, June 21st, 2024 at 12pm EST. The snapshot will be taken as soon as the vote goes live. In order to prepare for the vote, please be sure to follow our instructions here: https://blog.horizen.io/how-to-vote-on-zenips-using-your-staked-zen/

Frequently Asked Questions

What will be the consensus of the new EON chain? Delegated Proof of Stake, the same as the previous EON chain.

Is $ZEN going to be the block reward on the new chain? Yes, but allocations, emissions, and tokenomics are not covered in this technical ZenIP; they will be addressed in a separate and dedicated proposal.

If there are no forger nodes, will there be some other type of staking? Yes, there will be staking on EON collators.

What will be EON’s connection to zkVerify? EON 2.0 will integrate tightly with the zkVerify protocol, ensuring fast and cost-efficient verification of zk proofs. zkVerify will operate as its own relay chain, and EON 2.0 will utilize the Substrate framework as a parachain to zkVerify.

What will happen to the Horizen mainchain? It will be deprecated.

What will happen to my mainchain $ZEN? $ZEN will be the native token for EON 2.0. A comprehensive snapshot will be taken, and eligible users can claim their $ZEN through a claiming mechanism over time.

How will exchanges support this deprecation and migration? We will work closely with exchanges to coordinate a seamless upgrade.

How will forger nodes migrate? They will migrate to EON collators; migration steps will be provided.

What will happen to Super Nodes? They will migrate to EON collators and will be rewarded based on the blocks they are chosen to forge.

What wallet will I have to hold $ZEN in? What will happen to Sphere? MetaMask will be used to store $ZEN; Sphere will be deprecated.

As always, we remain available in the Horizen Discord for any community questions, comments, or concerns, please do not hesitate to reach out to us!

The post ZenIP 42406: EON and $ZEN Migration – High Level Overview appeared first on Horizen Blog.


Brave Browser

Leo, Brave’s in-browser AI assistant, now incorporates real-time Brave Search results for even better answers

Leo—Brave's in-browser AI assistant—is now even more useful thanks to its integration with Brave Search.

Leo—Brave’s in-browser AI assistant—is now even more useful, thanks to its integration with Brave Search. With today’s release, Leo can incorporate search-augmented responses into its answers, powered by the Brave Search API. This new integration allows Leo to provide more accurate and up-to-date answers, especially for queries related to current events or topics where the initial language model training may be outdated or lack full context.

Brave Leo with Brave Search results is now available to desktop users with today’s 1.67 browser update, and will be coming to mobile platforms very soon. 

The benefits of real-time Web data in your AI assistant, and how to get started

There are countless instances where you might need real-time information incorporated into your conversation with Leo. For example, if you ask Leo about upcoming events like concerts or sporting events, it can now reference Brave Search results to supplement its response with the timeliest news and information. Or, if you’re researching a complex topic like financial markets, Leo can weave in current data and insights from credible sources to give you a well-rounded perspective.

Importantly, any Leo response that incorporates search results includes links to Brave Search to explore more results, and in an upcoming release will also include links to the sources used to inform the answer. This helps you understand the source of the answers you receive, and fact check before you proceed.

Asking Leo “What are the latest reviews for Inside Out 2?”

Asking Leo “When is Game 4 of the NBA Finals?”

Asking Leo “What’s happening with the stock market today?”

Ready to try Brave Leo? It’s easy: Simply open the browser, begin typing in the address bar, and click “Ask Leo”. For the on-page chat experience, desktop users can click in the sidebar, while mobile users can tap “…” (iOS) or “⋮” (Android) and then tap Leo to get started.

While Leo’s search integration provides valuable context, it’s still important to verify information from third-party sources, especially for critical topics. Brave encourages users to think critically and fact-check key claims against authoritative sources. For this reason, answers that incorporate information from Brave Search are clearly labeled, and include a link to a search results page (SERP) where you can investigate further.

AI assistance that’s both smart and privacy-protecting

By seamlessly blending contextual search results into natural conversations, Leo offers a uniquely powerful yet privacy-preserving AI experience. Chats with Leo are private and secure. Leo never uses your chats for model training and no account or login is required to use Leo. Additionally, Leo includes some specific privacy protections, including:

Reverse proxy: All requests are proxied through an anonymization server so the request and user cannot be linked. Brave cannot associate the user request with their IP address.

Responses discarded: Conversations are not persisted on Brave’s servers. We do not collect identifiers that can be linked to you (such as IP address). Responses generated with Brave-hosted models are discarded after they’re generated, and not used for model training; no personal data is retained by Brave-hosted AI models.

No login or account required for access for the free version: Users do not need to create a Brave account to use Leo.

Unlinkable subscription: If you sign up for Leo Premium, you’re issued unlinkable tokens that validate your subscription when using Leo. This means that Brave can never connect your purchase details with your usage of the product, an extra step that ensures your activity is private to you and only you. The email you use to create your account is unlinkable to your day-to-day use of Leo, making this a uniquely private credentialing experience.

For an added layer of privacy, you can clear all local Leo data stored on your device through the Settings menu.

Learn more about Leo and data privacy in our Privacy Policy.

Using Brave Search to build your own AI applications

Want to connect your AI to the Web? The Brave Search API allows any developer to integrate Brave Search results into their applications. The Brave Search API provides access to an index of billions of pages via Brave’s privacy-preserving Web index, for building everything from search engines to AI apps. Learn more.
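As a rough illustration of what a minimal integration can look like, here is a short Python sketch. The endpoint path, the X-Subscription-Token header, and the response fields below reflect Brave's public Search API documentation as best understood at the time of writing; treat them as assumptions and confirm against the current docs (and your own API key) before relying on them.

import json
import os
import urllib.parse
import urllib.request

# Assumed endpoint and auth header; verify against Brave's current API docs.
api_key = os.environ["BRAVE_SEARCH_API_KEY"]  # example environment variable name
params = urllib.parse.urlencode({"q": "latest privacy-preserving browser news", "count": 5})
url = f"https://api.search.brave.com/res/v1/web/search?{params}"

req = urllib.request.Request(url, headers={
    "Accept": "application/json",
    "X-Subscription-Token": api_key,
})

with urllib.request.urlopen(req) as resp:
    results = json.load(resp)

# Web results are expected under results["web"]["results"].
for item in results.get("web", {}).get("results", []):
    print(item.get("title"), "-", item.get("url"))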

Monday, 10. June 2024

a16z Podcast

Predicting Revenue in Usage-based Pricing

In this episode from the a16z Growth team, Fivetran’s VP of Strategy and Operations Travis Ferber and Alchemy’s Head of Sales Dan Burrill join a16z Growth’s Revenue Operations Partner Mark Regan. Together, they discuss the art of generating reliable usage-based revenue. They share tips for avoiding common pitfalls when implementing this pricing model - including how to nail sales forecasting, adopting the best tools to track usage, and deal with the initial lack of customer data. 

Resources: 

Learn more about pricing, packaging, and monetization strategies: a16z.com/pricing-packaging

Find Dan on Twitter: https://twitter.com/BurrillDaniel

Find Travis on LinkedIn: https://www.linkedin.com/in/travisferber

Find Mark on LinkedIn: https://www.linkedin.com/in/mregan178

Stay Updated: 

Let us know what you think: https://ratethispodcast.com/a16z

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Saturday, 08. June 2024

Panther Protocol

Testnet Stage 6 is now live with improved fee management

Fellow Panthers, First and foremost, thank you to the community of over 3000 testnet users who have been testing and sharing valuable feedback. Your continued support and patience are critically important to our mission. Testnet Stage 5 is now complete, and it provided us with valuable insights, resulting in several

Fellow Panthers,

First and foremost, thank you to the community of over 3000 testnet users who have been testing and sharing valuable feedback. Your continued support and patience are critically important to our mission.

Testnet Stage 5 is now complete, and it provided us with valuable insights, resulting in several updates to our codebase.

We are excited to announce that Stage 6 of our testnet is live. Stage 6 includes simulated fee management, AMM refilling mechanisms, simulated limited subsidy rewards, onboarding enhancements using a bundler service, and several technical improvements to our testing environment.

Note: Panther’s test network is now live on Amoy, Polygon’s test network, and can be accessed here. Due to this network change, users will need to complete the onboarding process again and activate their test zAccount. Users returning with their old accounts, however, will not have to go through the PureFi credential process again and will be moved directly to account activation. This is a one-time activity required to access Panther.

Getting started with Stage 6

To start testing Stage 6, visit here for the updated dApp test link and follow the steps. Here are some of the features you can look forward to testing in Stage 6:

Fees
This release introduces the mechanism for charging different types of fees within the simulated testnet, including Protocol Fees, Network Gas Fees, KYT Fees, and KYC Fees. All simulated fees are paid from the Panther Gas account in tZKP.

AMM Refilling with Protocol Fees
Simulated protocol fees collected during the testing will be deposited into the AMM’s pool of tZKP. 

Limited Subsidy Rewards
The production (mainnet) version of Panther Protocol will include a feature that allows additional rewards to be introduced to partially or fully compensate costs such as Network Gas, KYC, KYT, and Miner fees, in order to encourage use of the platform. Stage 6 of our testnet simulates this feature for testing purposes.

Note: This feature is designed to cover only tZKP-denominated fees, and will not involve test MATIC or other native test-network tokens unless the relayer service is used.

Note: As the protocol developer, Panther Ventures Limited is resuming the testnet after re-deployment on Amoy. Panther Protocol Foundation is finalizing the details of the reward mechanism and will ensure that testers are rewarded in proportion to their efforts; we expect all incentives to be distributed via airdrop. This blog will be updated with more specifics in the coming days.

Disclaimer

For the avoidance of doubt, tZKP, tzZKP, tPRP, test MATIC, and any other tokens mentioned in this announcement or within the product are for testing purposes only and have no economic value, nor can they be exchanged for value. 

Participation on our incentivized Testnet versions may result in you earning rewards, but such credits are not represented on any blockchain as tokens.

Friday, 07. June 2024

Epicenter Podcast

EthStaker: Ethereum Staking Wars - Nixorokish & Superphiz

In a world where a handful of (centralised) entities hold the majority of staked Ethereum (either directly or via delegation), the network’s decentralisation and security might be in peril. While there are multiple proposals being discussed to address this pressing matter, from reducing issuance after a certain breakpoint (~30% of total ETH being staked) to introducing more severe slashing for col

In a world where a handful of (centralised) entities hold the majority of staked Ethereum (either directly or via delegation), the network’s decentralisation and security might be in peril. While there are multiple proposals being discussed to address this pressing matter, from reducing issuance after a certain breakpoint (~30% of total ETH being staked) to introducing more severe slashing for colluding entities, education and guidance remain paramount. EthStaker is a community led project whose goal is to guide, educate, support and offer resources for stakers, in order to lower the barrier of entry for solo stakers.

We were joined by Superphiz & Nixorokish, to discuss the current Ethereum staking landscape and what challenges need to be addressed to ensure the network’s decentralisation and security.

Topics covered in this episode:

Superphiz’ & Nixorokish’ backgrounds
Ethereum staking evolution
Staking options
Liquid staking
Centralization risks
Incentivizing solo staking
Lowering ETH issuance proposal & network security
Solo stakers
What a healthy staking ecosystem looks like
The importance of client diversity
Socio-political collusion
Misc.

Episode links:

Superphiz on Twitter
Nixorokish on Twitter
EthStaker on Twitter
EthStaker

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Friederike Ernst.

Thursday, 06. June 2024

a16z Podcast

California's Senate Bill 1047: What You Need to Know

On May 21, the California Senate passed bill 1047. This bill – which sets out to regulate AI at the model level – wasn’t garnering much attention, until it slid through an overwhelming bipartisan vote of 32 to 1 and is now queued for an assembly vote in August that would cement it into law. In this episode, a16z General Partner Anjney Midha and Venture Editor Derrick Harris breakdown everything t

On May 21, the California Senate passed bill 1047.

This bill – which sets out to regulate AI at the model level – wasn’t garnering much attention until it slid through an overwhelming bipartisan vote of 32 to 1 and is now queued for an assembly vote in August that would cement it into law. In this episode, a16z General Partner Anjney Midha and Venture Editor Derrick Harris break down everything the tech community needs to know about SB-1047.

This bill really is the tip of the iceberg, with over 600 new pieces of AI legislation swirling in the United States. So if you care about one of the most important technologies of our generation and America’s ability to continue leading the charge here, we encourage you to read the bill and spread the word.

Read the bill: https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB1047


Horizen - Blog

Mainnet Node Software Upgrade: ZEN 5.0.3 is Available to Download

The new version ZEN 5.0.3 is available to download on GitHub and via Docker. The post Mainnet Node Software Upgrade: ZEN 5.0.3 is Available to Download appeared first on Horizen Blog.

The new version ZEN 5.0.3 is available to download on GitHub and via Docker.

Download ZEN 5.0.3 Now
ZEN 5.0.3 is an official maintenance release for Mainnet and Testnet
ZEN 5.0.3 will not perform any network upgrade on Mainnet via hard fork
Nodes running on Mainnet and Public Testnet can be updated with this version, but the upgrade is not mandatory
See release notes here
Performance improvement (60x) on the listunspent RPC implementation
listunspent RPC implementation: results ordered by destination address (previously ordered by transaction hash)

Please let us know if you have any questions or need further support by contacting us on our Discord.

The post Mainnet Node Software Upgrade: ZEN 5.0.3 is Available to Download appeared first on Horizen Blog.

Wednesday, 05. June 2024

Horizen - Blog

EON 1.4 – Version Changes and Update Instructions

Overview This document will outline the various changes that were introduced with the 1.4 update. It will provide a list of areas you should be aware of and also instructions on how to update your forger with new capabilities. In version 1.4, we’ve improved the usability of the staking system at the protocol level by […] The post EON 1.4 – Version Changes and Update Instructions appeared firs

Overview

This document will outline the various changes that were introduced with the 1.4 update. It will provide a list of areas you should be aware of and also instructions on how to update your forger with new capabilities.

In version 1.4, we’ve improved the usability of the staking system at the protocol level by automating the reward distribution between the forger hosting provider and the ZEN holders who stake their ZEN to the forger node. Forgers are now able to designate a smart contract that will allow delegators to claim rewards. The percentage of the forger’s income redirected to the smart contract will be declared by each forger when first registering.

Horizen will provide a factory contract that can be used to distribute rewards to forgers, although any contract can be used. The smart contract provided will distribute rewards proportionally to delegators according to the amount they stake.

Version 1.4 also introduces a minimum staking requirement (10 ZEN) for forgers, in line with security best practices.

It’s important to note that no changes need to be made by existing forger nodes: 100% of the forger income will continue to be sent to the reward address specified by the forger, as it does currently. Alternatively, you can designate a smart contract for reward distribution (instructions provided within this document).

The EON 1.4 hardfork will happen on Thursday, June 27th at Epoch 1593 (16:55 UTC).

EON 1.4 mainnet release tag: https://github.com/HorizenOfficial/eon/releases/tag/1.4.0
Docker Compose updates: https://github.com/HorizenOfficial/compose-evm-simplified/releases/tag/1.4.0
Horizen Reward Factory contract address: https://eon-explorer.horizenlabs.io/address/0x8604Bb903B7D54F666bA1e75f98045345C63132a (instructions for interacting with the factory contract noted below)

Changes in Stakes Management

Please review the following information to understand the changes that occurred with this update. This is important for those of you running forger nodes as some functionality has changed.

A new native smart contract will be available to manage stake-related operations. It will be available at the address 0x0000000000000000000022222222222222222333. Full documentation of its interface can be found here.

The old smart contract exposed at address 0x0000000000000000000022222222222222222222 will be deprecated and all its methods will error if called after the 1.4 hardfork.

The below table summarizes the most important differences.

Detail: Changes Before <> After

Operation: Declare a new forger
EON < 1.4: Not needed (just create a stake with a standard stake operation)
EON 1.4: New endpoint /transaction/registerForger, with a 10 ZEN minimum

Operation: Add stake (via smart contract call)
EON < 1.4: delegate method on the old stake contract; the stake owner (the address that can unstake it) can be any address
EON 1.4: delegate method on the new stake contract; the stake owner is always the tx sender

Operation: Add stake (via http endpoints)
EON < 1.4: transaction/makeForgerStake
EON 1.4: Not available

Operation: Withdraw stake (via smart contract call)
EON < 1.4: withdraw on the old stake contract; stakeId and an additional signature needed; partial withdraw not possible
EON 1.4: withdraw on the new stake contract; no stakeId and no additional signature needed; partial withdraw possible

Operation: Withdraw stake (via http endpoints)
EON < 1.4: transaction/spendForgingStake
EON 1.4: Not available

Operation: Stakes belonging to a delegator address
EON < 1.4: getPagedForgersStakesByUser on the old stake contract (the returned list has one element for each stakeId)
EON 1.4: getPagedForgersStakesByDelegator on the new stake contract (the returned list has one element for each forger, with the sum of the stakes)

Operation: Stakes owned or delegated to a specific forger
EON < 1.4: Not available (it was possible to retrieve the whole list with getPagedForgersStakes and then filter)
EON 1.4: getPagedForgersStakesByForger on the new stake contract

Operation: Total amount staked
EON < 1.4: stakeOf on the old stake contract; returns only the current value and only by delegator
EON 1.4: stakeTotal on the new stake contract; possible to query historical data for previous epochs, and filter by delegator and/or forger

Operation: Stake lottery
EON < 1.4: Forger participates in the lottery with any amount
EON 1.4: Forger participates in the lottery only if owned + delegated stake >= 10 ZEN

Operation: List of registered forgers
EON < 1.4: Not available
EON 1.4: getPagedForgers on the new stake contract


Explorer Changes

EON Explorer will be updated to a new version on Tuesday, June 25.

Breaking changes:
The current API will be deprecated on the current URL. Please update the API URL to https://eon-explorer-api.horizenlabs.io/api. Alternatively, the V2 API can be used. See specifications in the V2 API docs: https://eon-explorer.horizenlabs.io/api-docs

New features:
Display ABI for the Forger Stake V2 contract
Display a list of “Mainchain Rewards Distribution” from 0x0000000000000000000033333333333333333333
V2 API docs: https://eon-explorer.horizenlabs.io/api-docs

New APIs for forward transfers and fee payments:
/blocks/{block_number_or_hash}/forward-transfers
/blocks/{block_number_or_hash}/fee-payments
/addresses/{address_hash}/forward-transfers
/addresses/{address_hash}/fee-payments
/forward-transfers
/fee-payments

Deploying the Factory Contract to EON for Distribution

Horizen provides a default implementation of a smart contract that handles the redistribution of rewards among a forger’s delegators.

If you want to deploy one for your forger:

1. Go to the factory contract available on the explorer: https://eon-explorer.horizenlabs.io/address/0x8604Bb903B7D54F666bA1e75f98045345C63132a
2. Select the Contract tab.
3. Select the Write Contract tab.
4. Connect your wallet.
5. Use the deployDelegatedStakingReferenceImplementation method of the factory. The required parameters that identify your forger are signPubKey and vrfKey. Note that the vrfKey is split between two parameters, one for the first 32 bytes and another for the last byte, because the vrfKey is 33 bytes of data.
6. Once the parameters are entered, click the Write button. The method execution will trigger the deployment of the smart contract instance. Take note of its address: you will need it in the forger registration step (see the programmatic sketch below for an alternative to the explorer UI).
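
For operators who prefer a script over the explorer UI, the same factory call could look roughly like the sketch below, using ethers. The method name and factory address come from this post; the parameter names, their ordering, and the 32-byte/1-byte split of the VRF key are assumptions inferred from this post's description, so verify them against the contract ABI before using this.

```typescript
// Hypothetical sketch of deploying the delegated-staking reference contract
// via the factory, instead of the explorer's Write Contract tab.
// Parameter names and ABI fragment are assumptions; verify against the real ABI.
import { ethers } from "ethers";

const FACTORY_ADDRESS = "0x8604Bb903B7D54F666bA1e75f98045345C63132a";
const FACTORY_ABI = [
  // Assumed fragment: signing pubkey, first 32 bytes of VRF key, last byte of VRF key.
  "function deployDelegatedStakingReferenceImplementation(bytes32 signPubKey, bytes32 vrf1, bytes1 vrf2)",
];

async function deployRewardContract(rpcUrl: string, privateKey: string, signPubKey: string, vrfKey: string) {
  const provider = new ethers.JsonRpcProvider(rpcUrl);
  const wallet = new ethers.Wallet(privateKey, provider);
  const factory = new ethers.Contract(FACTORY_ADDRESS, FACTORY_ABI, wallet);

  // The 33-byte VRF key is split: first 32 bytes and the trailing byte.
  const hex = vrfKey.replace(/^0x/, "");
  const vrf1 = "0x" + hex.slice(0, 64);
  const vrf2 = "0x" + hex.slice(64, 66);

  const tx = await factory.deployDelegatedStakingReferenceImplementation(signPubKey, vrf1, vrf2);
  const receipt = await tx.wait();
  // How the deployed contract address is surfaced (event or return value)
  // depends on the factory implementation; inspect the receipt logs.
  console.log("Deployment tx mined in block", receipt?.blockNumber);
}
```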

For further info, you can read the full tutorial on how to setup a forger node here: https://docs.horizen.io/horizen_eon/tutorials/forger_node_setup_guide

If you want to look at the code of the delegated staking contract, it is available at this address: https://github.com/HorizenOfficial/eon-delegated-staking

This factory contract was audited by Halborn for security best practices.

Forger Registration and Setting Smart Contract for Distribution of Rewards

The instructions below are all steps required when setting up a new forger node. If you already have a node, you only need to (optionally) follow Step 2 to assign an EON smart contract for reward distribution.

Step 1: Registration of  Forgers

Each forger must now register before being able to participate in the lottery and receive stake delegations. This is an additional step compared to the current flow (until now, creating an initial stake was enough).

The registration can be performed by calling  a new http endpoint on the forger node:

/transaction/registerForger

The method will accept the following parameters:

signPubKey
vrfPubKey
rewardShare
smartContractAddress
stakedAmount

(see above link for further info)

Please note that rewardShare and smartContractAddress, once set, are considered immutable and can’t be changed later (to use different values, a forger would have to register new keys and ask the delegators to unstake and re-stake to the new key).
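
As a hedged illustration, a registration call against a forger node's local HTTP API might look like the following sketch. Only the endpoint path and parameter names are taken from this post; the base URL, JSON body shape, units for stakedAmount, and any authentication the node requires are assumptions, so consult the endpoint documentation linked above before relying on this.

```typescript
// Hypothetical sketch of calling the forger node's registration endpoint.
// Base URL, body shape, and units are assumptions; see the official docs.
interface RegisterForgerRequest {
  signPubKey: string;
  vrfPubKey: string;
  rewardShare: number;          // percentage or basis points -- check the docs
  smartContractAddress: string; // reward-distribution contract (or empty if none)
  stakedAmount: string;         // at least the 10 ZEN minimum, in the node's expected unit
}

async function registerForger(nodeUrl: string, req: RegisterForgerRequest): Promise<unknown> {
  const res = await fetch(`${nodeUrl}/transaction/registerForger`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`registerForger failed: ${res.status}`);
  return res.json();
}
```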

Step 2: Update of a Forger Already Present Before EON 1.4

Forgers already present before EON 1.4 will not require the explicit registration described in Step 1: they will be automatically migrated and registered.

But please note that if they own a total stake (own + delegated) < 10 ZEN, they will no longer be able to forge any blocks after the 1.4 hardfork activation; in that case, you will need to increase the stake to 10 ZEN or more.

All migrated forgers will have the fields rewardShare = 0 and smartContractAddress = none, meaning they will not redirect any percentage of the reward to an additional smart contract (the same behavior as in older EON versions).

A method to update these fields is available in the http endpoints:

/transaction/updateForger

The method will accept the following parameters:

signPubKey
vrfPubKey
rewardShare
rewardAddress

(see above link for further info)

Please note this method can be called only once, and only if the current values are (rewardShare = 0, smartContractAddress = null). In other words, rewardShare and smartContractAddress, once set, are considered immutable and can’t be changed (to use different values, a forger would have to register new keys and ask the delegators to unstake and re-stake to the new key).

Important: for security reasons, the updateForger operation will only be possible starting two consensus epochs after the hardfork activation (25 hours).

Step 3: Collect Stakes

Similar to the current smart contract, the new native smart contract will expose methods to delegate stake to a forger and to withdraw it.

delegate(bytes32 signPubKey, bytes32 vrf1, bytes1 vrf2)

The method will fail if the forger was not previously registered (remember forgers existing before the hard fork activation will be considered registered by default).

withdraw(bytes32 signPubKey, bytes32 vrf1, bytes1 vrf2, uint256 amount)

The owner of the stake must be the one sending the tx.

Enough stake must exist, but it is now also possible to partially withdraw the stake.

The unstaked amount will be sent back to the sender’s balance.
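
To make the two calls concrete, here is a hedged TypeScript sketch against the new native contract, using the signatures quoted above. The contract address and method signatures come from this post; whether delegate is payable (with the staked ZEN passed as the transaction value) and the exact unit handling are assumptions, so check the native contract documentation before use.

```typescript
// Sketch of delegating to and withdrawing from the new native stake contract.
// Assumes delegate() is payable (stake sent as tx value); verify in the docs.
import { ethers } from "ethers";

const STAKE_V2_ADDRESS = "0x0000000000000000000022222222222222222333";
const STAKE_V2_ABI = [
  "function delegate(bytes32 signPubKey, bytes32 vrf1, bytes1 vrf2) payable",
  "function withdraw(bytes32 signPubKey, bytes32 vrf1, bytes1 vrf2, uint256 amount)",
];

async function delegateAndWithdraw(rpcUrl: string, privateKey: string, signPubKey: string, vrf1: string, vrf2: string) {
  const provider = new ethers.JsonRpcProvider(rpcUrl);
  const wallet = new ethers.Wallet(privateKey, provider);
  const stake = new ethers.Contract(STAKE_V2_ADDRESS, STAKE_V2_ABI, wallet);

  // Delegate 10 ZEN to the forger identified by (signPubKey, vrf1, vrf2).
  const delegateTx = await stake.delegate(signPubKey, vrf1, vrf2, { value: ethers.parseEther("10") });
  await delegateTx.wait();

  // Partially withdraw 2 ZEN later; the funds return to the sender's balance.
  const withdrawTx = await stake.withdraw(signPubKey, vrf1, vrf2, ethers.parseEther("2"));
  await withdrawTx.wait();
}
```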

Please note that with the new system the stake handling is simplified, since:

There is no longer a concept of “stakeId”. A stake is now just an amount associated with a specific forger and delegator address, which can be incremented or decremented over time with multiple delegate and withdraw operations.
No additional signatures other than the standard Ethereum-like transaction signature are required to call the delegate and withdraw methods: this allows them to be called easily from hardware wallets like Ledger as well.

Step 4: Reward Calculation

The same algorithm will be applied at the end of each withdrawal epoch to reward forgers, but the amount previously sent to the forger address will be split into two parts if the rewardShare specified by the forger is > 0.

An additional change has also been implemented in the following:

Endpoint /block/getFeePayments
RPC endpoint zen_getFeePayments

Their result will keep the same format as now, but will also include the reward paid to the address of the smart contract (if defined).

We will also detail the amount coming from the mainchain redistribution: to remain backward compatible, these amounts will be reported in the additional fields valueFromMainchain and valueFromFees.
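
As a purely illustrative sketch of what a backward-compatible result entry might look like, the TypeScript types below keep the existing per-address fields and add the two new ones. Only the field names valueFromMainchain and valueFromFees are taken from this post; everything else about the shape (surrounding structure, value encoding, optional smart-contract entry) is an assumption, so treat this only as a reading aid.

```typescript
// Illustrative only: an assumed shape for a fee-payments entry after EON 1.4.
// Field names valueFromMainchain / valueFromFees come from the announcement;
// the rest of the structure is hypothetical.
interface FeePaymentEntry {
  address: string;              // forger reward address or reward smart contract
  value: string;                // total paid, same format as before
  valueFromMainchain?: string;  // new: portion coming from mainchain redistribution
  valueFromFees?: string;       // new: portion coming from transaction fees
}

type FeePaymentsResult = FeePaymentEntry[];
```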

Step 5: Claim of Delegators Rewards

The amounts directed to the smart contract will have to be redistributed among delegators through the contract. Distribution is the responsibility of the contract implementation. Horizen’s contract provides a fair distribution based on the amount staked and epoch of staking. The default smart contract implementation provided by Horizen exposes a claim method to do it: 

claimReward(address owner)

Note that anyone could call the claim method on behalf of the owner (no restrictions on the sender of the transaction). 

Documentation on the factory contract is located here: https://github.com/HorizenOfficial/eon/blob/development/doc/howto/delegatedstakingcontract.md

Any forger is free to deploy its own smart contract. Please read the native smart contract documentation for useful methods that could be used inside the smart contract logic.

Reward from Mainchain – New Rules

The maximum ZEN amount redistributed to forgers from the special address 0x0000000000000000000033333333333333333333 in a single withdrawal epoch is now limited to the value expressed by the following formula:

MAX_VALUE_REDISTRIBUTED = sum [10% of the mainchain block-reward coinbase, for each mainchain block reference included in the withdrawal epoch]

Funds over the limit will stay in the address balance and will be redistributed in the following epochs.

For example:
Current mainchain block reward: 6.25 ZEN
Number of mainchain block references in a withdrawal epoch: 100
MAX_VALUE_REDISTRIBUTED = 10% * 6.25 * 100 = 62.5 ZEN
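
The cap from the formula above is straightforward to compute; the sketch below simply restates the worked example in code, assuming a constant block reward across the epoch, as in the example.

```typescript
// Cap on ZEN redistributed from the 0x...3333 address in one withdrawal epoch,
// assuming a constant mainchain block reward across the epoch (as in the example).
function maxValueRedistributed(blockRewardZen: number, mainchainBlockRefs: number): number {
  return 0.10 * blockRewardZen * mainchainBlockRefs;
}

console.log(maxValueRedistributed(6.25, 100)); // 62.5 ZEN
```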

Release Notes

Additional info on other changes introduced in EON 1.4 can be found in the release notes for EON 1.4 and the SDK, respectively:

https://github.com/HorizenOfficial/eon/blob/1.4.0-RC1/doc/release/1.4.0.md
https://github.com/HorizenOfficial/Sidechains-SDK/blob/dev/doc/release/0.12.0.md

The post EON 1.4 – Version Changes and Update Instructions appeared first on Horizen Blog.

Monday, 03. June 2024

Shade Protocol

June Syracuse v0.1 Shade Protocol Upgrade

Greetings community, In this blog post, we’re excited to present a comprehensive list of updates made in May that didn’t make it into the Alexandria update. While these changes may seem minor, we believe in transparency and the importance of highlighting even the smallest tweaks to our application to enhance user experience. First on the list is a fix to the bridge button, addressing instances w

Greetings community,

In this blog post, we’re excited to present a comprehensive list of updates made in May that didn’t make it into the Alexandria update. While these changes may seem minor, we believe in transparency and the importance of highlighting even the smallest tweaks to our application to enhance user experience.

First on the list is a fix to the bridge button, addressing instances where it would mysteriously “gray out” without warning. Additionally, we’ve streamlined the process by adding sSCRT support on the Shade Bridge page, simplifying the management of SCRT and sSCRT balances. Another significant change includes filtering the swap dropdown by user balances, eliminating the need for manual scrolling.

Furthermore, we’ve introduced toast notifications on the Shade bridge page to alert users when they lack sufficient gas for bridging operations. Excitingly, we’ve updated the SILK/SHD circulating supply in anticipation of the June 18th upgrade, alongside migrating the Shade API infrastructure to the latest version of Secret Network following the successful mainnet upgrade!

Other improvements include fixing bugs on the portfolio page and updating data feeds for Quicksilver, Stride, and Persistence oracles in light of the Secret Network upgrade.

While this technical rundown may seem brief, we aim to foster trust within the community by providing insight into the ongoing enhancements of the Syracuse upgrade (v0.1) on its journey towards the Shade Protocol private DeFi suite.

Alexandria Unveiled — Shade v2.1

Conclusion

In conclusion, Syracuse v0.1 gets us one step closer to the launch of Alexandria, a significant milestone in Shade Protocol’s journey towards revolutionizing decentralized finance. Inspired by the legacy of ancient Alexandria and the valor of the Spartans, Shade Protocol stands as a beacon of innovation and financial autonomy in the decentralized landscape.

With its commitment to privacy, ease of use, and continuous improvement through Syracuse upgrades, Shade Protocol is not only shaping the future of DeFi but also empowering individuals to reclaim control over their finances and data. As the community eagerly anticipates the release of Alexandria and beyond, Shade Protocol remains dedicated to its mission of building an unstoppable, decentralized financial ecosystem for all.

Website: app.shadeprotocol.io

Twitter: https://twitter.com/Shade_Protocol


Zaisan

What is NFT Staking?

Featured image: R2-Rewards.com So you’ve got a nice collection of non-fungible tokens (NFTs). After a few months, you have managed to amass a respectable amount of those little digital wonders. You might have been able to sell one or two, earning you enough for a couple of cups of coffee in your nearest franchised outlet. […] The post What is NFT Staking? appeared first on Zaisan.

Featured image: R2-Rewards.com

So you’ve got a nice collection of non-fungible tokens (NFTs). After a few months, you have managed to amass a respectable amount of those little digital wonders. You might have been able to sell one or two, earning you enough for a couple of cups of coffee in your nearest franchised outlet. But as you take sips off that hard-earned coffee and browse through your NFT portfolio, you can’t help but wonder about the value hidden there, just waiting to be uncovered. 

The question is, how to unlock the value in those NFTs? 

NFT staking: definition

When they hear the word staking, most people think back to the old Hammer films, when the nefarious Dracula would meet his demise with a wooden stake through the heart. And yes, that is one traditional understanding of the word staking. But it’s 2022 now, and though Count Dracula still roams somewhere down the darkest halls of our dreams, staking now means something altogether more profitable. 

In today’s digital context of NFTs, staking is sort of a game of investment/rewards. You commit (‘lock’) your NFTs into a platform or protocol for a set period, and you receive rewards (or other benefits) in exchange for such a commitment. 

Staking addresses one of NFTs’ inherent issues: liquidity. Remember, NFTs are tokenized assets that you can sell, trade, or, as we see in this article, stake for a yield. Staking lets you earn passive income off your NFT portfolio while retaining ownership.

Fungible vs non-fungible: an explanation

For a long time, the concepts of fungibility and non-fungibility had been well outside the layman’s lexicon. But once NFTs entered public discourse, people began to learn more about what these concepts mean.

In short, fungibility means the possibility of using goods, items, or commodities interchangeably because they have the same value. Banknotes or coins, pieces of the same fruit, or a liter of petrol are good examples of fungibility.

Non-fungibility denotes uniqueness. Collectible items, including original oil paintings, baseball cards, art pieces, diamonds, and vintage cars can be considered non-fungible.

It is the latter concept that emerges as most interesting, as non-fungibility carries value, and it is a value that NFT staking unlocks.

Image: R2-rewards.com

NFT staking: how it works

Unlocking the value of your NFTs through staking is easier than you think. All you need is a crypto wallet and, well, NFTs. The more the better. Then, choose a protocol or a platform to stake your NFTs in, and wait for the rewards, which might be distributed daily, weekly, or in any other fashion. It really is that simple.

NFT staking is akin to yield farming, insofar as you lock a certain amount of wealth (NFTs in this case) and receive a certain percentage of rewards based on the amount staked, the period of time that the NFTs will be locked, and other parameters.

All about staking rewards

When you invest in traditional financial products (bonds, stocks, mutual funds, etc.), you expect your investment to produce a certain yield. Equally, the ultimate purpose of staking your NFTs is to get a return – a reward.

But this raises several questions: what kind of rewards do you get? And who issues those rewards?

If you purchase an investment bond, your reward at the end of the investment term will usually be money, a certain percentage based on the bond’s performance, length of investment, etc. The staking rewards system works similarly. You lock your stake in for a certain amount of time and you get rewards. Some platforms might offer a fixed amount, while others might give users different amounts, depending on how many people are participating, and the total size of the rewards pool.

And who pays these rewards? Well, the platform or protocol where you stake your NFTs. Normally, an algorithm would calculate the due rewards based on specified parameters, and the rewards would be deposited into your account at the agreed intervals. Now, what these rewards are, depends on the protocol or platform. Sometimes, you get the platform’s native tokens, other times you get tokens that you can exchange for fiat, and certain protocols issue tokens that can only be used within their own environment (a gaming platform, for example). Always check the terms and conditions when you sign up so you know exactly what rewards you will get, and how often.

Risk versus Reward

As with most financial endeavors, returns are never guaranteed and staking is not entirely without risk. If you put up collateral, you risk losing it. In the case of a staking system, there are often two things to keep in mind. First of all, in most staking systems you will need to purchase a staking key to get started. Buying the key can be seen as an entrance fee to the platform, and there is often a variety of keys, ranging from cheap and small to big and expensive. Of course, you hope to earn back your investment in the key, but this is never guaranteed.

Another risk is that the staking platform can be hacked and you may lose your NFTs since they’re not in your custody anymore. While the right permissions setup (along with responsible project owners) can drastically reduce this risk, hacks are and will always be a risk for platforms holding considerable amounts of value.

Conclusion

NFT staking is a subset of the broader decentralized finance (DeFi) ecosystem. It is a relatively new practice in the space, but there already are several NFT management companies that offer staking services, including Zaisan, which launched a staking platform in collaboration with R2. While there is an inherent risk to using and creating a staking platform, this risk can be minimised by choosing the right partners and using secure setups.

All in all, staking is just another cool utility of NFTs, which are proving to be rather versatile assets that break new ground almost every day.

The post What is NFT Staking? appeared first on Zaisan.

Thursday, 30. May 2024

Epicenter Podcast

PsyDAO: Decentralising Psychedelic Research - Dima Buterin & Paul Kohlhaas

One could argue that psychedelic research has experienced a similar journey as crypto, in the sense that decentralised communities bound together to fund research where centralised entities refuse to do so. One of the greatest hurdles in modern science is access to research funding, and DeSci aims to provide a solution through decentralised crowdfunding. In addition, IP rights are incontestably at

One could argue that psychedelic research has experienced a similar journey as crypto, in the sense that decentralised communities bound together to fund research where centralised entities refuse to do so. One of the greatest hurdles in modern science is access to research funding, and DeSci aims to provide a solution through decentralised crowdfunding. In addition, IP rights are incontestably attributed with the help of blockchain technology, as they cannot be altered at a later date. However, as the human mind is unique and moulded by each individual’s life story, establishing a causal relationship between consumption and its beneficial or detrimental effects can be heavily biased.

We were joined by Dima Buterin & Paul Kohlhaas, to discuss the vast subject of psychedelics (especially in relationship to trauma), how research has progressed and how decentralisation pushes the field forward (DeSci).

Topics covered in this episode:

Dima’s & Paul’s backgrounds
Discovering psychedelics
Dealing with trauma
Psychedelics throughout history and societies
How Molecule DAO was founded
DeSci & on-chain IP
Ethereum early days, causality & rationality
PsyDAO
Network states
Misc

Episode links:

Dima Buterin on Twitter
Paul Kohlhaas on Twitter
Molecule DAO on Twitter
Psy DAO on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Brian Fabian Crain.


Circle Press

Chainlink & Circle Expand Enterprise & Developer DeFi Engagement

Chainlink and Circle are collaborating to expand developer usage of USDC and EURC with the Chainlink platform’s industry-standard services for tokenized assets and Circle’s developer platform

Tuesday, 28. May 2024

RadicalxChange(s)

Frank McCourt: Founder of Project Liberty (Part I)

Today, in Part I of a two-episode conversation, Matt Prewitt is joined by civic entrepreneur and Founder of Project Liberty, Frank McCourt, who is on a mission to reclaim the internet and prioritize human rights in our digital landscape. Drawing parallels between the early public oversight of television and the current state of the internet, Frank highlights the commodification of our data and ide

Today, in Part I of a two-episode conversation, Matt Prewitt is joined by civic entrepreneur and Founder of Project Liberty, Frank McCourt, who is on a mission to reclaim the internet and prioritize human rights in our digital landscape. Drawing parallels between the early public oversight of television and the current state of the internet, Frank highlights the commodification of our data and identities online. He advocates for new protocols and a movement inspired by historical fights against oppression to secure genuine data rights and agency online. As we look to the future, Project Liberty's endeavors may play a crucial role. This interview is a fantastic opportunity to hear more about Frank's thinking.

Links & References: 

References:

Our Biggest Fight: Reclaiming Liberty, Humanity, and Dignity in the Digital Age by Frank H. McCourt, Jr. with Michael J. Casey
Tim Berners-Lee - Wikipedia
FACT SHEET: CHIPS and Science Act Will Lower Costs, Create Jobs, Strengthen Supply Chains, and Counter China | The White House
Mythbusting: The Facts On Reports About Our Data Collection Practices | TikTok Newsroom
Sesame Workshop - Wikipedia
GDPR
The Digital Markets Act: ensuring fair and open digital markets - European Commission
The EU’s Digital Services Act
TCP/IP | Internet protocol suite - Wikipedia
HTTP - Wikipedia
Distributed Social Networking Protocol - Wikipedia
Technology | Project Liberty
Common Sense - Wikipedia

Bios:

Frank H. McCourt, Jr. is a civic entrepreneur and the executive chairman and former CEO of McCourt Global, a private family company committed to building a better future through its work across the real estate, sports, technology, media, and capital investment industries, as well as its significant philanthropic activities. Frank is proud to extend his family’s 130-year legacy of merging community and social impact with financial results, an approach that started when the original McCourt Company was launched in Boston in 1893.

He is a passionate supporter of multiple academic, civic, and cultural institutions and initiatives. He is the founder and executive chairman of Project Liberty, a far-reaching, $500 million initiative to transform the internet through a new, equitable technology infrastructure and rebuild social media in a way that enables users to own and control their personal data. The project includes the development of a groundbreaking, open-source internet protocol called the Decentralized Social Networking Protocol (DSNP), which will be owned by the public to serve as a new web infrastructure. It also includes the creation of Project Liberty’s Institute (formerly The McCourt Institute), launched with founding partners Georgetown University in Washington, D.C., Stanford University in Palo Alto, CA, and Sciences Po in Paris, to advance research, bring together technologists and social scientists, and develop a governance model for the internet’s next era.

Frank has served on Georgetown University’s Board of Directors for many years and, in 2013, made a $100 million founding investment to create Georgetown University’s McCourt School of Public Policy. He expanded on this in 2021 with a $100 million investment to catalyze an inclusive pipeline of public policy leaders and put the school on a path to becoming tuition-free.

In 2024, Frank released his first book, OUR BIGGEST FIGHT: Reclaiming Liberty, Humanity, and Dignity in the Digital Age.

Frank’s Social Links:

Project Liberty
Project Liberty (@pro_jectliberty) / X
Project Liberty (@pro_jectliberty) • Instagram
McCourt Institute (@McCourtInst) / X

Matt Prewitt (he/him) is a lawyer, technologist, and writer. He is the President of the RadicalxChange Foundation.

Matt’s Social Links:

ᴍᴀᴛᴛ ᴘʀᴇᴡɪᴛᴛ (@m_t_prewitt) / X

Connect with RadicalxChange Foundation:

RadicalxChange Website
@RadxChange | Twitter
RxC | YouTube
RxC | Instagram
RxC | LinkedIn
Join the conversation on Discord.

Credits:

Produced by G. Angela Corpus. Co-Produced, Edited, Narrated, and Audio Engineered by Aaron Benavides. Executive Produced by G. Angela Corpus and Matt Prewitt. Intro/Outro music: “Wind in the Willows” by MagnusMoone, licensed under an Attribution-NonCommercial-ShareAlike 3.0 International License (CC BY-NC-SA 3.0).


Zaisan

How to Build an NFT Community

NFT Community management can help ensure the smooth operation of non-fungible token (NFT) communities and their future development in Web3. The idea is to keep track of community events and developments while keeping the community organised and cohesive toward the project goal. Community management can help to resolve disputes and to show empathy to digital […] The post How to Build an NFT Commu

NFT community management can help ensure the smooth operation of non-fungible token (NFT) communities and their future development in Web3. The idea is to keep track of community events and developments while keeping the community organised and cohesive toward the project goal. Community management can help to resolve disputes and to show empathy to digital asset owners. After all, NFT holders are, in essence, shareholders of the project. NFT community management is becoming increasingly important as Web3 communities grow, and with a strong community, anything is possible.

Quick refresh: what is an NFT (community)?

The management of NFT communities is a critical but often overlooked aspect of developing and deploying these digital assets. NFTs are unique tokens on a blockchain that can represent ownership of digital or physical assets. In contrast to fungible tokens, which are interchangeable and can be divided into smaller units, NFTs are non-divisible. Due to their rarity or uniqueness, they are ideal for representing items like art, collectibles, or in-game items with monetary value.

NFT communities are flourishing online spaces where people with a shared interest in NFTs can connect with one another. As the popularity of NFTs continues to grow, so does the number of people flocking to these communities in search of information, advice, and support.

Why NFT community management is essential

With a wide range of people holding different levels of knowledge and expertise, NFT community management can be a challenge. It is important to provide content that is both informative and accessible to all, while avoiding overwhelming newcomers with too much information.

While the technical aspects of creating and deploying NFTs are essential, it is equally important to have a well-managed community around the project. This is because NFTs are often bought and sold in online communities. The success of an NFT project depends on the strength of these communities. For example, an NFT gaming organisation needs players to enjoy the game enough to transact in-game assets with one another.

A good NFT community will contain a variety of content that covers the basics of what NFTs are and how they work. Additionally, it includes more in-depth articles and discussions for those with a more advanced understanding. It is also essential to have a protocol for moderating content and managing users to keep the community respectful and welcoming to all. One way is to post rules and best practices on communication channels, or to create a tutorial page for newcomers.

With the right mix of content and community management, NFT communities can be a valuable resource for anyone interested in this exciting technology. It can be a fun entry point to the Web3 world!

3 Tips to create a strong foundation for your NFT Community

There are a few things that project owners should keep in mind when managing their NFT communities.

1. Constantly remind people of your vision

Have a clear vision and communicate constantly with your members. This way, you can keep everyone on the same page and working towards the same goals. Let’s take, for example, an NFT online chess game that allows users to interact with in-game assets and transact them. This creates a Play-to-Earn (P2E) model. The NFT game envisions creating a community where chess players can enjoy the game while earning rewards in the form of digital assets, NFTs and cryptocurrency. A weekly content plan should therefore include the benefits of, and positive facts about, NFT online chess, to create rapport and build trust in the project as more players join.

2. Active moderation is key

The project ambassadors should be active in the community and should engage with community members regularly. This essentially means getting more experienced people in the community to become moderators. They should be the ones zealously advocating for the project. However, a community manager is still needed to push content forward and stay attentive to market needs.

3. Get people involved in decision-making

Finally, it is vital to have a community-centred approach to decision-making. This means that the community should be involved in decisions about the project, such as what features to add or what direction to take the project in. This gives the project a democratic system: the funds spent on NFTs are directed to project development from a shared decision-making perspective. In practice, this can be done through a voting system or members’ livestream meetings with the project owner.

The post How to Build an NFT Community appeared first on Zaisan.

Monday, 27. May 2024

a16z Podcast

The GenAI 100: The Apps that Stick

Consumer AI is moving fast, so who's leading the charge?  a16z Consumer Partners Olivia Moore and Bryan Kim discuss our GenAI 100 list and what it takes for an AI model to stand out and dominate the market. They discuss how these cutting-edge apps are connecting with their users and debate whether traditional strategies like paid acquisition and network effects are still effective. We're go

Consumer AI is moving fast, so who's leading the charge? 

a16z Consumer Partners Olivia Moore and Bryan Kim discuss our GenAI 100 list and what it takes for an AI model to stand out and dominate the market.

They discuss how these cutting-edge apps are connecting with their users and debate whether traditional strategies like paid acquisition and network effects are still effective. We're going beyond rankings to explore pivotal benchmarks like D7 retention and introduce metrics that define today's AI market.

Note: This episode was recorded prior to OpenAI's Spring update. Catch our latest insights in the previous episode to stay ahead!

 

Resources:

Link to the Gen AI 100: https://a16z.com/100-gen-ai-apps

Find Bryan on Twitter: https://twitter.com/kirbyman01

Find Olivia on Twitter: https://x.com/omooretweets

 

Stay Updated: 

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Friday, 24. May 2024

Epicenter Podcast

Crypto ETFs: Trojan Horse or Big Win? - Austin Griffith, Mona El Isa, Peter Van Valkenburgh

The approval of Ethereum spot ETFs sent shockwaves through the industry as policymakers pivoted abruptly from threatening to veto FIT21 bill, to a pro-crypto discourse. One could say their hand was forced by the imminent US elections, but Ethereum is now officially classified as a commodity, nonetheless. As the regulatory hurdle seems, for the time being, surpassed, one should not forget the value

The approval of Ethereum spot ETFs sent shockwaves through the industry as policymakers pivoted abruptly from threatening to veto the FIT21 bill to a pro-crypto discourse. One could say their hand was forced by the imminent US elections, but Ethereum is now officially classified as a commodity, nonetheless. With the regulatory hurdle seemingly cleared for the time being, one should not forget the values promoted by the crypto movement from the get-go: decentralisation and permissionlessness. However, everyday users tend to overlook these aspects in favour of a more streamlined user experience. As technology evolves and scaling solutions mature, better UI & UX represent crucial goals in the race for end user adoption.

Topics covered in this episode:

Crypto regulations: FIT21 & Tornado Cash trial
Ethereum spot ETF
Is Ethereum a commodity or a security?
Scaling solutions & end user adoption
The importance of decentralisation
How Web3 lowers barriers of (permissionless) competition
Regulatory and legal hurdles
Educating ‘normies’ & UX
Technology: depoliticising money
Berlin blockchain week & Dappcon 2024 wrap-up

Episode links:

Austin Griffith on Twitter
Mona El Isa on Twitter
Peter Van Valkenburgh on Twitter

Sponsors:

Gnosis: Gnosis builds decentralized infrastructure for the Ethereum ecosystem, since 2015. This year marks the launch of Gnosis Pay, the world's first Decentralized Payment Network. Get started today at gnosis.io

Chorus1: Chorus1 is one of the largest node operators worldwide, supporting more than 100,000 delegators across 45 networks. The recently launched OPUS allows staking up to 8,000 ETH in a single transaction. Enjoy the highest yields and institutional-grade security at chorus.one

This episode is hosted by Friederike Ernst and Brian Fabian Crain.


Zcash

A proposal for the next Zcash Dev Fund

Electric Coin Co. (ECC) has posted a ZIP for consideration in ongoing discussions about the Zcash Development Fund. Our CEO, Josh Swihart, published it today on the Zcash Community Forum, […] Source
Electric Coin Co. (ECC) has posted a ZIP for consideration in ongoing discussions about the Zcash Development Fund. Our CEO, Josh Swihart, published it today on the Zcash Community Forum, […]

Source

Wednesday, 22. May 2024

a16z Podcast

Finding a Single Source of AI Truth With Marty Chavez From Sixth Street

a16z General Partner David Haber talks with Marty Chavez, vice chairman and partner at Sixth Street Partners, about the foundational role he’s had in merging technology and finance throughout his career, and the magical promises and regulatory pitfalls of AI. This episode is taken from “In the Vault”, a new audio podcast series by the a16z Fintech team. Each episode features the most influential

a16z General Partner David Haber talks with Marty Chavez, vice chairman and partner at Sixth Street Partners, about the foundational role he’s had in merging technology and finance throughout his career, and the magical promises and regulatory pitfalls of AI.

This episode is taken from “In the Vault”, a new audio podcast series by the a16z Fintech team. Each episode features the most influential figures in financial services to explore key trends impacting the industry and the pressing innovations that will shape our future. 

 

Resources: 
Listen to more of In the Vault: https://a16z.com/podcasts/a16z-live

Find Marty on X: https://twitter.com/rmartinchavez

Find David on X: https://twitter.com/dhaber

 

Stay Updated: 

Find a16z on Twitter: https://twitter.com/a16z

Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

Subscribe on your favorite podcast app: https://a16z.simplecast.com/

Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Monday, 20. May 2024

Shade Protocol

Shade Spartan Ethos Introduction

The Spartan Creed In the digital realm, where Spartans roam, We rise, united, to defend our own. With shields of privacy, and spears of autonomy, We march forth, in steadfast sovereignty. Guardians of data, warriors of finance, In every byte, we take a stance. Our information is not for sale, Our sovereignty is not for sale Our dignity is not for sale
The Spartan Creed
In the digital realm, where Spartans roam,
We rise, united, to defend our own.
With shields of privacy, and spears of autonomy,
We march forth, in steadfast sovereignty.
Guardians of data, warriors of finance,
In every byte, we take a stance.
Our information is not for sale,
Our sovereignty is not for sale
Our dignity is not for sale
Our autonomy is not for sale
Our property rights are not for sale
Our souls are not for sale
For we are Spartans, hearts aflame,
In the digital realm, we stake our claim.

Greetings Spartans,

The origin of the Shade Spartans is a simple tale: to be a staunch advocate of privacy and freedom inevitably puts you in the crosshairs of those who see our claims to digital sovereignty as a threat to their power. There are those who believe people must be protected at every turn. That security should be prioritized above the human experiment of individualism and experimentation. The powers that be are leaving an entire generation chained to the burden of malicious (not all) regulation. Disdainful pen & paper, stale and unromantic. Uninspired dead documents that stifle human ingenuity, and at worst, diminish it.

We Spartans.

We few, we chose few.

Pushed to the brink, night near extinct.

In the arena of the digital and physical world we lay it all on the line. We deeply value our dignity. We stake our future on the truth that there is no grea