Blockchain Technology, Bitcoin and America 2.0 - Paul Mampilly

Raoul Pal and Michael Saylor's Bitcoin vs Ethereum analysis is deeply flawed, and here is why.

Regarding the Bitcoin vs Ethereum narrative
Allocating capital to Bitcoin but not to Ethereum is a bet that the planned roadmap for Ethereum will not be successfully implemented and/or its economic properties will not function as designed once the final phase of ETH 2.0 goes live. The combination of PoS, sharding and EIP-1559 will allow for a monetary policy that can sustain the system with zero, possibly negative, issuance. Detailed explanations of how this is possible have been documented in numerous interviews and blog posts by developers and pundits. We must also take into consideration that even if the issuance is above zero, the returns from staking Ether must be accounted for when comparing the long-term holding value proposition against something like Bitcoin. If staking rewards provide ~3% annual returns and issuance is ~2%, then the equivalent issuance for a PoW protocol would be ~-1% (something that will never happen in the Bitcoin protocol).
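As a rough illustration of that last point, a holder's effective dilution can be computed by netting the staking yield against protocol issuance. The sketch below is only a back-of-the-envelope model using the hypothetical ~3% yield and ~2% issuance figures from the paragraph above; it is not the actual ETH 2.0 reward formula.

```python
def effective_issuance(issuance_rate: float, staking_yield: float) -> float:
    """Net dilution experienced by a staker: protocol issuance minus staking returns.

    A negative result means a staker's share of the total supply grows over time,
    which is the sense in which this is "equivalent to negative issuance".
    """
    return issuance_rate - staking_yield

# Hypothetical figures from the paragraph above (~2% issuance, ~3% staking yield).
print(round(effective_issuance(0.02, 0.03), 4))  # -> -0.01, i.e. roughly -1% per year
```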
Addressing the claim that Ether is not money
The narrative that Ether is not money because the Ethereum protocol is not designed to exclusively function as money is akin to saying that the Internet is not a good emailing system because it is not exclusively designed to transmit emails. This type of narrative tries to restrict the definition of money by suggesting that its underlying protocol should not have functionality that extends beyond the conventional way we think of it. The reality is that Ethereum is much better suited for a digital economy - Ether is its native monetary asset. The ability to issue other forms of digital assets and execute computer logic in a trustless, unified system with a natively defined monetary asset encompasses all the fundamental building blocks of a future digital economy. This is a future where monetary, financial and information systems can take advantage of the inclusive, permissionless and trustless properties that are central to the Bitcoin value proposition.
The Ethereum protocol is designed to do a lot of wonderful things, but it costs money to operate the network, and that cost must be covered by something of value that can be easily liquidated or exchanged into other things of value... otherwise known as money. The idea that Ether is more akin to oil than gold/money just because the price metric for computations is called "gas" falls apart under scrutiny. Ether is strictly used as a monetary incentive. It is not magically burned to propel a fictitious machine that runs the network... the computers that run the Ethereum network operate under the same physical principles as the ones running Bitcoin - they consume energy and someone has to pay for it. It just so happens that the monetary rewards and transaction costs of operating the Ethereum network are paid exclusively in Ether, and therefore it serves as a monetary base. In addition, Ether has been used as the monetary base for the acquisition of other digital assets during their ICO phase. Lastly, PayPal has revealed it will be including Ether as a means of payment for online merchants. Saying that Ether is not money is like saying the sky isn't blue.
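For context on the "gas" point: what a user ultimately pays is denominated in Ether, because the fee is simply the gas consumed multiplied by a gas price quoted in ETH. A minimal sketch of that arithmetic, using the classic pre-EIP-1559 fee formula and made-up numbers:

```python
GWEI_PER_ETH = 1_000_000_000

def tx_fee_eth(gas_used: int, gas_price_gwei: float) -> float:
    """Fee paid in ETH: gas consumed times the gas price (pre-EIP-1559 formula)."""
    return gas_used * gas_price_gwei / GWEI_PER_ETH

# A simple ETH transfer consumes 21,000 gas; assume a gas price of 50 gwei.
print(tx_fee_eth(21_000, 50))  # -> 0.00105 ETH, paid to the block producer in Ether
```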
Additional thoughts
  1. The combination of staking, EIP-1559 and sharding will allow ETH to reduce issuance ahead of Bitcoin's schedule. It is very likely going to allow for sustainable zero issuance which is something that is still up in the air for Bitcoin.
  2. The switch from PoW to PoS will dramatically reduce the operational cost of the network while incentivizing ownership of Ether. The reduction in operational cost is a huge factor contributing to a sustainable monetary policy.
  3. The true soundness of Ether as a store of wealth needs to account for the returns from staking. That means that even if the nominal issuance remained higher than Bitcoin's, it could still be a better investment when you account for the staking returns.
  4. Ethereum can operate as an entire financial system. It allows for the issuance of new tokens and it can operate autonomously as a digital asset exchange... so that means it can be an exchange for tokenized fiat currencies, cryptocurrencies, tokenized securities and commodities. Think of a global market for stocks, commodities, futures contracts and derivatives.
  5. The integration with digital assets is done natively in one network. Ether serves as the native monetary asset with sound properties. Tokenized bitcoins would not only significantly reduce security (value would be lost if EITHER network is compromised), but would also make little sense if Ethereum's soundness (staking - issuance) is superior to Bitcoin's.
  6. There are a gazillion more use cases for Ethereum that would benefit from having a natively defined monetary asset.
  7. Ultimately Bitcoin might serve as digital gold as a hedge against Ethereum. So they can coexist, but they are still competing with each other in terms of building value. Every investor who is getting into cryptocurrencies should be asking what assets to buy and why. Money allocated to Bitcoin cannot be allocated to Ethereum and vice-versa.
submitted by TheWierdGuy to ethereum [link] [comments]

Bitcoin Newcomers FAQ - Please read!

Welcome to the /Bitcoin Sticky FAQ

You've probably been hearing a lot about Bitcoin recently and are wondering what's the big deal? Most of your questions should be answered by the resources below but if you have additional questions feel free to ask them in the comments.
It all started with the release of Satoshi Nakamoto's whitepaper; however, that will probably go over the heads of most readers, so we recommend the following videos as a good starting point for understanding how bitcoin works and a little about its long-term potential:
Some other great resources include Lopp.net, the Princeton crypto series and James D'Angelo's Bitcoin 101 Blackboard series.
Some excellent writing on Bitcoin's value proposition and future can be found at the Satoshi Nakamoto Institute.
Some Bitcoin statistics can be found here and here. Developer resources can be found here. Peer-reviewed research papers can be found here.
Potential upcoming protocol improvements and scaling resources here and here.
The number of times Bitcoin was declared dead by the media can be found here (LOL!)

Key properties of Bitcoin

Where can I buy bitcoins?

Bitcoin.org and BuyBitcoinWorldwide.com are helpful sites for beginners. You can buy or sell any amount of bitcoin (even just a few dollars worth) and there are several easy methods to purchase bitcoin with cash, credit card or bank transfer. Some of the more popular resources are below, also check out the bitcoinity exchange resources for a larger list of options for purchases.
Here is a listing of local ATMs. If you would like your paycheck automatically converted to bitcoin use Bitwage.
Note: Bitcoins are valued at whatever market price people are willing to pay for them in a balancing act of supply and demand. Unlike traditional markets, bitcoin markets operate 24 hours per day, 365 days per year. Preev is a useful site that shows how much various denominations of bitcoin are worth in different currencies. Alternatively you can just Google "1 bitcoin in (your local currency)".

Securing your bitcoins

With bitcoin you can "Be your own bank" and personally secure your bitcoins OR you can use third party companies aka "Bitcoin banks" which will hold the bitcoins for you.
Note: For increased security, use Two Factor Authentication (2FA) everywhere it is offered, including email!
2FA requires a second confirmation code to access your account making it much harder for thieves to gain access. Google Authenticator and Authy are the two most popular 2FA services, download links are below. Make sure you create backups of your 2FA codes.
Google Auth | Authy | OTP Auth
Android | Android | N/A
iOS | iOS | iOS
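To make the 2FA mechanism concrete, here is a minimal sketch of how a time-based one-time password (TOTP), the scheme used by Google Authenticator and Authy, is generated and checked. It uses the third-party pyotp library purely for illustration; real services handle the shared secret for you when you scan their QR code.

```python
import pyotp

# The service generates a shared secret once and shows it to you as a QR code.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your authenticator app derives a fresh 6-digit code from the secret and the current time.
code = totp.now()

# The service runs the same derivation and compares; codes rotate every ~30 seconds.
print(totp.verify(code))  # -> True
```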

Watch out for scams

As mentioned above, Bitcoin is decentralized, which by definition means there is no official website, Twitter handle, spokesperson or CEO. However, all money attracts thieves. This combination unfortunately results in scammers operating under official-sounding names or pretending to be an authority on YouTube or social media. Many scammers throughout the years have claimed to be the inventor of Bitcoin. Websites like bitcoin(dot)com and the btc subreddit are active scams. Almost all altcoins (shitcoins) are marketed heavily with big promises but are really just designed to separate you from your bitcoin. So be careful: any resource, including all linked in this document, may in the future turn evil. Don't trust, verify. Also, as they say in our community, "Not your keys, not your coins".

Where can I spend bitcoins?

Check out spendabit or bitcoin directory for millions of merchant options. Also you can spend bitcoin anywhere Visa is accepted with bitcoin debit cards such as the CashApp card. Some other useful sites are listed below.
Store | Product
Gyft | Gift cards for hundreds of retailers including Amazon, Target, Walmart, Starbucks, Whole Foods, CVS, Lowes, Home Depot, iTunes, Best Buy, Sears, Kohls, eBay, GameStop, etc.
Spendabit, Overstock and The Bitcoin Directory | Retail shopping with millions of results
ShakePay | Generate one time use Visa cards in seconds
NewEgg and Dell | For all your electronics needs
Bitwa.la, Coinbills, Piixpay, Bitbill.eu, Bylls, Coins.ph, Bitrefill, LivingRoomofSatoshi, Coinsfer, and more | Bill payment
Menufy, Takeaway and Thuisbezorgd NL | Takeout delivered to your door
Expedia, Cheapair, Destinia, Abitsky, SkyTours, the Travel category on Gyft and 9flats | For when you need to get away
Cryptostorm, Mullvad, and PIA | VPN services
Namecheap, Porkbun | Domain name registration
Stampnik | Discounted USPS Priority, Express, First-Class mail postage
Coinmap and AirBitz are helpful to find local businesses accepting bitcoins. A good resource for UK residents is at wheretospendbitcoins.co.uk.
There are also lots of charities which accept bitcoin donations.

Merchant Resources

There are several benefits to accepting bitcoin as a payment option if you are a merchant:
If you are interested in accepting bitcoin as a payment method, there are several options available:

Can I mine bitcoin?

Mining bitcoins can be a fun learning experience, but be aware that you will most likely operate at a loss. Newcomers are often advised to stay away from mining unless they are only interested in it as a hobby, similar to Folding@home. If you want to learn more about mining you can read more here. Still have mining questions? The crew at /BitcoinMining would be happy to help you out.
If you want to contribute to the bitcoin network by hosting the blockchain and propagating transactions you can run a full node using this setup guide. If you would prefer to keep it simple there are several good options. You can view the global node distribution here.

Earning bitcoins

Just like any other form of money, you can also earn bitcoins by being paid to do a job.
Site | Description
WorkingForBitcoins, Bitwage, Cryptogrind, Coinality, Bitgigs, /Jobs4Bitcoins, BitforTip, Rein Project | Freelancing
Lolli | Earn bitcoin when you shop online!
OpenBazaar, Purse.io, Bitify, /Bitmarket, 21 Market | Marketplaces
/GirlsGoneBitcoin | NSFW Adult services
A-ads, Coinzilla.io | Advertising
You can also earn bitcoins by participating as a market maker on JoinMarket by allowing users to perform CoinJoin transactions with your bitcoins for a small fee (requires you to already have some bitcoins).

Bitcoin-Related Projects

The following is a short list of ongoing projects that might be worth taking a look at if you are interested in current development in the bitcoin space.
Project | Description
Lightning Network | Second layer scaling
Blockstream, Rootstock and Drivechain | Sidechains
Hivemind and Augur | Prediction markets
Tierion and Factom | Records & Titles on the blockchain
BitMarkets, DropZone, Beaver and Open Bazaar | Decentralized markets
JoinMarket and Wasabi Wallet | CoinJoin implementation
Coinffeine and Bisq | Decentralized bitcoin exchanges
Keybase | Identity & Reputation management
Abra | Global P2P money transmitter network
Bitcore | Open source Bitcoin javascript library

Bitcoin Units

One Bitcoin is quite large (hundreds of £/$/€) so people often deal in smaller units. The most common subunits are listed below:
Unit | Symbol | Value | Info
bitcoin | BTC | 1 bitcoin | one bitcoin is equal to 100 million satoshis
millibitcoin | mBTC | 1,000 per bitcoin | used as the default unit in recent Electrum wallet releases
bit | bit | 1,000,000 per bitcoin | colloquial "slang" term for microbitcoin (μBTC)
satoshi | sat | 100,000,000 per bitcoin | smallest unit in bitcoin, named after its inventor
For example, assuming an arbitrary exchange rate of $10,000 for one Bitcoin, a $10 meal would equal: 0.001 BTC, or 1 mBTC, or 1,000 bits, or 100,000 satoshis.
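A quick sketch of that conversion arithmetic, using the subunit definitions from the table above (the $10,000 exchange rate is just the arbitrary example figure):

```python
SATS_PER_BTC = 100_000_000

def usd_to_units(usd_amount: float, usd_per_btc: float) -> dict:
    """Convert a dollar amount into the common bitcoin subunits from the table above."""
    sats = round(usd_amount / usd_per_btc * SATS_PER_BTC)
    return {
        "satoshis": sats,
        "bits": sats / 100,          # 1 bit = 100 satoshis
        "mBTC": sats / 100_000,      # 1 mBTC = 100,000 satoshis
        "BTC": sats / SATS_PER_BTC,
    }

print(usd_to_units(10, 10_000))
# -> {'satoshis': 100000, 'bits': 1000.0, 'mBTC': 1.0, 'BTC': 0.001}
```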
For more information check out the Bitcoin units wiki.
Still have questions? Feel free to ask in the comments below or stick around for our weekly Mentor Monday thread. If you decide to post a question in /Bitcoin, please use the search bar to see if it has been answered before, and remember to follow the community rules outlined on the sidebar to receive a better response. The mods are busy helping manage our community so please do not message them unless you notice problems with the functionality of the subreddit.
Note: This is a community created FAQ. If you notice anything missing from the FAQ or that requires clarification you can edit it here and it will be included in the next revision pending approval.
Welcome to the Bitcoin community and the new decentralized economy!
submitted by BitcoinFan7 to Bitcoin [link] [comments]

d down, k up, everybody's a game theorist, titcoin, build wiki on Cardano, (e-)voting, competitive marketing analysis, Goguen product update, Alexa likes Charles, David hates all, Adam in and bros in arms with the scientific counterparts of the major cryptocurrency groups, the latest AMA for all!

Decreasing d parameter
Just signed the latest change management document, I was the last in the chain so I signed it today for changing the d parameter from 0.52 to 0.5. That means we are just about to cross the threshold here in a little bit for d to fall below 0.5 which means more than half of all the blocks will be made by the community and not the OBFT nodes. That's a major milestone and at this current rate of velocity it looks like d will decrement to zero around March so lots to do, lots to talk about. Product update, two days from now, we'll go ahead and talk about that but it crossed my desk today and I was really happy and excited about that and it seemed like yesterday that d was equal to one and people were complaining that we delayed it by an epoch and now we're almost at 50 percent. For those of you who want parameter-level changes, k-level changes, they are coming and there's an enormous internal conversation about it and we've written up a powerpoint presentation and a philosophy document about why things were designed the way that they're designed.
Increasing k parameter and upcoming security video and everybody's a game theorist
My chief scientist has put an enormous amount of time into this. Aggelos is very passionate about this particular topic and what I'm going to do is similar to the security video that I did where I did an hour and a half discussion about a best practice for security. I'm going to actually do a screencasted video where I talk about this philosophy document and I'm going to read the entire document with annotations with you guys and kind of talk through it. It might end up being quite a long video. It could be several hours long but I think it's really important to talk around the design philosophy of this. It's kind of funny, everybody, when they see a cryptographic paper or math paper, they tend to just say okay you guys figure that out. No one's an expert in cryptography or math and you don't really get strong opinions about it, but with game theory, despite the fact that the topic is as complex and in some cases more complex, you tend to get a lot of opinions and everybody's a game theorist. So, there was an enormous amount of thought that went into the design of the system, the parameters of the system, everything from the reward functions to other things, and it's very important that we explain that thought process in as detailed a way as possible. Once we at least explain the philosophy behind it, I feel that the community is in a really good position to start working on the change management. It is my position that I'd love to see k largely increased. I do think that the software needs some improvements to get there, especially partial delegation, delegation portfolios, and some enhancements to the operation of staking.
E-voting
I'd love to see the existence of hybrid wallets where you have a cold part and a hot part, and we've had a lot of conversations about that and we will present some of the progress in that matter at the product updates. If not this October certainly in November. A lot of commercialization going along, a lot of things going on and flowing around and, you know, commercial teams working hard. As I mentioned we have a lot of deals in the pipeline. The Wyoming event was half political, half sales. We were really looking into e-voting and we had very productive conversations along those lines. It is my goal that Cardano e-voting software is used in political primaries and my hope is for it to eventually be used in municipal and state and eventually federal elections, and then in national elections for countries like Ethiopia, Mongolia and other places. Now there is a long road, a long, long road to get there and many little victories that have to come first, but this event in Wyoming was kind of the opener into that conversation. There were seven independent parties at the independent national convention and we had a chance to talk to the leadership of many of them. We will also engage in conversation with the libertarian party leadership as well, and at the very least we could talk about e-voting and also blockchain-based voting for primaries; that would be a great start, and we'll also look into the state of Wyoming for that as well. We'll, you know, tell you guys about that in time. We've already gotten a lot of inquiries about e-voting software. We tend to get them along with the (Atala) Prism inquiries. It's actually quite easy to start conversations but there are a lot of security properties that are very important, like end-to-end verifiability, hybrid ballots where you have both a digital and a paper ballot, delegation mechanics, as well as privacy mechanics that are interesting on a case-by-case basis.
Goguen, voting, future fund3, competitive marketing analysis of Ouroboros vs. EOS, Tezos, Algorand, ETH2 and Polkadot, new creative director
We'll keep chipping away at that, a lot of Goguen stuff to talk about but I'm going to reserve all of that for two days from now for the product update. We're right in the middle, Goguen metadata was the very first part of it. We already have some commercialization platform as a result of metadata, more to come and then obviously lots of smart contract stuff to come. This update and the November update are going to be very Goguen focused and also a lot of alternatives as well. We're still on schedule for an HFC event in I think November or December. I can't remember, but that's going to be carrying a lot of things related to multisig and token locking. There's some ledger rule changes so it has to be an HFC event and that opens up a lot of the windows for Goguen foundations as well as voting on chain, so fund3 will benefit very heavily from that. We're right in the guts of Daedalus right now building the voting center, the identity center, QR-code work. All this stuff, it's a lot of stuff, you know, the cell phone app was released last week. Kind of an early beta, it'll go through a lot of rapid iterations every few weeks. We'll update it, google play is a great foundation to launch things on because it's so easy to push updates to people automatically, so you can rapidly iterate and be very agile in that framework, and you know we've already had 3500 people involved heavily in the innovation management platform ideascale and we've got numerous bids from everyone, from John Buck and the sociocracy movement to others. A lot of people want to help us improve that and we're going to see steady and systematic growth there. We're still chipping away at product marketing. Liza (Horowitz) is doing a good job, I meet with her two or three times a week and right now it's Ouroboros, Ouroboros, Ouroboros... We're doing competitive analysis of Ouroboros versus EOS, Tezos, Algorand, ETH2 and Polkadot. We think that's a good set. We think we have a really good way of explaining it. David (David Likes Crypto, now at IOHK) has already made some great content. We're going to release that soon alongside some other content and we'll keep chipping away at that.
We also just hired a creative director for IO Global. His name's Adam, an incredibly experienced creative director, he's worked for Mercedes-Benz and dozens of other companies. He does very good work and he's been doing this for well over 20 years, and so the very first set of things he's going to do is work with commercial and marketing on product marketing, in addition to building great content, where the hope is to make that content as pretty as possible. We have Rod heavily involved in that as well to talk about distribution channels and see if we can amplify the distribution message and really get a lot of stuff done. Last thing to mention, oh yeah, iOS for Catalyst. We're working on that, we submitted it to the apple store, the iOS store, but it takes a little longer to get approval for that than it does with google play, so that's been submitted and it's up to Apple when or whether it gets approved. Takes a little longer for cryptocurrency stuff.
Wiki shizzle and battle for crypto, make crypto articles on wiki great again, Alexa knows Charles, Everpedia meets Charles podcast, holy-grail land of Cardano, wiki on Cardano, titcoin
Wikipedia... kind of rattled the cage a little bit. Through an intermediary we got in contact with Jimmy Wales. Larry Sanger, the other co-founder, also reached out to me and the everpedia guys reached out to me. Here's where we stand: we have an article, it has solidified, it's currently labeled as unreliable and you should not believe the things that are said in it, which is David Gerard's work if you look at the edits. We will work with the community and try to get that article to a fair and balanced representation of Cardano, and especially after the product marketing comes through and we clearly explain the product, I think the Cardano article can be massively strengthened. I've told Rod to work with some specialized people to try to get that done, but we are going to work very hard at a systematic improvement campaign for all of the scientific articles related to blockchain technology in the cryptocurrency space. They're just terrible, if you go to the proof of work article, the proof of stake or all these things, they're just terrible. They're not well written, they're out of date and they don't reflect an adequate sampling of the science. I did talk to my chief scientist Aggelos and what we're gonna do is reach out to the scientific counterparts at most of the major cryptocurrency groups that are doing research and see if they want to work with us on an industry-wide effort to systematically improve the scientific articles in our industry so that there is a fair and balanced representation of what the current state of the art is, the criticisms, the trade-offs as well as the reference space, and of course obviously we'll do quite well in that respect because we've done the science. We're the inheritor of it, but it's a shame because when people search proof of stake on google, usually the wikipedia results are highly biased. We care about wikipedia because google cares about wikipedia, amazon cares about wikipedia.
If you ask Alexa who is Charles Hoskinson, the reason why Alexa knows is because it's reading directly from the wikipedia page. If I didn't have a wikipedia page Alexa wouldn't know that, so if somebody says "Alexa, what is Cardano" it's going to read directly from the wikipedia page, and we can either just pretend that reality doesn't exist or we can accept it, and we as a community, working with partners in the broader cryptocurrency community, can universally improve the quality of cryptocurrency pages. There's been a pattern of commercial censorship on wikipedia for cryptocurrencies in general since bitcoin itself. In fact I think the bitcoin article was actually taken down once back in, it might have been, 2010 or 2009, but basically wikipedia has not been a friend of cryptocurrencies. That's why everpedia exists, and actually their founders reached out to me and I talked to them over twitter through PMs and we agreed to actually do a podcast. I'm going to do a StreamYard stream with these guys and they'll come on, talk all about everpedia and what they do and how they are, and we'll kind of go through the challenges that they've encountered, how their platform works and so forth, and obviously if they want to ever leave that terrible ecosystem EOS and come to the holy-grail land of Cardano we'd be there to help them out. At least they can tell the world how amazing their product is and also the challenges they're having to overcome. We've also been in great contact with Larry Sanger.
He's going to do an internal seminar at some point with us and talk about some protocols he's been developing since he left wikipedia, specifically to decentralize knowledge management and have a truly decentralized encyclopedia. I'm really looking forward to that and I hope that presentation gives us some inspiration as an ecosystem of things we can do. That's a great piece of infrastructure regardless, and after we learn a lot more about it and we talk to a lot of people in the ecosystem, if we can't get people to move on over, it would be really good to see people, through ideascale and the innovation management platform, utilize the dc fund to build their own variant of wikipedia on Cardano. In the coming months there will certainly be funding available. If you guys are so passionate about this particular problem that you want to go solve it then I'd be happy to play Elon Musk with the hyperloop and write a white paper on a protocol design and really give a good first start, and then you guys can go and try to commercialize that technology as Cardano native assets and Plutus smart contracts in addition to other pieces of technology that have to be brought in to make it practical.
Right now we're just in the "let's talk to everybody" phase, and we'll talk to the everpedia guys, we're going to talk to Larry and we're going to see whoever else is in this game, and of course we have to accept the incumbency as it is. So, we're working with obviously the wikipedia side to improve the quality of not only our article but all of the articles and the scientific side of things, so that there's a fair and accurate representation of information. One of the reasons why I'm so concerned about this is that I am very worried that Cardano projects will get commercially censored like we were commercially censored. So, yes we do have a page, but it took five years to get there and we're a multi-billion dollar project with hundreds of thousands of people. If you guys are doing cutting-edge novel interesting stuff I don't want your experience to be the same as ours, where you have to wait five years for your project to get a page even after governments adopted it. That's absurd, no one should be censored ever. This is very much a fight for the entire ecosystem, the entire community, not just Cardano but all cryptocurrencies: bitcoin, ethereum and Cardano have all faced commercial censorship and article deletions during their tenure, so I don't want you guys to go through that. I'm hoping we can improve that situation, but you know you don't put all your eggs in one basket, and frankly the time has come for wikipedia to be fully decentralized and liberated from a centralized organization and massively variable quality in the editor base. If Legends of Valor has a page, and titcoin, a pornography coin from 2015 that's deprecated and no one uses, has a page, but Cardano didn't have one until recently and couldn't get one, there's something seriously wrong with the quality control mechanism and we need to improve that, so it'll get done.
submitted by stake_pool to cardano [link] [comments]

Taproot, CoinJoins, and Cross-Input Signature Aggregation

It is a very common misconception that the upcoming Taproot upgrade helps CoinJoin.
TLDR: The upcoming Taproot upgrade does not help equal-valued CoinJoin at all, though it potentially increases the privacy of other protocols, such as the Lightning Network, and escrow contract schemes.
If you want to learn more, read on!

Equal-valued CoinJoins

Let's start with equal-valued CoinJoins, the type JoinMarket and Wasabi use. What happens is that some number of participants agree on some common value all of them use. With JoinMarket the taker defines this value and pays the makers to agree to it; with Wasabi the server defines a value of approximately 0.1 BTC.
Then, each participant provides inputs that they unilaterally control, totaling equal or greater than the common value. Typically since each input is unilaterally controlled, each input just requires a singlesig. Each participant also provides up to two addresses they control: one of these will be paid with the common value, while the other will be used for any extra value in the inputs they provided (i.e. the change output).
The participants then make a single transaction that spends all the provided inputs and pays out to the appropriate outputs. The inputs and outputs are shuffled in some secure manner. Then the unsigned transaction is distributed back to all participants.
Finally, each participant checks that the transaction spends the inputs it provided (and more importantly does not spend any other coins it might own that it did not provide for this CoinJoin!) and that the transaction pays out to the appropriate address(es) it controls. Once they have validated the transaction, they ratify it by signing for each of the inputs it provided.
Once every participant has provided signatures for all inputs it registered, the transaction is now completely signed and the CoinJoin transaction is now validly confirmable.
CoinJoin is a very simple and direct privacy boost, it requires no SCRIPTs, needs only singlesig, etc.
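To make the construction concrete, here is a minimal, non-cryptographic sketch of the coordination described above: each participant registers inputs and two addresses, and the coordinator assembles one shuffled transaction template for everyone to verify and sign. The names and the transaction representation are invented for illustration; real implementations such as JoinMarket and Wasabi of course build and sign actual Bitcoin transactions.

```python
import random
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    inputs: list          # (txid, vout, value_sats) coins this participant unilaterally controls
    mix_address: str      # receives the common value
    change_address: str   # receives any excess over the common value

def build_coinjoin_template(participants, common_value, fee_share=0):
    """Assemble the unsigned equal-valued CoinJoin transaction (toy model, amounts in sats)."""
    all_inputs, all_outputs = [], []
    for p in participants:
        total_in = sum(value for _, _, value in p.inputs)
        assert total_in >= common_value + fee_share, f"{p.name} did not provide enough value"
        all_inputs.extend(p.inputs)
        all_outputs.append((p.mix_address, common_value))        # indistinguishable output
        change = total_in - common_value - fee_share
        if change > 0:
            all_outputs.append((p.change_address, change))       # linkable change output
    random.shuffle(all_inputs)    # shuffle so ordering leaks nothing
    random.shuffle(all_outputs)
    return {"inputs": all_inputs, "outputs": all_outputs}

# Each participant then checks that the template spends only the coins they registered and
# pays their two addresses correctly, and signs only their own inputs.
```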

Privacy

Let's say we have two participants who have agreed on a common amount of 0.1 BTC. One provides a 0.105 coin as input, the other provides a 0.114 coin as input. This results in a CoinJoin with a 0.105 coin and a 0.114 coin as input, and outputs with 0.1, 0.005, 0.014, and 0.1 BTC.
Now obviously the 0.005 output came from the 0.105 input, and the 0.014 output came from the 0.114 input.
But the two 0.1 BTC outputs cannot be correlated with either input! There is no correlating information, since either output could have come from either input. That is how common CoinJoin implementations like Wasabi and JoinMarket gain privacy.
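A tiny brute-force check makes that explicit: every assignment of the two 0.1 BTC outputs to the two inputs is arithmetically consistent, so the amounts alone reveal nothing (fees ignored, as in the example above).

```python
from itertools import permutations

# Amounts in satoshis to avoid floating-point issues (0.105 BTC = 10_500_000 sats, etc.).
inputs = [10_500_000, 11_400_000]
mixed_outputs = [10_000_000, 10_000_000]
change_for = {10_500_000: 500_000, 11_400_000: 1_400_000}  # change trivially links to its input

# Try every way of assigning the equal-valued outputs to the inputs.
consistent = [
    assignment
    for assignment in permutations(mixed_outputs)
    if all(inp == out + change_for[inp] for inp, out in zip(inputs, assignment))
]
print(len(consistent))  # -> 2: every mapping fits, so the 0.1 BTC outputs are unlinkable
```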

Banning CoinJoins

Unfortunately, large-scale CoinJoins like that made by Wasabi and JoinMarket are very obvious.
All you have to do is look for transactions where, say, more than 3 outputs are the same equal value, and the number of inputs is equal to or larger than the number of equal-valued outputs. Thus, it is trivial to identify equal-valued CoinJoins made by Wasabi and JoinMarket. You can even trivially differentiate them: Wasabi equal-valued CoinJoins are going to have a hundred or more inputs, with outputs that are in units of approximately 0.1 BTC, while JoinMarket CoinJoins have equal-valued outputs of less than a dozen (between 4 to 6 usually) and with the common value varying wildly from as low as 0.001 BTC to as high as a dozen BTC or more.
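That heuristic is simple enough to sketch directly. The functions below are an illustrative approximation of the kind of filter a chain-analysis firm might run, not any particular firm's actual rules; the thresholds come from the rough numbers in the paragraph above.

```python
from collections import Counter

def looks_like_equal_valued_coinjoin(input_values, output_values):
    """Flag transactions with more than 3 equal-valued outputs and at least as many inputs."""
    _, count = Counter(output_values).most_common(1)[0]
    return count > 3 and len(input_values) >= count

def rough_classification(input_values, output_values):
    """Very rough Wasabi/JoinMarket split based on the shapes described above."""
    if not looks_like_equal_valued_coinjoin(input_values, output_values):
        return "not an obvious equal-valued CoinJoin"
    if len(input_values) >= 100:
        return "Wasabi-like (hundreds of inputs, ~0.1 BTC denominations)"
    return "JoinMarket-like (a handful of equal-valued outputs)"
```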
This has led a number of anti-privacy exchanges to refuse to credit custodially-held accounts if the incoming deposit is within a few hops of an equal-valued CoinJoin, usually citing concerns about regulations. Crucially, the exchange continues to hold private keys for those "banned" deposits, and can still spend them, thus this is effectively a theft. If your exchange does this to you, you should report that exchange as stealing money from its customers. Not your keys not your coins.
Thus, CoinJoins represent a privacy tradeoff: the equal-valued outputs become unlinkable to any particular input, but the transaction itself is obviously a CoinJoin, and some custodial services will punish you for being near one.

Taproot

Let's now briefly discuss that nice new shiny thing called Taproot.
Taproot includes two components: Schnorr signatures for the new output type, and Merkelized script trees, where an output commits to a single public key that can either sign directly (the "key path") or reveal one branch of a hidden tree of alternative scripts (the "script path").
This has some nice properties: as long as the key path is used, every Taproot spend looks like a plain single-signature payment on-chain, so multisignatures, Lightning channel closes and escrow contracts can become indistinguishable from ordinary spends - which is exactly why the TLDR above says Taproot can help the privacy of those other protocols.

Taproot DOES NOT HELP CoinJoin

So let's review!
CoinJoin: uses plain singlesig inputs - no SCRIPTs, no multisignatures - just many participants contributing to one transaction.
Taproot: makes SCRIPTs, multisignatures and complex contracts cheaper and more private by letting them look like single-signature spends.
There is absolutely no overlap. Taproot helps things that CoinJoin does not use. CoinJoin uses things that Taproot does not improve.

B-but They Said!!

A lot of early reporting on Taproot claimed that Taproot benefits CoinJoin.
What they are confusing is that earlier drafts of Taproot included a feature called cross-input signature aggregation.
In current Bitcoin, every input, to be spent, has to be signed individually. With cross-input signature aggregation, all inputs that support this feature are signed with a single signature that covers all those inputs. So for example if you would spend two inputs, current Bitcoin requires a signature for each input, but with cross-input signature aggregation you can sign both of them with a single signature. This works even if the inputs have different public keys: two inputs with cross-input signature aggregation effectively define a 2-of-2 public key, and you can only sign for that input if you know the private keys for both inputs, or if you are cooperatively signing with somebody who knows the private key of the other input.
This helps CoinJoin costs. Since CoinJoins will have lots of inputs (each participant will provide at least one, and probably will provide more, and larger participant sets are better for more privacy in CoinJoin), if all of them enabled cross-input signature aggregation, such large CoinJoins can have only a single signature.
This complicates the signing process for CoinJoins (the signers now have to sign cooperatively) but it can be well worth it for the reduced signature size and onchain cost.
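To see why the cost savings matter for CoinJoins specifically, here is a rough back-of-the-envelope estimate of the witness bytes saved when n input signatures collapse into one. The 64-byte Schnorr signature size is real; treating the saving as simply (n - 1) signatures is a simplification that ignores the other per-input data, which aggregation does not remove.

```python
SCHNORR_SIG_BYTES = 64  # BIP340 Schnorr signatures are 64 bytes

def witness_bytes_saved(num_inputs: int) -> int:
    """Approximate witness bytes saved by aggregating all input signatures into one."""
    return (num_inputs - 1) * SCHNORR_SIG_BYTES

# A Wasabi-sized CoinJoin with ~100 inputs would shed roughly 99 signatures:
print(witness_bytes_saved(100))  # -> 6336 bytes of witness data
```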
But note that while cross-input signature aggregation improves the cost of CoinJoins, it does not improve the privacy! Equal-valued CoinJoins are still obvious and still readily bannable by privacy-hating exchanges. It does not improve the privacy of CoinJoin. Instead, see https://old.reddit.com/Bitcoin/comments/gqb3udesign_for_a_coinswap_implementation_fo

Why isn't cross-input signature aggregation in?

There's some fairly complex technical reasons why cross-input signature aggregation isn't in right now in the current Taproot proposal.
The primary reason was to reduce the technical complexity of Taproot, in the hope that it would be easier to convince users to activate (while support for Taproot is quite high, developers have become wary of being hopeful that new proposals will ever activate, given the previous difficulties with SegWit).
The main technical complexity here is that it interacts with future ways to extend Bitcoin.
The rest of this writeup assumes you already know about how Bitcoin SCRIPT works. If you don't understand how Bitcoin SCRIPT works at the low level, then the TLDR is that cross-input signature aggregation complicates how to extend Bitcoin in the future, so it was deferred to let the developers think more about it.
(this is how I understand it; perhaps pwuille or ajtowns can give a better summary.)
In detail, Taproot also introduces OP_SUCCESS opcodes. If you know about the OP_NOP opcodes already defined in current Bitcoin, well, OP_SUCCESS is basically "OP_NOP done right".
Now, OP_NOP is a do-nothing operation. It can be replaced in future versions of Bitcoin by having that operation check some condition, and then fail if the condition is not satisfied. For example, both OP_CHECKLOCKTIMEVERIFY and OP_CHECKSEQUENCEVERIFY were previously OP_NOP opcodes. Older nodes will see an OP_CHECKLOCKTIMEVERIFY and think it does nothing, but newer nodes will check if the nLockTime field has a correct specified value, and fail if the condition is not satisfied. Since most of the nodes on the network are using much newer versions of the node software, older nodes are protected from miners who try to misspend any OP_CHECKLOCKTIMEVERIFY/OP_CHECKSEQUENCEVERIFY, and those older nodes will still remain capable of synching with the rest of the network: a dedication to strict backward-compatibility necessary for a consensus system.
Softforks basically mean that a script that passes in the latest version must also be passing in all older versions. A script cannot be passing in newer versions but failing in older versions, because that would kick older nodes off the network (i.e. it would be a hardfork).
But OP_NOP is a very restricted way of adding opcodes. Opcodes that replace OP_NOP can only do one thing: check if some condition is true. They can't push new data on the stack, they can't pop items off the stack. For example, suppose instead of OP_CHECKLOCKTIMEVERIFY, we had added a OP_GETBLOCKHEIGHT opcode. This opcode would push the height of the blockchain on the stack. If this command replaced an older OP_NOP opcode, then a script like OP_GETBLOCKHEIGHT 650000 OP_EQUAL might pass in some future Bitcoin version, but older versions would see OP_NOP 650000 OP_EQUAL, which would fail because OP_EQUAL expects two items on the stack. So older versions will fail a SCRIPT that newer versions will pass, which is a hardfork and thus a backwards incompatibility.
OP_SUCCESS is different. Instead, old nodes, when parsing the SCRIPT, will see OP_SUCCESS, and, without executing the body, will consider the SCRIPT as passing. So, the OP_GETBLOCKHEIGHT 650000 OP_EQUAL example will now work: a future version of Bitcoin might pass it, and existing nodes that don't understand OP_GETBLOCKHEIGHT will see OP_SUCCESS 650000 OP_EQUAL, and will not execute the SCRIPT at all, instead passing it immediately. So a SCRIPT that might pass in newer versions will pass for older versions, which keeps the back-compatibility consensus that a softfork needs.
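The behavioural difference is easy to see in a toy interpreter. This sketch models only the single property discussed here (how upgrade hooks behave on old nodes), not real Bitcoin SCRIPT: a reserved OP_NOP opcode is executed as a no-op, while an OP_SUCCESS opcode makes the whole script pass at parse time, before anything runs.

```python
def eval_script_old_node(script):
    """Toy evaluation as an *old* node that predates the new opcode's meaning."""
    # OP_SUCCESS semantics: if the opcode appears anywhere, pass without executing.
    if any(op.startswith("OP_SUCCESS") for op in script):
        return True
    stack = []
    for op in script:
        if op.startswith("OP_NOP"):
            continue                       # OP_NOP semantics: execute, but do nothing
        elif op == "OP_EQUAL":
            if len(stack) < 2:
                return False               # OP_EQUAL needs two stack items
            stack.append(stack.pop() == stack.pop())
        else:
            stack.append(op)               # treat anything else as a data push
    return bool(stack) and stack[-1] is True

# A future opcode that pushes the block height (650000 here) keeps old nodes passing
# if it replaces an OP_SUCCESS, but would fail on old nodes if it replaced an OP_NOP:
print(eval_script_old_node(["OP_SUCCESS80", "650000", "OP_EQUAL"]))  # -> True (passes unexecuted)
print(eval_script_old_node(["OP_NOP4", "650000", "OP_EQUAL"]))       # -> False (hardfork risk)
```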
So how does OP_SUCCESS make things difficult for cross-input signature aggregation? Well, one of the ways to ask for a signature to be verified is via the opcode OP_CHECKSIGVERIFY. With cross-input signature aggregation, if a public key indicates it can be used for cross-input signature aggregation, instead of OP_CHECKSIGVERIFY actually requiring the signature on the stack, the stack will contain a dummy 0 value for the signature, and the public key is instead added to a "sum" public key (i.e. an n-of-n that is dynamically extended by one more pubkey for each OP_CHECKSIGVERIFY operation that executes) for the single signature that is verified later by the cross-input signature aggregation validation algorithm.
The important part here is that the OP_CHECKSIGVERIFY has to execute, in order to add its public key to the set of public keys to be checked in the single signature.
But remember that an OP_SUCCESS prevents execution! As soon as the SCRIPT is parsed, if any opcode is OP_SUCCESS, that is considered as passing, without actually executing the SCRIPT, because the OP_SUCCESS could mean something completely different in newer versions and current versions should assume nothing about what it means. If the SCRIPT contains some OP_CHECKSIGVERIFY command in addition to an OP_SUCCESS, that command is not executed by current versions, and thus they cannot add any public keys given by OP_CHECKSIGVERIFY. Future versions also have to accept that: if they parsed an OP_SUCCESS command that has a new meaning in the future, and then execute an OP_CHECKSIGVERIFY in that SCRIPT, they cannot add the public key into the same "sum" public key that older nodes use, because older nodes cannot see them. This means that you might need more than one signature in the future, in the presence of an opcode that replaces some OP_SUCCESS.
Thus, because of the complexity of making cross-input signature aggregation work compatibly with future extensions to the protocol, cross-input signature aggregation was deferred.
submitted by almkglor to Bitcoin [link] [comments]

Crypto Weekly News — October, 9

What important crypto events happened last week?
Cryptocurrencies
The AAVE Token Arrives On Gemini
Regulated cryptocurrency exchange Gemini has added support for AAVE (formerly LEND), the token of another protocol widely used in DeFi. At the moment, only custody and deposit services are available; trading will arrive shortly.
RSK & RIF Integrate DAI Stablecoin
The RSK team, which developed a sidechain extension of Bitcoin with support for smart contracts, announced that the stablecoin DAI is now available on its platform. This integration allows users to transfer DAI from Ethereum to the RSK sidechain, making the stablecoin available for usage in a DeFi ecosystem supported by Bitcoin.
Litecoin Inches Closer To Greater Privacy With Mimblewimble Testnet
The long-awaited Litecoin update, designed to increase the privacy level of the network, has come closer due to the test launch of MimbleWimble technology. Charlie Lee first announced his plans to integrate this technology in early 2019, but progress only appeared with the arrival of the Grin++ developer David Burkett in December 2019. The MimbleWimble update will allow users to hide their transactions and personal data.
Projects And Updates
Access To DeFi Oracles: Radix Integrates Chainlink
Radix announced integration with Chainlink to make DeFi oracles more accessible to developers, which will facilitate the spread of traditional financial services through decentralized applications. Radix is a layer-one protocol created specifically for DeFi. According to its CEO Piers Ridyard, the data access that developers will get after integrating with Chainlink is vital for providing the best infrastructure to build next-generation DeFi products.
Brave Websites And Browser Now Available On TOR
Brave web browser, which has user privacy as its top priority, has announced that its websites can now be accessed directly from the dark web through .onion addresses. Greater integration with Tor will make users' experience with its services even more secure.
Bitfinex Starts Staking On Cardano
The Bitfinex cryptocurrency exchange has announced the launch of Cardano (ADA) staking. There is no minimum amount to stake, and ADA stakers can expect an income of up to 4.3% per annum with weekly payouts. In most cases, users will be able to withdraw funds immediately.
Switcheo Launches Zilswap, First Decentralized Exchange DEX On Zilliqa
The team of developers of the decentralized trading platform Switcheo will launch an analog of the Uniswap exchange, specialized in trading the DeFi market’s tokens. The exchange named Zilswap will be based on the scalable Zilliqa blockchain and will be the first of its kind.
Hacking
KuCoin Exchange Hackers Identified
The KuCoin exchange's CEO, Johnny Lyu, announced that his team has found out who committed the September 26 hack of the site. In his tweet, he noted that the company has substantial proof of the suspects' guilt and that the case is being handled by law enforcement officials. According to the official statement, the company has enough funds to cover all losses.
Bungled Theft Of Bitcoin ATM Puts Canadian Business Out Of Action
In Kelowna, Canada, an attempt to steal a Bitcoin ATM ended in failure. The unlucky burglars could not take the ATM away but caused significant damage to the building with their truck.
Regulations
A Digital Euro May Be Imminent: ECB Could Launch Digital Euro Project In 2021
The European Central Bank is considering its cryptocurrency project. The decision will be made in mid-2021 after conducting surveys among EU citizens and consulting financial experts.
People
John McAfee Could Face A 5-year Jail Term Over Concealed Crypto-Assets And Tax Evasion Allegation
John McAfee, the founder of the antivirus company McAfee, may end up in prison for tax evasion. His charge includes allegations of cryptocurrency fraud, registering property in the names of third parties, and using false names when making bank transactions.
Winklevoss-Founded Crypto Exchange Gemini Hires Former Morgan Stanley Exec
Gemini crypto exchange has hired Andy Meehan, a former executive at Morgan Stanley investment bank, to expand in the Asia-Pacific region. According to the press release, Meehan will work with regulators "to promote smart regulations that drive adoption in this growing market".
Exclusive Interview With David Waslen From HedgeTrade
David Waslen, the CEO and co-founder of HedgeTrade, gave CoinJoy an interview in which he talked about new technologies, the accuracy of market predictions, and shared exclusive news about the project.
That’s all for now! For more details follow us on Twitter, subscribe to our YouTube channel, join our Telegram.
submitted by CoinjoyAssistant to CryptoCurrencies [link] [comments]

Solution Life - New payments solution

Solution Life - New payments solution
Solution Life is an open-source platform that enables the creation of peer-to-peer marketplace and e-commerce applications.
Solution Life aims to build a global sharing economy, allowing buyers and sellers to use segments of goods and services (car sharing, service missions, home sharing, etc.) and transact on the open, distributed web. Using the Ethereum blockchain and the InterPlanetary File System (IPFS), the platform and its participants can interact in a peer-to-peer model, allowing services and goods to be created and listed without going through traditional middlemen. We plan to build a large-scale commercial network:
• Exchange financial value directly (listing, transactions and service fees) from big corporations like Airbnb, Craigslist, Postmate, ... to individual buyers and retailers.
• Exchange financial value and strategic value (internal aggregation of customer and transaction data) from similar corporations to entire ecosystems
• Create new financial value for market participants who contribute to platform development (e.g. building new technology for the Solution Life platform, developing new vertical products and introducing new users and businesses)
• Build the open, distributed, and shared data layer to promote transparency and collaboration
• Allow buyers and sellers around the world to transact without the friction of currency conversion or tariffs
• Promote personal freedom by not allowing a corporation or central government to impose arbitrary and overly restrictive rules on business operation.
To pursue these ambitious goals, we created the Solution Life Platform with programs that encourage technologists, businesses and consumers to build, contribute to, and expand the ecosystem. We plan to build a broad collection of vertical industry applications (e.g. short-term vacation rentals, freelance software engineering, tutoring) built on Solution Life's standards and shared data. At the time of writing, the Solution Life platform is in Mainnet Beta. Platform Version 1.0 is expected to go live in Q3 2020. While the majority of engineering work is currently being done by the core engineering team, we expect that after the launch of Platform 1.0, future development will increasingly come from open-source community members. Together, we will create the Internet economy of the future.
Details of Whitepaper:
• Why is a new model of peer-to-peer trading necessary?
• Benefits proposed on the Solution Life Platform
• Product strategy, main features and technical overview
• Overview of the Solution Life team and community
BACKGROUND
Since the emergence of the Internet, digital marketplaces have connected buyers and sellers of goods and services, enabling transactions that were never possible before. Craigslist launched in 1995 and dominated local and regional commerce for many years. Around the same time, eBay began to grow and created a whole new category of auction-based sales, a more efficient way of doing business in a marketplace. Through 20 years of rapid change, many Internet marketplace businesses, in both B2C and B2B, have developed strongly. Today, sharing-economy marketplaces such as Airbnb, Uber, Getaround, Fiverr and TaskRabbit have been very successful in matching buyers and sellers. Fractional use of assets can now be sold as easily as whole items, and people around the world are exchanging their excess inventory, time and skills for profit. New markets, including the gig economy, the service sector and fractional asset usage, are particularly well suited to serve as the basis for peer-to-peer systems built on blockchain.
Most sharing-economy businesses have some points in common. First, as a group, these companies have made a big impact on the world. Consumers in these markets have been able to improve their lives with access to products and services they did not have before. Vendors have used these platforms to reach customers on a larger scale and more easily than before. Each marketplace creates a "private home" for consumers and suppliers to transact in, creating liquidity for that market.
Second, most sharing-economy businesses follow the same growth cycle. With few exceptions, these famous marketplaces are difficult to launch and grow. Marketplace businesses often have to start with millions of dollars, and in the cases of Uber and Airbnb, these two businesses spent billions of dollars to scale. That is also why these businesses suffered serious losses in their early days: in effect, the corporation subsidizes the use of the marketplace for its users. However, thanks to strong cross-network effects, successful marketplace businesses can grow revenue exponentially over time, usually by charging a fee on each transaction in the network. Network-effect businesses such as sharing-economy marketplaces tend to dominate their category once they reach the growth stage, capturing a disproportionate share of the network's value for the corporation's management and its shareholders. In many ways, they become the sole dictator at the scale they achieve.
Finally, although there are huge differences in user experience, business mechanics and vertical-specific features among Internet marketplace companies, they all share many components that have been built and rebuilt many times over. Lyft, Postmate and DoorDash have each designed their own solutions for user and supplier profiles, shopping experiences, matching algorithms, reviews and ratings. On the one hand, this proprietary technology is valuable; on the other hand, rebuilding the same pieces every time a new vertical marketplace is created wastes time and effort. Consumers also end up creating and managing dozens of accounts across these marketplaces, each with its own personal data and transaction history.
In the last few years, blockchain technology innovators and investors have called on teams to build peer-to-peer versions of today's sharing-economy businesses and to make trading on the Internet more efficient.
"P2P lodging sites like Airbnb have already begun to transform the lodging industry by making a public market in private housing. However, adoption may be limited by concerns about safety and security (guests) and property damage (hosts). By enabling a secure, tamper-proof system for managing digital credentials and reputation, we believe blockchain could help accelerate the adoption of P2P lodging and generate…" - Goldman Sachs Research (Blockchain: Putting Theory into Practice)
Don Tapscott, the author of the "Blockchain Revolution", said that Bitcoin-based technology could be used to promote the interest in Uber and Airbnb. - The Wall Street Journal
"It is difficult for middle parties to achieve sustainable growth in business," [Fritz Joussen] said. "These platforms [tourism middle parties] build accessibility by spending billions of dollars on advertising, and then they generate exclusive profits based on what they have with sales and marketing. They provide great sales and marketing services. Booking.com is a big brand but they make outstanding profits because they own proprietary structures. Blockchain will destroy this." - Skift
However, most of the infrastructure and transmission systems needed to build distributed marketplace applications did not exist before Solution Life was born. We aim to address the shortcomings of current marketplace companies and are happy to have launched the Solution Life Platform, which opens up peer-to-peer commerce at scale.
ACTIVATE THE OVER THE COUNTER MARKET
Our vision is to build and develop a free exchange of services on the new Internet. To do this, we have to replicate on the blockchain and other distributed systems most, if not all, of the functionality of a third-party intermediary. This is an ambitious and technologically challenging goal, but we have already completed important milestones that demonstrate our technology and the real-world applications of the project. The Solution Life platform has 3 main elements, all of which are open source:
• Solution Life enabled end user applications
• Solution Life platform for developers
• Solution Life's application protocol
Solution Life-enabled end user applications: the Solution Life flagship marketplace app is our consumer marketplace product that allows buyers and sellers on the network to do business. It is available today on the web at shopSolution Life.com and on both iOS and Android mobile devices.
Summary
For the past two decades, Internet marketplaces and e-commerce stores have changed the way that buyers and sellers connect, creating new opportunities for the exchange of goods and services. However, these marketplaces have always been governed by centralized companies that maintain their individual monopolies on data, transaction and other service fees, and ultimately, user choice. With blockchain and other distributed technologies beginning to hit the mainstream, the world is poised for a new wave of decentralized commerce. SLC is bringing change and innovation to the global peer-to-peer economy. We're excited by the opportunity to lower fees, increase innovation, free customer and transaction data, and decrease censorship and unnecessary regulation. We are building a platform that invites other interested parties including developers and entrepreneurs to build this technology and community with us, altogether working to create the peer-to-peer economy of tomorrow. We hope you’ll join us on this exciting journey.
TOKEN SOLUTION LIFE (SLC)
The Solution Life Token (also known as SLC) is a utility token that serves multiple purposes in ensuring the health and growth of the network. The ERC20 contract is live on the Ethereum network today at:
0x4d44D6c288b7f32fF676a4b2DAfD625992f8Ffbd.
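For anyone who wants to inspect that contract, here is a minimal sketch using web3.py (a v5-style API is assumed) to read the standard ERC-20 metadata from the address above; the RPC endpoint is a placeholder you would replace with your own Ethereum provider.

```python
# Minimal sketch: read standard ERC-20 metadata for the SLC contract with web3.py.
# The RPC endpoint is a placeholder; substitute your own Ethereum provider.
from web3 import Web3

ERC20_ABI = [
    {"name": "name", "inputs": [], "outputs": [{"name": "", "type": "string"}],
     "stateMutability": "view", "type": "function"},
    {"name": "symbol", "inputs": [], "outputs": [{"name": "", "type": "string"}],
     "stateMutability": "view", "type": "function"},
    {"name": "decimals", "inputs": [], "outputs": [{"name": "", "type": "uint8"}],
     "stateMutability": "view", "type": "function"},
    {"name": "totalSupply", "inputs": [], "outputs": [{"name": "", "type": "uint256"}],
     "stateMutability": "view", "type": "function"},
]

w3 = Web3(Web3.HTTPProvider("https://YOUR-ETHEREUM-NODE"))  # placeholder endpoint
slc = w3.eth.contract(
    address=Web3.toChecksumAddress("0x4d44D6c288b7f32fF676a4b2DAfD625992f8Ffbd"),
    abi=ERC20_ABI,
)

decimals = slc.functions.decimals().call()
print(slc.functions.name().call(), slc.functions.symbol().call())
print(slc.functions.totalSupply().call() / 10 ** decimals)
```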
At a high level, this token is intended to serve a number of key functions on the platform. First, the SLC is a multi-purpose incentive token that is intended to drive the behavior of end users, developers, market operators, and other ecosystem participants. Additionally, the SLC is an exchange intermediary that can be used for payments between buyers and sellers on the platform. Ultimately, it is intended that SLC will serve a vital part in future network governance.
Since November 2020, the Solution Life token has been used to encourage various forms of participation from the platform's ecosystem participants. The Solution Life token is used to reward users, developers, marketplace operators and/or other participants for performing activities and services conducive to the platform's development.
Solution Life Rewards is an incentive program targeted at end users on the Platform. Buyers and sellers on the platform have been able to earn SLC since our inaugural Solution Life Rewards campaign in November 2020. Solution Life Rewards enables everyone to have a stake in the network. We've intentionally designed the program so that even novice, non-technical users can participate. With Solution Life Rewards, users can get SLC for account creation and identity verification. One of the best ways to grow the network is through referrals, so end users can also earn tokens by inviting new users. This creates more confidence between the buyer and the seller. Users can also earn SLC by following Solution Life's social networking sites or promoting project news on public channels.
To encourage trading volume on the Solution Life Platform, we also offer a refund mechanism for users who purchase from reputable sellers on our network.
Solution Life Commissions: encouraging marketplace developers and operators to use the Solution Life platform is essential, so we launched an advertising and promotion program that creates an integrated business model for the decentralized marketplaces running on Solution Life. Merchants on Solution Life apps can promote their listings using SLC for greater visibility in search and browse results on our preferred and partner apps. The only way to join this program is to pay with SLC. When a merchant creates a listing, they can add a commission paid in SLC to their listing. This SLC is held in escrow in the Marketplace Smart Contract.
submitted by slctoken to u/slctoken

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and since then I have had a keen interest in smart contract platforms. I'm passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has, in my opinion, found an elegant balance between being secure, decentralized and scalable.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on 'silica' (silicon dioxide), as in "silicon for the high-throughput consensus computer."
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical whitepaper was made public in August 2017, and since then they have achieved everything stated in it. They have also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, with 2400 nodes keeping the network decentralized and secure. The circulating supply is nearing 11 billion, and only mining rewards remain to be issued. The maximum supply is 21 billion, with annual inflation currently at 7.13%, and it will only decrease with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing, but that decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed applying sharding to a public smart contract blockchain so that the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralised, secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals, because these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned, there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a node becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of each DS Block. All candidate mining nodes compete with each other, run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed onto the network. Each DS Block, 10 new DS nodes are admitted. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s each. A shard node currently needs to provide around 8.53 GH/s (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
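To sanity-check those GPU figures, here is a quick back-of-the-envelope calculation using only the numbers quoted above (a sketch, not official Zilliqa tooling):

```python
# Back-of-the-envelope check of the GPU equivalents quoted above.
GTX_1070_HASHRATE_MHS = 35.4          # Ethash hashrate of one GTX 1070, in Mh/s

ds_node_difficulty_mhs = 2_000_000    # ~2 Th/s expressed in Mh/s
shard_node_difficulty_mhs = 8_530     # ~8.53 GH/s expressed in Mh/s

print(ds_node_difficulty_mhs / GTX_1070_HASHRATE_MHS)     # ~56,497 GPUs -> "55 thousand+"
print(shard_node_difficulty_mhs / GTX_1070_HASHRATE_MHS)  # ~241 GPUs   -> "around 240"
```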
 
The 60-second PoW cycle is a short burst of peak performance and acts as an entry ticket to the network. This entry ticket is a sybil-resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. The PoW process repeats after every 100 Tx Blocks, which corresponds to roughly 1.5 hours. In between, no PoW needs to be done, meaning Zilliqa's energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole, we first must understand why Zilliqa goes through all of the above technicalities, and understand a bit better what a blockchain is on a more fundamental level. Because the core of Zilliqa's consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions". For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let's try to simplify and compress it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn't matter whether you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light's state going from green to red (via amber) and another's from red to green.
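To make the analogy concrete, here is a tiny sketch of the traffic light as a state machine: every state and transition is known in advance, which is exactly what makes it so much simpler than a public blockchain.

```python
# A traffic light as a trivial state machine: all states and transitions are predefined.
TRANSITIONS = {
    ("green", "button_pressed"): "amber",
    ("amber", "timer_elapsed"): "red",
    ("red", "timer_elapsed"): "green",
}

def next_state(state: str, event: str) -> str:
    # Unknown events leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "green"
for event in ["button_pressed", "timer_elapsed", "timer_elapsed"]:
    state = next_state(state, event)
    print(state)  # amber, then red, then green
```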
 
With public blockchains like Zilliqa, this isn't so straightforward and simple. It started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions in them that everyone can verify, from block #1 to the current #647,000+ block. The state is ever changing, and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, it's rather insignificant compared to a public blockchain. Zilliqa consists of 2400 nodes that need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, and the block and DS height, and then hit refresh. Obviously, as expected, you will see new, incremented values for one or all of these parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communication rounds between nodes, and as such there is no GPU involved (only CPU), so the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) while the consensus protocol keeps functioning without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (66%+) do double-spend attacks become possible.
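As a rough sketch of those thresholds (using the exact pBFT bound of n >= 3f + 1 rather than a plain one third, which is why the first number comes out just under n/3):

```python
import math

def pbft_thresholds(n: int) -> dict:
    """Fault-tolerance thresholds for a pBFT committee of n nodes."""
    return {
        # pBFT stays live and safe with at most f faulty nodes, where n >= 3f + 1.
        "tolerated_without_stalling": (n - 1) // 3,
        # Above roughly 1/3 dishonest nodes the network stalls and a view change is needed.
        "stalls_above": math.floor(n / 3),
        # Above roughly 2/3 dishonest nodes double-spend attacks become possible.
        "double_spend_possible_above": math.floor(2 * n / 3),
    }

print(pbft_thresholds(600))  # e.g. a DS committee of 600 nodes
```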
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy consumption, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it's done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although this is lengthy already, we have only skimmed through some of the inner workings of Zilliqa's consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil-resistance mechanism, pBFT, etc. Another thing we haven't looked at yet is the degree of decentralization.
 
Decentralisation
 
Currently, there are four shards, each of them consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT. There is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will allow the network to keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them for the greater public. They were centralised at first. Decentralisation at the seed nodes level has been steadily rolled out since March 2020 ( Zilliqa Improvement Proposal 3 ). Currently the amount of seed nodes is being increased, they are public-facing and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
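As a quick illustration of how those figures relate (a sketch using only the ~610 MM ZIL and 10.03% numbers quoted above; not an official Zilliqa calculation):

```python
# Illustrative relationship between the staking figures quoted above.
annual_yield = 0.1003           # ~10.03% annual staking yield
total_stakeable_zil = 610e6     # ~610 MM ZIL that can be staked in total

# Annual payout to stakers implied by those two numbers:
annual_staking_rewards = total_stakeable_zil * annual_yield
print(f"{annual_staking_rewards / 1e6:.1f} MM ZIL per year")  # ~61.2 MM ZIL

# What an individual staker would earn on a given stake at that yield:
my_stake = 100_000
print(my_stake * annual_yield)  # ~10,030 ZIL per year
```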
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. The faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being 'object-oriented' or 'functional'. Here is an ELI5 given by a software development academy: "all programs have two basic components, data (what the program knows) and behavior (what the program can do with that data). So object-oriented programming states that combining data and related behaviors in one place, called an "object", makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity."
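A tiny illustration of that distinction (plain Python, not Scilla): the object-oriented version bundles the data and the behaviour that mutates it, while the functional version keeps data separate and passes it through pure functions.

```python
# Object-oriented style: data (the balance) and behaviour (deposit) live together.
class Account:
    def __init__(self, balance: int):
        self.balance = balance

    def deposit(self, amount: int) -> None:
        self.balance += amount      # mutates the object's state in place

# Functional style: data stays separate, behaviour is a pure function returning new data.
def deposit(balance: int, amount: int) -> int:
    return balance + amount         # no mutation; the old balance is untouched

acct = Account(100)
acct.deposit(50)
print(acct.balance)                 # 150

print(deposit(100, 50))             # 150, computed without changing any existing state
```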
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics, and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain, i.e. programming, means that you cannot afford to make mistakes; otherwise, it could cost you. It's all great and fun that blockchains are immutable, but updating your code because you found a bug isn't the same as with a regular web application, for example. And with smart contracts, it inherently involves cryptocurrencies in some form, and thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
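To make the gas idea concrete, here is a minimal sketch of the generic fee model used by gas-based platforms; the gas units and gas price below are illustrative placeholders rather than Zilliqa's actual parameters (only the ~0.001 ZIL figure for a simple transfer comes from the text above).

```python
# Generic fee model used by gas-based platforms: fee = gas consumed * gas price.
# The numbers below are illustrative placeholders, not Zilliqa's actual parameters.
def tx_fee(gas_used: int, gas_price_zil: float) -> float:
    return gas_used * gas_price_zil

simple_transfer_gas = 50     # hypothetical gas units for a plain A -> B payment
contract_call_gas = 5_000    # hypothetical gas units for a more complex contract call
gas_price = 0.00002          # hypothetical ZIL per unit of gas

print(tx_fee(simple_transfer_gas, gas_price))  # 0.001 ZIL, matching the figure quoted above
print(tx_fee(contract_call_gas, gas_price))    # 0.1 ZIL: more complex calls cost more
```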
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
"Scilla is being developed hand-in-hand with the formalization of its semantics and its embedding into the Coq proof assistant, a state-of-the-art tool for mechanized proofs about properties of programs."
 
Simply put, with Scilla and its accompanying tooling, developers can be mathematically sure and prove that the smart contract they've written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions it is sometimes possible if the address is in the same shard as the smart contract, but with Category 3 you definitely need communication between the shards. Solving that requires a set of communication rules that the protocol needs to follow in order to process all transactions in a generalised fashion.
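Here is a small sketch of that classification; the address-modulo shard assignment used below is a simplification for illustration, not Zilliqa's exact assignment rule.

```python
# Illustrative sketch of the three transaction categories described above.
# Assigning an address to a shard by taking it modulo the shard count is a
# simplification, not Zilliqa's exact rule.
NUM_SHARDS = 4

def shard_of(address: str) -> int:
    return int(address, 16) % NUM_SHARDS

def category(contracts: list) -> int:
    if not contracts:
        return 1   # Category 1: plain ZIL transfer from A to B
    if len(contracts) == 1:
        return 2   # Category 2: user interacts with a single smart contract
    return 3       # Category 3: one transaction triggers multiple smart contracts

def needs_cross_shard_communication(sender: str, contracts: list) -> bool:
    touched = {shard_of(sender)} | {shard_of(c) for c in contracts}
    return len(touched) > 1

print(category([]))                                           # 1
print(category(["0x3c4f"]))                                   # 2
print(needs_cross_shard_communication("0x1a2b", ["0x3c4f"]))  # depends on shard placement
```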
 
And this is where the downside of state sharding would currently come in. All shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don't need to shop around for information held on other shards, which would require more communication and add more complexity. If you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar; Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain; Gas Accounting; NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming in Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It's not only in technology that Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought these initiatives live to the market. There is also a growing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development of its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use cases of stablecoins. I recommend everybody to read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances instead of them being used solely for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX made by Switcheo is coming to Zilliqa, allowing cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I'm speculating that it will be a regulated USD stablecoin. Furthermore, XSGD has already been created and is visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
"There are two basic building blocks in DeFi/OpFi though: 1) stablecoins, as you need a non-volatile currency to get access to this market, and 2) a DEX to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have this ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not only from ASEAN or Singapore but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions onchain due to its ability to scale and its resulting low fees, which is why the UD team launched this on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance's margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have "tech people". They have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been named 'coin of the day' by LunarCrush many times (LunarCrush tracks real-time cryptocurrency value and social data). According to that data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last few months, Zilliqa seems to be on its own bull run. It was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started 'Zillacracy' together with the Zilliqa core team (see www.zillacracy.com). It's a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot and will also run their own non-custodial seed node for staking. This seed node will also allow them to start generating revenue, letting them become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I'm wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the 'power of the community'. This is something you cannot 'buy with money', and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have thousands of signups and plan to keep upgrading it so that more and more people can use it (e.g. they have recently added a quiz feature). They also use it during AMAs to reward people in real time. It's a very smart approach to growing their communities and getting people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven't covered everything (I'm also reaching the character limit, haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance Margin, Futures, Widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!
submitted by haveyouheardaboutit to CryptoCurrency

Weekly Wrap: This Week In Chainlink August 31 - September 6


Chainlink Hackathon - September 7th - 27th!

We've already received over 300 developer signups from over 40 countries for the Chainlink Hackathon. Get ready to join one of the biggest blockchain developer events of the year and build the next top DeFi dApp with support from our world-class mentors and a chance to win over $40K in prizes.

Announcements and Integrations 🎉

@synthetix_io is now fully #PoweredByChainlink, switching all its cryptocurrency and index synths to Chainlink's Price Reference Data due to its high-quality data, decentralization, & ability to scale the platform to secure more value.

Bitcoin smart contract platform @RSKsmart has successfully integrated Chainlink oracles via its @rif_os technology. This allows RSK developers to build smart contract applications connected to real-world data that are secured by the Bitcoin blockchain.

@01node is a new node operator supporting Chainlink's live Price Reference Contracts. As operators experienced in securing millions of USD for PoS chains, they bring added decentralization by running their own physical servers that further minimize cloud dependencies.

Decentralized lending protocol @useteller is live on testnet consuming Chainlink's Price Reference Data for ETH/USD, BTC/USD & LINK/USD. These price feeds help Teller ensure that all APR calculations for unsecured loans reflect real market conditions.

Telecommunications blockchain @QLCchain is integrating Chainlink to make its aggregated data available to smart contracts. This data can power new DeFi applications like automating payments between telco providers, tokenizing telco infrastructure & more.

Social reputation score provider @DecentrNet is integrating Chainlink to allow users to share their data in #DeFi dApps to obtain better interest & collateralization rates. Chainlink also helps Decentr make this key data available across any blockchain.

@opium_network is using Chainlink's USDT/USD Price Reference Data live on mainnet to launch the first credit default swap on a centralized stablecoin—USDT. This is another example of how Chainlink oracles are powering innovative DeFi products.

Blockchain card game @EtherLegends will use Chainlink VRF to power their random distribution of NFT-backed end-of-season rewards. These rare items will be awarded to top players at the end of the ongoing season with verifiable proof of fair distribution.

We’re thrilled to award a grant to @ensdomains from the Chainlink Community Grant Program: https://blog.chain.link/introducing-the-chainlink-community-grant-program/…, supporting them in developing human-readable names for oracle contracts, making Chainlink easily accessible to even more smart contract devs

Featured Videos & Educational Pieces 🎥

DeFi is disrupting finance and crypto by moving beyond tokens and wallets into sophisticated smart contract applications that allow p2p lending, liquidity mining, synthetic assets, derivatives, and more. DeFi pioneers Sergey Nazarov (Chainlink), Andre Cronje (yEarn), Stani Kulechoiv (Aave), and Kain Warwick (Synthetix) come together to explain DeFi’s remarkable growth on Ethereum, expansion into new markets, and the impact of live infrastructure, especially decentralized oracles, on DeFi protocols.

Chainlink Labs Chief Scientist Dr. Ari Juels gives a #SmartCon Keynote explaining how DECO helps Chainlink oracles liberate more web data for smart contracts, including identity records, financial data, accredited investor confirmation, supply chain logistics, and more. Using DECO to make this data available on-chain allows blockchains to enhance many enterprise use cases today while still retaining the key property of data confidentiality.

Chainlink-powered decentralized oracles provide smart contracts with definitive truth about the validity of real-world data. In this Keynote, Sergey Nazarov explains how Chainlink is using its secure and reliable oracles to expand the addressable market of smart contract applications into the trillions, thanks to opening up blockchain applications in DeFi, CeFi, Fintech, Web 2.0, and enterprise systems. He also discusses Chainlink’s new DECO acquisition and how it opens up access to web data for smart contracts while preserving data security and confidentiality.

Other SmartCon talks now posted include:

Ecosystem & Community Celebrations 👏


Upcoming Events 📅


Are you interested in hosting your own meetup? Apply to become a Chainlink Community Advocate today: https://events.chain.link/advocate

Chainlink Labs is hiring to build Chainlink’s network: Check out these open roles 👩‍💼

View all open roles at https://careers.smartcontract.com
Are there other community content and celebrations that we missed? Post them in the comments below! ⤵️
submitted by linkedkeenan to Chainlink

A new whitepaper analysing the performance and scalability of the Streamr pub/sub messaging Network is now available. Take a look at some of the fascinating key results in this introductory blog


Streamr Network: Performance and Scalability Whitepaper


The Corea milestone of the Streamr Network went live in late 2019. Since then a few people in the team have been working on an academic whitepaper to describe its design principles, position it with respect to prior art, and prove certain properties it has. The paper is now ready, and it has been submitted to the IEEE Access journal for peer review. It is also now published on the new Papers section on the project website. In this blog, I’ll introduce the paper and explain its key results. All the figures presented in this post are from the paper.
The reasons for doing this research and writing this paper were simple: many prospective users of the Network, especially more serious ones such as enterprises, ask questions like ‘how does it scale?’, ‘why does it scale?’, ‘what is the latency in the network?’, and ‘how much bandwidth is consumed?’. While some answers could be provided before, the Network in its currently deployed form is still small-scale and can’t really show a track record of scalability for example, so there was clearly a need to produce some in-depth material about the structure of the Network and its performance at large, global scale. The paper answers these questions.
Another reason is that decentralized peer-to-peer networks have experienced a new renaissance due to the rise in blockchain networks. Peer-to-peer pub/sub networks were a hot research topic in the early 2000s, but not many real-world implementations were ever created. Today, most blockchain networks use methods from that era under the hood to disseminate block headers, transactions, and other events important for them to function. Other megatrends like IoT and social media are also creating demand for new kinds of scalable message transport layers.

The latency vs. bandwidth tradeoff

The current Streamr Network uses regular random graphs as stream topologies. ‘Regular’ here means that nodes connect to a fixed number of other nodes that publish or subscribe to the same stream, and ‘random’ means that those nodes are selected randomly.
Random connections can of course mean that absurd routes get formed occasionally, for example a data point might travel from Germany to France via the US. But random graphs have been studied extensively in the academic literature, and their properties are not nearly as bad as the above example sounds — such graphs are actually quite good! Data always takes multiple routes in the network, and only the fastest route counts. The less-than-optimal routes are there for redundancy, and redundancy is good, because it improves security and churn tolerance.
There is an important parameter called node degree, which is the fixed number of nodes to which each node in a topology connects. A higher node degree means more duplication and thus more bandwidth consumption for each node, but it also means that fast routes are more likely to form. It’s a tradeoff; better latency can be traded for worse bandwidth consumption. In the following section, we’ll go deeper into analyzing this relationship.

Network diameter scales logarithmically

One useful metric to estimate the behavior of latency is the network diameter, which is the number of hops on the shortest path between the most distant pair of nodes in the network (i.e. the "longest shortest path"). The below plot shows how the network diameter behaves depending on node degree and number of nodes.

Network diameter
We can see that the network diameter increases logarithmically (very slowly), and a higher node degree ‘flattens the curve’. This is a property of random regular graphs, and this is very good — growing from 10,000 nodes to 100,000 nodes only increases the diameter by a few hops! To analyse the effect of the node degree further, we can plot the maximum network diameter using various node degrees:
Network diameter in network of 100 000 nodes
We can see that there are diminishing returns for increasing the node degree. On the other hand, the penalty (number of duplicates, i.e. bandwidth consumption), increases linearly with node degree:

Number of duplicates received by the non-publisher nodes
In the Streamr Network, each stream forms its own separate overlay network and can even have a custom node degree. This allows the owner of the stream to configure their preferred latency/bandwidth balance (imagine such a slider control in the Streamr Core UI). However, finding a good default value is important. From this analysis, we can conclude that:
  • The logarithmic behavior of network diameter leads us to hope that latency might behave logarithmically too, but since the number of hops is not the same as latency (in milliseconds), the scalability needs to be confirmed in the real world (see next section).
  • A node degree of 4 yields good latency/bandwidth balance, and we have selected this as the default value in the Streamr Network. This value is also used in all the real-world experiments described in the next section.
It’s worth noting that in such a network, the bandwidth requirement for publishers is determined by the node degree and not the number of subscribers. With a node degree 4 and a million subscribers, the publisher only uploads 4 copies of a data point, and the million subscribing nodes share the work of distributing the message among themselves. In contrast, a centralized data broker would need to push out a million copies.
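The diameter claim is easy to reproduce; below is a small sketch using the networkx library to generate random regular topologies (with the same default degree of 4) and measure their diameter, with node counts kept modest so it runs quickly.

```python
# Sketch: the diameter of random d-regular graphs grows only logarithmically with size.
import networkx as nx

NODE_DEGREE = 4  # the default degree chosen for the Streamr Network

for n in [128, 512, 2048]:
    g = nx.random_regular_graph(NODE_DEGREE, n, seed=42)
    print(n, nx.diameter(g))  # grows by roughly one hop each time the network quadruples

# The bandwidth penalty grows linearly instead: each node keeps NODE_DEGREE connections,
# so a non-publisher node can receive up to (NODE_DEGREE - 1) duplicate copies of a message.
print("max duplicates per message:", NODE_DEGREE - 1)
```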

Latency scales logarithmically

To see if actual latency scales logarithmically in real-world conditions, we ran large numbers of nodes in 16 different Amazon AWS data centers around the world. We ran experiments with network sizes between 32 and 2048 nodes. Each node published messages to the network, and we measured how long it took for the other nodes to get the message. The experiment was repeated 10 times for each network size.
The below image displays one of the key results of the paper. It shows a CDF (cumulative distribution function) of the measured latencies across all experiments. The y-axis runs from 0 to 1, i.e. 0% to 100%.
CDF of message propagation delay
From this graph we can easily read things like: in a 32 nodes network (blue line), 50% of message deliveries happened within 150 ms globally, and all messages were delivered in around 250 ms. In the largest network of 2048 nodes (pink line), 99% of deliveries happened within 362 ms globally.
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! Decentralization comes with unquestionable benefits (no vendor lock-in, no trust required, network effects, etc.), but if such protocols are inferior in terms of performance or cost, they won’t get adopted. It’s pretty safe to say that the Streamr Network is on par with centralized services even when it comes to latency, which is usually the Achilles’ heel of P2P networks (think of how slow blockchains are!). And the Network will only get better with time.
Then we tackled the big question: does the latency behave logarithmically?
Mean message propagation delay in Amazon experiments
Above, the thick line is the average latency for each network size. From the graph, we can see that the latency grows logarithmically as the network size increases, which means excellent scalability.
The shaded area shows the difference between the best and worst average latencies in each repeat. Here we can see the element of chance at play; due to the randomness in which nodes become neighbours, some topologies are faster than others. Given enough repeats, some near-optimal topologies can be found. The difference between average topologies and the best topologies gives us a glimpse of how much room for optimisation there is, i.e. with a smarter-than-random topology construction, how much improvement is possible (while still staying in the realm of regular graphs)? Out of the observed topologies, the difference between the average and the best observed topology is between 5–13%, so not that much. Other subclasses of graphs, such as irregular graphs, trees, and so on, can of course unlock more room for improvement, but they are different beasts and come with their own disadvantages too.
It’s also worth asking: how much worse is the measured latency compared to the fastest possible latency, i.e. that of a direct connection? While having direct connections between a publisher and subscribers is definitely not scalable, secure, or often even feasible due to firewalls, NATs and such, it’s still worth asking what the latency penalty of peer-to-peer is.

Relative delay penalty in Amazon experiments
As you can see, this plot has the same shape as the previous one, but the y-axis is different. Here, we are showing the relative delay penalty (RDP). It’s the latency in the peer-to-peer network (shown in the previous plot), divided by the latency of a direct connection measured with the ping tool. So a direct connection equals an RDP value of 1, and the measured RDP in the peer-to-peer network is roughly between 2 and 3 in the observed topologies. It increases logarithmically with network size, just like absolute latency.
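In other words, the RDP is just a ratio. A minimal worked example with made-up latencies:

```python
# RDP = delivery latency over the P2P overlay / latency of a direct connection.
p2p_latency_ms = 75.0     # hypothetical measured overlay delivery latency
direct_latency_ms = 30.0  # hypothetical ping time between the same two hosts

rdp = p2p_latency_ms / direct_latency_ms
print(rdp)  # -> 2.5, in line with the 2-3 range observed in the experiments
```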
Again, given that latency is the Achilles’ heel of decentralized systems, that’s not bad at all. It shows that such a network delivers acceptable performance for the vast majority of use cases, only excluding the most latency-sensitive ones, such as online gaming or arbitrage trading. For most other use cases, it doesn’t matter whether it takes 25 or 75 milliseconds to deliver a data point.

Latency is predictable

It’s useful for a messaging system to have consistent and predictable latency. Imagine for example a smart traffic system, where cars can alert each other about dangers on the road. It would be pretty bad if, even minutes after publishing it, some cars still haven’t received the warning. However, such delays easily occur in peer-to-peer networks. Everyone in the crypto space has seen first-hand how plenty of Bitcoin or Ethereum nodes lag even minutes behind the latest chain state.
So we wanted to see whether it would be possible to estimate the latencies in the peer-to-peer network if the topology and the latencies between connected pairs of nodes are known. We applied Dijkstra’s algorithm to compute estimates for average latencies from the input topology data, and compared the estimates to the actual measured average latencies:
Mean message propagation delay in Amazon experiments
We can see that, at least in these experiments, the estimates seemed to provide a lower bound for the actual values, and the average estimation error was 3.5%. The measured value is higher than the estimated one because the estimation only considers network delays, while in reality there is also a little bit of a processing delay at each node.
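For readers who want to reproduce the idea, here is a minimal sketch of that estimation step, assuming the topology and measured per-link latencies are known. The networkx usage and the toy four-node topology are illustrative, not the paper's tooling:

```python
# Estimate message propagation delay from topology + per-link latencies
# using Dijkstra's algorithm (shortest weighted paths from the publisher).
import networkx as nx

def estimate_mean_delay(link_latencies_ms, publisher):
    """link_latencies_ms: dict mapping (node_a, node_b) -> measured link latency in ms."""
    g = nx.Graph()
    for (a, b), latency in link_latencies_ms.items():
        g.add_edge(a, b, weight=latency)
    delays = nx.single_source_dijkstra_path_length(g, publisher, weight="weight")
    others = [d for node, d in delays.items() if node != publisher]
    return sum(others) / len(others)

# Hypothetical 4-node topology purely for illustration:
links = {("A", "B"): 20.0, ("A", "C"): 35.0, ("B", "D"): 40.0, ("C", "D"): 25.0}
print(estimate_mean_delay(links, "A"))  # a lower bound; real delivery adds per-hop processing
```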

Conclusion

The research has shown that the Streamr Network can be expected to deliver messages in roughly 150–350 milliseconds worldwide, even at a large scale with thousands of nodes subscribing to a stream. This is on par with centralized message brokers today, showing that the decentralized and peer-to-peer approach is a viable alternative for all but the most latency-sensitive applications.
It’s thrilling to think that by accepting a latency only 2–3 times longer than the latency of an unscalable and insecure direct connection, applications can interconnect over an open fabric with global scalability, no single point of failure, no vendor lock-in, and no need to trust anyone — all that becomes available out of the box.
In the real-time data space, there are plenty of other aspects to explore, which we didn’t cover in this paper. For example, we did not measure throughput characteristics of network topologies. Different streams are independent, so clearly there’s scalability in the number of streams, and heavy streams can be partitioned, allowing each stream to scale too. Throughput is mainly limited, therefore, by the hardware and network connection used by the network nodes involved in a topology. Measuring the maximum throughput would basically be measuring the hardware as well as the performance of our implemented code. While interesting, this is not a high priority research target at this point in time. And thanks to the redundancy in the network, individual slow nodes do not slow down the whole topology; the data will arrive via faster nodes instead.
Also out of scope for this paper is analysing the costs of running such a network, including the OPEX for publishers and node operators. This is a topic of ongoing research, which we’re currently doing as part of designing the token incentive mechanisms of the Streamr Network, due to be implemented in a later milestone.
I hope that this blog has provided some insight into the fascinating results the team uncovered during this research. For a more in-depth look at the context of this work, and more detail about the research, we invite you to read the full paper.
If you have an interest in network performance and scalability from a developer or enterprise perspective, we will be hosting a talk about this research in the coming weeks, so keep an eye out for more details on the Streamr social media channels. In the meantime, feedback and comments are welcome. Please add a comment to this Reddit thread or email [[email protected]](mailto:[email protected]).
Originally published by Henri at blog.streamr.network on August 24, 2020.
submitted by thamilton5 to streamr

Announcing r/Avalanche_, a positive, well-moderated, community-run subreddit for open and cordial discussion of Avalanche (AVAX), whose mainnet launches on September 21st.

Avalanche_ is the name of the cryptocurrency subreddit I have launched, to give people learning about Avalanche, and Avalanche enthusiasts, a place to discuss Avalanche free from toxicity and negativity.
This is an "unofficial" subreddit, in the context of Avalanche this means it is not owned or controlled by AvaLabs, which is the developing force behind the Avalanche project.
I am already up past 80 members in just a few days of social networking with my peers. I have also written an article about the subreddit, its purpose and origin, and my moderation strategies and philosophies, which you may read if you'd like. It is posted in the subreddit.

What is Avalanche?

Avalanche is a cryptocurrency based on Team Rocket's whitepaper "From Snowflake to Avalanche". It is developed by AvaLabs, which is headed by the computer scientist Emin Gün Sirer and his cofounders, along with a large global team of developers, community managers, and marketing evangelists.
Avalanche is named after the Avalanche Protocol, which is a specific and new consensus algorithm detailed in the original whitepaper. In my own words, I would describe it as a gossip/pandemic algorithm for a third-generation blockchain, utilizing multiple rounds of locally-random peer sampling, polling, and pre-state consensus. Essentially, this allows for a very fast and efficient consensus algorithm with mathematically sound properties. Proof of Stake is used as the underlying anti-sybil mechanism.
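To make the sampling idea a bit more tangible, below is a heavily simplified, single-decision sketch of how such repeated polling can converge. The parameter names k (sample size), alpha (quorum) and beta (confidence threshold) follow the whitepaper's terminology as I understand it; everything else here (binary choices, honest peers, no networking) is a toy assumption of mine, not AvaLabs' code.

```python
# Toy Snowball-style loop: repeatedly poll k random peers; a value that reaches
# an alpha-quorum in beta consecutive polls is treated as the decided value.
import random

def snowball(peer_prefs, my_pref, k=10, alpha=7, beta=5):
    """peer_prefs: list of other nodes' current preferences (True/False)."""
    counts = {True: 0, False: 0}
    pref, last_winner, consecutive = my_pref, None, 0
    while consecutive < beta:
        sample = random.sample(peer_prefs, k)     # locally-random peer sampling
        winner = next((v for v in (True, False) if sample.count(v) >= alpha), None)
        if winner is None:
            consecutive = 0                       # no alpha-quorum in this poll
            continue
        counts[winner] += 1
        if counts[winner] > counts[pref]:
            pref = winner                         # adopt the stronger value
        consecutive = consecutive + 1 if winner == last_winner else 1
        last_winner = winner
    return pref                                   # decided ("finalized") value

# 80% of polled peers prefer True; a node starting at False quickly flips and decides.
print(snowball(peer_prefs=[True] * 80 + [False] * 20, my_pref=False))
```

In the real protocol this polling runs over the network and for many transactions at once; the sketch only shows the core sampling loop.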
Some properties that you may find interesting:
1) Anyone can run a node. On your laptop or on your Raspberry Pi, it will all work with very minimal hardware requirements that basically all household computers meet.
2) Even if everyone runs these lightweight nodes, the Avalanche network has been shown to process over 4,500 TPS. This is without high-grade hardware and also without sharding.
3) Transaction finality is under a second, with few exceptions that never take longer than about three seconds. By transaction finality I mean there is practically a 0% chance your transaction will be reversed, comparable to around 6 Bitcoin confirmations.
4) All validating nodes take part in the block-production process, and basically anyone can become a validator. There is no distinction (that I know of) between validating as a full node and validating as a block producer. Both are just validating and can influence the network.
5) The minimum amount of AVAX you need to validate is 2000, but this is parameterized, which means that the validators are periodically able to vote on-chain to incrementally change it. There are other parameterized constants, including the cost of transaction fees.
Avalanche (AVAX) is also a robust and versatile smart contract chain that is fully equipped with its own virtual machine as well as the Ethereum Virtual Machine. All Ethereum applications can be ported over to Avalanche with no developer downtime.
Avalanche (AVAX) also has the unique and interesting property of inherent cross-chain same-asset atomic swaps, allowing for the creation of subnetworks with custom virtual machines, custom economics, and custom anti-sybil mechanisms: essentially the full scope of a custom blockchain utilizing the Avalanche consensus engine, fully interoperable with Avalanche's default chains. You can launch a subnetwork that still uses the Avalanche token, or you can launch one with its own token; the possibilities are very open-ended.
Avalanche (AVAX) also has an interesting architecture: it is something of a hybrid between a blockchain and a DAG. This one is hard for me to explain, so I will explain what I do understand. Avalanche is actually made of three default blockchains/networks, all synergistically working together and supporting the same native AVAX token: the X-Chain (a DAG-based chain for creating and exchanging assets), the C-Chain (an EVM-compatible smart contract chain), and the P-Chain (the platform chain that coordinates validators, staking, and subnetworks).
Finally, a characteristic of AVAX that I absolutely love is how deflationary it is. It's the most deflationary cryptocurrency I have ever heard of. It has a maximum supply cap like Bitcoin (720M max), all fees are burned like in Ethereum's EIP-1559, and there is incentivized staking that is open to basically everybody. I say the staking is incentivized because you need to stake and validate in order to run a subnetwork, which can offer rewards (or serve a business goal).
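To illustrate (not to model AVAX's actual economics), here is a tiny back-of-the-envelope sketch of how a hard cap plus fee burning interact. The 720M cap comes from the text above; every other number is a made-up assumption:

```python
# Toy supply model: issuance from staking is capped by the maximum supply,
# while burned transaction fees permanently shrink the circulating supply.
MAX_SUPPLY = 720_000_000

def next_year_supply(current_supply, staking_issuance, fees_burned):
    """New issuance can never push supply past the cap; burned fees reduce supply."""
    issued = min(staking_issuance, MAX_SUPPLY - current_supply)
    return current_supply + issued - fees_burned

supply = 360_000_000  # hypothetical circulating supply
for year in range(1, 4):
    supply = next_year_supply(supply, staking_issuance=20_000_000, fees_burned=5_000_000)
    print(year, f"{supply:,.0f}")
```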

What is Avalanche_?

Avalanche_ is the cryptocurrency subreddit I am launching which allows for clean and open discussion of Avalanche as well as other cryptocurrencies. This subreddit focuses primarily on offering a moderated space to discuss Avalanche, where personal attacks are not allowed. Freedom of belief and opinion is protected, but anything that hurts people or is damaging to the culture of openness and positivity is prohibited and moderated. 99% of all moderation is performed by AutoModerator, and the other 1% is done by me and anyone else who may later be added to the team.
If you are wondering "Okay, but why you?", I cannot give an adequate response besides "Because I care". I would point out that, out of the 3000+ people participating in "Avalanche Hub" (AvaLabs' official incentivized community participation and marketing platform), I am #5 in terms of influence and community contributions, but I suppose this is a bit anecdotal. I have written many articles in support of AVAX though, and this is a bit more material. I may share them below.
I have gotten a bit sophisticated with the automoderator tool, and I have done this because I want to protect noobs from toxicity and fighting. Essentially what I have done is 5 things, and together it's created a toxicity-neutralizing mechanism that's held up so far:
1) An initial vetting process that requires all accounts to either have positive comment karma and be a week old, have 50 post karma, or have 25 comment karma. This is a relaxed anti-spam measure. Currently parts of this are turned off, but they will be re-enabled in a few days. I did not make this measure too stringent because I want it to be easy for noobs to join and participate.
2) A well researched and extensive blacklist of indisputably toxic words, phrases, and domains that trigger instant removal.
3) A well researched and extensive greylist of fairly toxic words, phrases, and domains that trigger removal if the poster lacks a certain amount of post karma, comment karma, or account age, or if the post is reported.
4) A hierarchical and tiered report threshold where non-greylisted comments get removed after X number of reports, based on both post karma and comment karma. The lower of the two acts as the weakest link (see the sketch after this list).
5) An anti-doxxing protection algorithm which prevents credit card numbers, phone numbers, emails, and other personal details from being posted. I did not create this myself, but I have been modifying it to fit the group.
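As referenced in point 4, here is a plain-Python illustration of the tiered report-threshold logic. It is not AutoModerator syntax, and the tiers are made-up examples; the key point is that the lower of post karma and comment karma picks the tier:

```python
# Toy illustration of a tiered report threshold keyed on the weaker karma score.
REPORT_TIERS = [      # (minimum karma for this tier, reports needed before removal)
    (0, 1),
    (100, 2),
    (1_000, 3),
]

def reports_needed(post_karma: int, comment_karma: int) -> int:
    effective_karma = min(post_karma, comment_karma)  # the "weakest link"
    needed = REPORT_TIERS[0][1]
    for min_karma, reports in REPORT_TIERS:
        if effective_karma >= min_karma:
            needed = reports
    return needed

print(reports_needed(post_karma=5_000, comment_karma=50))  # -> 1: low comment karma dominates
```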
I probably will not have to do much manual moderation aside from updating automoderator to keep it up to date with the lingo of people with bad intentions. However, I am on Reddit every day and I will be here to protect people from community attackers if it's ever needed.
It is my belief that culture is absolutely crucial to the health of a project, and for this reason I am using both automated moderation and community moderation to curtail outsider attacks and infighting alike. As soon as the subreddit becomes a bit more active, it will make it easier for me to ask around the community for moderators.

Welcome to Avalanche

All are welcome in the Avalanche community. All I ask is that you don't intentionally and maliciously try to make others feel unwelcome. Avalanche is a global, decentralized cryptocurrency and open-source software for all to build on and utilize.
I am not a gatekeeper, but merely a member of the community who seeks to provide a positive and friendly platform for redditors to network on. Links to relevant and official Avalanche community resources are in the "About" section of Avalanche_.
I hope to see many of you there! Let's change the world with this new, globally-scalable, deflationary, interoperable technology.
Build The Internet of Finance.
submitted by Jstodd_ to CryptoCurrencies

Related videos:
  • Bitcoin is forking: educate and prepare - Lecture by ...
  • Mike Hearn, Bitcoin Core Developer, About Smart Contracts And Smart Property
  • Bitcoin with a stable protocol takes away power (Craig Wright AKA Satoshi Nakamoto)
  • SF Scala: Chris Stewart, Bitcoin in Scala
  • The Bitcoin Network

A bitcoin transaction looks less like a transfer of property and more like an assignment of rights. Specifically, a bitcoin transaction is a digital assignment of the right to receive any benefits incidental to ownership or control of the transferred bitcoins. The most obvious example is the right to transfer those bitcoins to another party. But it also includes the right to receive benefits ...

But bitcoin, which launched in January 2009, was the first real-world application of the technology and perhaps its most well-known. That’s why blockchain and bitcoin are often spoken of in the same breath. In a nutshell, the bitcoin protocol is built on the blockchain. In a research paper introducing the digital currency, bitcoin’s ...

It is tempting instead to take Bitcoin as given, and to engage in speculation about how to get rich with Bitcoin, whether Bitcoin is a bubble, whether Bitcoin might one day mean the end of taxation, and so on. That’s fun, but severely limits your understanding. Understanding the details of the Bitcoin protocol opens up otherwise inaccessible vistas. In particular, it’s the basis for ...

Smart property is property whose ownership is controlled via the Bitcoin block chain, using contracts. Examples could include physical property such as cars, phones or houses. Smart property also includes non-physical property like shares in a company or access rights to a remote computer. Making property smart allows it to be traded with radically less trust. This reduces fraud, mediation ...

Digital money that’s instant, private, and free from bank fees. Download our official wallet app and start using Bitcoin today. Read news, start mining, and buy BTC or BCH.


Bitcoin is forking: educate and prepare - Lecture by ...

  • Bitcoin Transaction Details - Part 1 (djp3, 15:47)
  • GOTO 2015 • One Hacker Way • Erik Meijer (GOTO Conferences, 55:37)
  • Ben Swann ...
  • Bitcoin with a stable protocol takes away power (Craig Wright AKA Satoshi Nakamoto)
  • The Bitcoin Network - Bitcoin and Cryptocurrency Technologies Part 3 - Mechanics of Bitcoin: Learn how the individual components of the Bitcoin protocol make the whole system tick: transactions ...
  • Scale By the Bay 2019 is held on November 13-15 in sunny Oakland, California, on the shores of Lake Merritt: https://scale.bythebay.io. Join us! ----- Bitcoin-s is an implementation of the bitcoin ...
  • A look at the details of the transactions... it gets a little technical, put on your seatbelts. This video is part of a larger online course, "From Barter to Bitcoin: Society, Technology and the ...
