
06-30 18:32 - 'Bitcoin isn't fighting to do anything. Core knows 2x is easy as apple pie, it just doesn't help the side chain business model.' by /u/In_the_cave_mining removed from /r/Bitcoin within 77-87min

'''
Bitcoin isn't fighting to do anything. Core knows 2x is easy as apple pie, it just doesn't help the side chain business model.
'''
Context Link
Go1dfish undelete link
unreddit undelete link
Author: In_the_cave_mining
submitted by removalbot to removalbot [link] [comments]

Hypothetical question: in the event of a large world disrupting event (war)...

Hypothetically, in the event of a large war, the undersea cables that connect the world to the internet could be severed. Satellite internet could be disabled through further wartime acts. We could end up with several discrete, independent internets that are not only separated by great firewalls, but also physically separated from each other.
It's my understanding that the block chain would essentially continue in each internet pocket unhindered, but not synchronized.
So what happens if, after a prolonged period of isolation, let's say 5 years, the networks become reconnected?
Would the two block chains "merge"?
If they merge, how would potentially conflicting transactions be handled? Double spending? Would one block chain fully overwrite the other, effectively nullifying 5 years' worth of transactions? Or would the possible overwrite only affect specific conflicting transactions?
Or would they stay as separate discrete chains at that point?
Would that mean someone who has keys today would then have keys for multiple sets of bitcoins on multiple block chains with the same keys? (so if I had 1btc today, after the "war" would I then have 1btc [Americas] 1btc [Asia] and 1btc [Europe])
submitted by StupidAuthentication to Bitcoin [link] [comments]

Why I’m Bullish on Yield Farming Ahead of the Eth 2.0 Launch

Hello everyone! I noticed that the hype around yield farming and DEX protocols kinda died down and that people focus more on NFTs and artwork-based projects like Rarible. I figured it would be great to (briefly) explain why yield farming lost its popularity and why it will make a comeback ahead of the ETH 2.0 launch.
If you’re not new here, you know how the DeFi market evolved in the past months. We had a surge of yield farming (liquidity providing) platforms that were hyped at the very beginning but lost a majority of their users real fast, sometimes only days after launching.
I believe that most people were disappointed by this sort of mini speculative bubble and the fact that most projects had devs who rug pulled. Combined with the fact that Ethereum had high network congestion at several points in September and October, traders simply decided to prevent further losses and leave this niche LP space once and for all.
Don’t get me wrong, there are still plenty of yield farming projects that people use and it’s not like people stopped token swapping on Uniswap or anything. Ethereum also calmed down a lot now and the average transaction costs only like what, 80 gwei? But still, I think that people are pretty much aware that if another hype cycle started, the very same pattern would repeat again.
My take on this is that yield farming will regain its popularity in December, around the time Ethereum 2.0 launches its first phase and a lot of scaling solutions like optimistic rollups launch. If everything runs smoothly, we should have the building blocks for resuming the DeFi bull run and turning yield farming stable, rewarding, and popular once more.
Sure, Ethereum is only launching a small network upgrade that will run side-by-side with the original network, so we won’t see any technical changes anytime soon. But I really believe that ETH 2.0, along with other scaling solutions, will bring back trust and show that there is indeed a bright future for blockchain-based technology ahead of us. And in that future, Proof-of-Stake and liquidity providing will be the modern mining equivalent of running a Bitcoin farm in 2011.
One thing that I’m worried about is that enthusiasts, traders, and investors will still fall for the same projects that promise too much and deliver little. We saw numerous projects that were regarded as reputable in the beginning collapse within a week, like SushiSwap. But at the same time, my line of thinking is that projects that focus on development and spend minimal time on marketing will surface to the top in the end.
For example, while everyone was using Uniswap to swap tokens and provide liquidity, I was doing the same exact thing but cheaper on Anyswap. It is kinda funny since people boast that they earned $1200 through the UNI airdrop but I know for a fact that they spent way more on fees. And guess what? I didn’t even break a $100 threshold in the last three months while using Anyswap. I’m not trying to bash Uniswap here, but all I’m saying is that we already have scalable solutions now but people are too scared to introduce new changes in their lives.
I’m not here to market you anything. I just want to show you that even today, in October 2020, you can discover scalable and rewarding projects that simply work. Find any developer team that works all the time and doesn’t have the time to brag and you’ll know you’re on the right road! Last time I checked, the Anyswap team revealed that the average APY return for their yield farming pools ranges between 100% and 900%. When I asked my crypto friends if they knew about this, I found that none of them had even heard of Anyswap.
DYOR and find out about the project on your own. I promise that reading about Anyswap and the blockchain it’s based on (Fusion) will be worth the time.
submitted by cryptomir to CryptoCurrency [link] [comments]

The list of best coins (in my humble opinion)

*This is not financial advice or suggestion. Just my opinion*
Legend:
"S" - super
"A" - really good
"B" - good
"C" - has potential
"D" - keeping an eye on it
"E" - coins to gamble on

Digibyte [DGB]: "S"
I mentioned this coin a few times already. It's because DGB is a true successor of Satoshi's philosophy. It's the purest coin in the market. DGB is the "people's money".

Dash [DASH]: "S"
DAO and masternodes are the future. Satoshi had a vision of altruism. But we cannot expect people to be altruists and lend their infrastructure for the wellbeing of others. The community is just not strong enough to do so. Masternodes are a merit-focused system that rewards those who are willing to lend their infrastructure to be a node in the network. It's a win-win situation for the network and the node owner. Besides acting as a node, masternodes allowed the development of some other features like optional privacy and instant payments.

Monero [XMR]: "S"
When we think about cash, one of its best features that come to mind is privacy. Monero is probably the most famous privacy coin. Transactions are private by default. Another great thing that Monero is taking care of is the prevention of mining centralization. Being able to mine a coin with a CPU is probably one of the main concepts we forgot when it comes to allowing every person to participate in the network.

Vechain [VET]: "A"
If you think about the use-cases of blockchain, you cannot forget how impactful it will be for supply chains. So far, Vechain is one of the best solutions. It's also the most adopted for now.

Nexus [NXS]: "A"
NXS is a coin that deserves to be in the "S" category. But there's still a long way to go for it to achieve that rank. It's a forward-thinking project. They understood how far decentralization has to go to achieve the real meaning of the word. They even thought of the quantum computer problem. A fast database, satellites, quantum resistance, decentralized internet, and user-friendliness are just a few keywords they focus on while developing the coin.

Bitcoin [BTC]: "A"
I'm somehow ashamed to put Bitcoin this low. But let me explain why I did so, while still keeping it in my top list. First of all, I have to say: "Thank you Satoshi!". Bitcoin got this low on my list because I have a feeling too many powerful people got their hands on it. Some got in for the right reasons, while others are not so benevolent. Bitcoin is not "people's money" anymore. IMO (very very humble opinion), Bitcoin was a demo project. A very successful demo project. Satoshi gave us open-source code as a gift to do with it whatever we want. Blockchain is the gift he gave us, not Bitcoin. And we (the community) ran with it. Bitcoin became a brand. More people have heard the word "Bitcoin" than "cryptocurrency". On the bright side, Bitcoin is the biggest network in the world. As long as that's true, hodling some is a good idea.

Litecoin [LTC]: "B"
At the time, not many understood what Bitcoin was and what potential blockchains as a technology have. Imagine how forward-thinking Mr. Charlie Lee was. He created the first altcoin. Technology-wise, LTC is a different coin. Mr. Lee didn't just copy-paste the code and name it differently. In my eyes, LTC will always be the "crypto silver", making it a good store of value and medium of exchange.

Chainlink [LINK]: "B"
I believe the solution they are going to provide is too important for the crypto space to ignore. Oracles are the future, but until we see a real use-case, it will remain listed as "B". Another reason it doesn't rank higher on the list is that it's an Eth token.

Dogecoin [DOGE]: "C"
When you think about content creation, you'll see it's highly centralized. Creators depend on the platforms' policies and the bread crumbs those platforms leave them even after people click on ads. One of the solutions to reward good creators is to make a fast and easy-to-use tipping system. The first thing that crosses your mind is probably tokens. But imagine a blockchain of its own that enables fast and cheap transactions. Yes, DGB is the way to go. But there is also room for a coin with higher inflation which you don't want to hold for a long time, but rather spend to reward others' work that helped you in some way, or that you enjoy reading or watching. Dogecoin has the potential of becoming the chosen one for this exact purpose.

Verge [XVG]: "C"
When Wikileaks added BTC as a donation medium, Satoshi politely asked them to remove it because we were poking the hornet's nest. I don't remember his exact words, but this was the context. A similar thing happened to Verge. It was like the flight of Icarus. Pornhub listed it as an optional payment method, drawing a lot of attention to it. Verge was not mature enough for that kind of exposure. After that, it suffered an attack, and people gave up on it. But if you look closely at the technology behind it, you'll see it's a really good coin. It offers privacy differently than Monero does. If you haven't already, I strongly encourage you to read about Verge's tech. You'll be amazed.

"D" coins:
Polkadot [DOT]
Ethereum [ETH]
Electroneum [ETN]
Cardano [ADA]
Siacoin [SC]

"E" coins:
Theta [THETA]
Zilliqa [ZIL]
Decred [DCR]
Golem [GNT]
Enjin [ENJ]
Zcoin [XZC]
Energi [NRG]

Thank you Satoshi!
submitted by BlueBloodStrawberry to SatoshisPhilosophy [link] [comments]

Why is Osana taking so long? (A programmer's point of view on the current situation)

I decided to write a comment somewhere about «Why is Osana taking so long?» and what can be done to shorten this time. It turned into a long essay. Here's the TL;DR of it:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear.
Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to do it nonetheless. I am not much interested in Yandere Simulator nor in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers who want to make sure they never end up in Alex's situation, especially considering that he is definitely not the first one to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last one.
On the one hand, people see that Alex works incredibly slowly, the equivalent of, like, one hour per day, comparing it with, say, Papers, Please, the game that was developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely thinks that he works until complete exhaustion each day. In fact, I highly suspect that both those sentences are correct! Because of the mistakes made during early development stages, which are highly unlikely to be fixed due to the pressure put on the developer right now and due to his overall approach to coding, the cost to add any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low hanging fruit» crippled the development and, more importantly, what would have been an ideal course of action, from my point of view, to get out of it. I'll try to explain things in the easiest terms possible.
  1. else if's and the lack of any sort of refactoring in general
The most «memey» one. I won't talk about the performance though (a switch statement is not better in terms of performance, that is a myth. If the compiler detects some code that can be turned into a jump table, for example, it will do it, no matter if it is a chain of if's or a switch statement. Compilers nowadays are way smarter than one might think). Just take a look here. I know that it's his older JavaScript code, but, believe it or not, this piece is still present in the C# version relatively untouched.
I refactored this code for you using the C language (mixed with C++, since there's no this pointer in pure C). Note that the else if's are still there; else if's are not the problem by themselves.
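For illustration only, here is a minimal C# sketch of the flag-based idea described above (hypothetical names and reactions, not the author's actual refactor and not the game's leaked code):

```csharp
using System;

// Each witnessed event sets one bit; a single reaction routine then handles any
// combination of events instead of needing a dedicated else-if branch per case.
[Flags]
enum WitnessedEvent
{
    None        = 0,
    Murder      = 1 << 0,
    Trespassing = 1 << 1,
    Blood       = 1 << 2,
    WeaponDrawn = 1 << 3,
}

class StudentWitness
{
    WitnessedEvent witnessed = WitnessedEvent.None;

    public void Notice(WitnessedEvent e) => witnessed |= e;

    public string React()
    {
        // One ordered decision point instead of one branch per combination:
        if (witnessed.HasFlag(WitnessedEvent.Murder))      return "RunAndCallPolice";
        if (witnessed.HasFlag(WitnessedEvent.WeaponDrawn)) return "Flee";
        if (witnessed != WitnessedEvent.None)              return "BecomeSuspicious";
        return "Idle";
    }
}
```

The point is not the specific reactions, but that a new combination (say, Trespassing plus Blood) needs no new branch at all.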
The refactored code is just objectively better for one simple reason: it is shorter, while not being obscure, and now it should be able to handle, say, the Trespassing and Blood case without any input from the developer, thanks to the usage of flags. Basically, the shorter your code, the more you can see on screen without spreading your attention too thin. As a rule of thumb, the fewer lines there are, the easier it is for you to work with the code. Just don't overdo it, unless you are going to participate in the International Obfuscated C Code Contest. Let me reiterate:
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
Antoine de Saint-Exupéry
This is why refactoring — the activity of rewriting your old code so it does the same thing, but does it quicker, in a more generic way, in fewer lines, or more simply — is so powerful. In my experience, you can only keep one module/class/whatever in your brain if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17000-line-long class into smaller classes probably won't improve performance at all, but it will make working with parts of this class way easier.
Is it too late now to start refactoring? Of course NO: better late than never.
  2. Comments
If you think that because you wrote this code you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere, but just a general idea will help you out in the future. Even if you think that It Just Works™ and you'll never ever need to fix it. The time spent writing and debugging one line of code almost always exceeds the time needed to write one comment in large-scale projects. Moreover, the best code is the code that is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it in a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the real code and place its value into the method invocation directly. Such constants are there for you.
I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero knowledge of programming will figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of time figuring out «what was I thinking back then» one day.
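As a purely illustrative C# sketch (the constant name and value are made up, not taken from the game), this is the kind of change being described: a bare magic number becomes a named constant with a one-line comment explaining intent.

```csharp
public class ReputationPenalty
{
    // Reputation lost per student who witnesses suspicious behaviour.
    // Tune it in one place instead of hunting for a bare "6" across the codebase;
    // the compiler inlines the value, so there is zero runtime cost.
    const float PenaltyPerWitness = 6f;

    public static float Penalty(int witnessCount) => witnessCount * PenaltyPerWitness;
}
```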
Is it too late now to start adding comments? Again, of course NO. Don't be lazy and redirect all your typing from the «debunk» page (which pretty much does the opposite of debunking, but who am I to judge you here?) into some useful comments.
  3. Unit testing
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced right now? Is it a problem in your older code which has shown up just because you have never actually used it until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how much easier it would be if you had some routines which automatically execute after each build and check that the environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't be able to catch all your bugs, but even getting 20% of bugs identified at an earlier stage is a huge boon to development speed.
Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best if it covers the majority of a project's code. On the other hand, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you prove to yourself that you have not broken anything without the need to run the game at all.
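A minimal sketch of what such a test looks like, using NUnit (the framework the Unity Test Runner is built on); the class under test is the hypothetical ReputationPenalty example from above, not real game code:

```csharp
using NUnit.Framework;

[TestFixture]
public class ReputationPenaltyTests
{
    [Test]
    public void NoWitnesses_MeansNoPenalty()
    {
        Assert.AreEqual(0f, ReputationPenalty.Penalty(0));
    }

    [Test]
    public void PenaltyScalesLinearlyWithWitnessCount()
    {
        Assert.AreEqual(2 * ReputationPenalty.Penalty(1), ReputationPenalty.Penalty(2));
    }
}
```

Tests like these run in seconds after every build, so a regression in this logic is caught immediately instead of surfacing during a manual play-through.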
  4. Static code analysis
This is basically pretty self-explanatory. You set this thing up once, you forget about it. A static code analyzer is another piece of «free real estate» that speeds up the development process by finding tiny little errors, mostly silly typos (do you think that you are good enough at finding them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool which will help you out with debugging a little bit, along with the debugger, unit tests and other things. You need every little bit of help here.
Is it too late now to hook up static code analyzer? Obviously NO.
  5. Code architecture
Say, you want to build Osana, but then you decide to implement some feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have essentially done is complicate your life, because now you should also write Osana code for Snap Mode. The way the game architecture is done right now, easter-egg code is deeply interleaved with game logic, which leads to code «spaghettifying», which in turn slows down the addition of new features, because one has to consider how this feature would work alongside each and every old feature and easter egg. Even if it is just gazing over one line per easter egg, it adds up to the mess, slowly but surely.
A lot of people mention that the developer should have been doing it in an object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you do it the object-oriented way or the usual procedural way; you can theoretically write, say, AI routines in a functional language (e.g. LISP) or even a logic language if you are brave enough (e.g. Prolog). You can even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way of adding a new feature without interfering with your older code (e.g. by creating a child class which will encapsulate all the things you need)? Go for it, this feature is basically «free» for you. Otherwise you'd better think twice before doing this, because you are going into «technical debt» territory, borrowing your time from the future by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down in the future that much, right?». Technical debt will incur interest of its own that you'll have to pay. Basically, the entire situation around Osana right now is just a huge tale about how the «interest» incurred by technical debt can control the entire project, like the tail wagging the dog.
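A minimal C# sketch of the «child class» escape hatch mentioned above (hypothetical classes, not the game's real Student class): the easter-egg behaviour lives entirely in its own subclass, so adding or removing it never touches the core logic and never requires shotgun surgery across the codebase.

```csharp
public class Student
{
    public virtual void UpdateBehaviour()
    {
        // Core routine only: schedule, pathfinding, reactions, and so on.
    }
}

public class SnapModeStudent : Student
{
    public override void UpdateBehaviour()
    {
        base.UpdateBehaviour();   // the base game logic stays untouched
        ApplySnapModeQuirks();    // everything Snap-Mode-specific is contained here
    }

    void ApplySnapModeQuirks()
    {
        // Easter-egg-only behaviour; deleting this class removes the feature cleanly.
    }
}
```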
I won't elaborate here further, since it'll take me an even larger post to fully describe what's wrong about Yandere Simulator's code architecture.
Is it too late to rebuild the code architecture? Sadly, YES, although it should be possible to split the Student class into descendants by using hooks for individual students. However, the code architecture can be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code, and this will make it easier for you to add the «real» features, like Osana or whatever you'd like to accomplish. If you ever want them back, you can track them down in Git history and re-implement them one by one, hopefully without performing the shotgun surgery this time.
  6. Loading times
Again, I won't be talking about the performance, since you can debug your game at 20 FPS as well as at 60 FPS, but this is a very different story. Yandere Simulator is huge. Once you've fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
  1. Fix the code (unavoidable time loss)
  2. Rebuild the project (can take a loooong time)
  3. Load your game (can take a loooong time)
  4. Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix that. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not either move it to the project build stage by adding a build hook so Unity does it for you during a full project rebuild, or, even better, disable it completely or replace the photos with «PLACEHOLDER» text in debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community.
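A minimal C# sketch of the «skip it in debug builds» idea (hypothetical method names, not the game's real loading code); the exact preprocessor symbols depend on the project setup, but Unity defines UNITY_EDITOR and DEVELOPMENT_BUILD for exactly this kind of branching:

```csharp
public static class StudentPortraits
{
    public static void Prepare()
    {
#if UNITY_EDITOR || DEVELOPMENT_BUILD
        // Development: skip the expensive portrait rendering entirely and use a stub,
        // so the edit-build-test loop stays fast.
        UseLabelPortraits("PLACEHOLDER");
#else
        // Release: do the full, slow portrait generation as before.
        RenderAllStudentPortraits();
#endif
    }

    static void UseLabelPortraits(string label)  { /* assign one shared dummy texture */ }
    static void RenderAllStudentPortraits()      { /* the existing slow path */ }
}
```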
Is it too late to reduce loading times? Hell NO.
  7. Jenkins
Or any other continuous integration tool. «Rebuilding the project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a cool motherboard which supports all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest is not necessary, e.g. a barely functional second-hand video card burned out by bitcoin mining is fine. You set up this second PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From now on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera.
In general, continuous integration is a great technology for quickly tracking down errors that were introduced in previous versions, helping you avoid those kinds of bug hunting sessions. I am highly unsure if continuous integration is needed for projects that are 10000-20000 source lines long, but things can be different as soon as we step into the 100k+ territory, and Yandere Simulator by now has approximately 150k+ source lines of code. I think continuous integration might well be worth it for Yandere Simulator.
Is it too late to add continuous integration? NO, albeit it is going to take some time and skills to set up.
  8. Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is very similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord and who, for example, also once told somebody to kill himself, just like… However, horrible person or not, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
  9. Go outside
Enough said. Your brain works slower if you only think about games and if you can't provide it with enough oxygen supply. I know that this one is probably the hardest to implement, but…
That's all, folks.
Bonus: Can you imagine how short this list would have been if someone had just listened to Mike Zaimont instead of breaking down in tears?
submitted by Dezhitse to Osana [link] [comments]

"Is there going to be a split?"

This question is (understandably) asked all the time by people coming to the BCHN telegram for clarity on what has become a pretty unclear situation. It's not an easy question to answer, though, so I wanted to make this post that people could refer to. Here's my take on answering that question:
It's impossible to know 100% for sure if there will be a split, because it's impossible to know for sure how all the relevant parties will behave. For there to be a split, two node implementations need to irreconcilably disagree on a consensus-related issue, both need to release code with those conflicting consensus rules, and then miners need to also be split enough on the issue for both chains to have enough hashpower to be viable.
The narrative that there is ALMOST DEFINITELY going to be a split seems to have been dumped on us out of nowhere a month or so ago, along with what I consider to be increasingly unreasonable behavior from Bitcoin ABC. The tin foil hat in my heart finds that suspicious. (And between the IFP and the recent Grasberg proposal, it almost seems like ABC is doing its best to propose things that make a split as likely as possible.) This political baggage doesn't really matter though, in the grand scheme of things.
What matters is that BCHN is, as far as I can tell, just trying to be a reasonable and professional node implementation for Bitcoin Cash. If sticking to those principles leads to a divergence from ABC on a consensus-related issue, whatever issue that actually ends up being, then that's how it will be. And it wouldn't be the case that "BCHN split the chain", nor would it be the case that "ABC split the chain". It would just be the case that two groups released node implementations with different consensus rules from each other. (And then, if a non-negligible amount of hashpower mines using both clients, there would indeed be a chain split caused by that situation.)
Keeping BCH unified is obviously a HUGE priority for BCHN. Their initial release of what was effectively a non-IFP version of Bitcoin ABC was even designed so that, if the IFP activated with a majority of the hashpower, miners mining with BCHN would follow that longer chain, instead of rejecting IFP blocks as "invalid" or anything like that. This is in stark contrast to the narrative I've seen flooding this sub recently with claims that BCHN tried to split the chain this past upgrade, and is trying to split the chain again this November. So please take the time to consider which sides of these inevitable disagreements are being reasonable, and which are not. Make sure to fact check and ask for sources for any claims you see being made that you can't verify or debunk on your own. And remember that, while chain splits are messy, unfortunate things (at least in the short term), if there's an irreconcilable disagreement, it's definitely better for those parties to go their separate ways. I hope that that doesn't have to happen anytime soon for BCH.
submitted by AD1AD to btc [link] [comments]

I’m a commentator for a tournament of nightmares. I’m not sure the participants are willing.

You’d think being in a psychiatric ward for 38 months would be enough to deter a guy from ever going back to a sport that involves watching human beings at the height of their physical prowess beat the living shit out of each other. Sometimes regulated, sometimes not.
But, here I am, fresh outta the loony bin and reading the most unusual advertising slogan I’d ever laid eyes on;
“The most terrifying tournament has come around once again! Conquer your fears in the NFC… literally.”
This was the business card that accompanied my black envelope as it was handed to me on the discharge ward by a well dressed and gangly fella with an uncomfortable wide smile. He didn’t say much of anything, just that his name was “Watson” before bowing and holding up the envelope.
“Heh, like the butler, right?” I said, taking the envelope from his plasticine hands. His smile rippled across his face and he nodded slowly, his perfect hair unmoving in the strong wind, before he turned on his heel and walked back to the black sedan.
The cold air chilled my bones, and I pulled the medical bracelet from my wrist, grimacing at the marks underneath, before following Watson to the sedan and hauling my luggage into the trunk as we set off, not knowing how I came to even be there in the first place.
I guess right now, that doesn’t really matter.
What matters is where I am now and what I’m doing.
"blood strewn across the canvas, frayed brain matter sailing across my head and splattering against the wall, a woman standing in a pool of blood as the deformed creature twitches on the ground"
My name is Sal “Motormouth” Sabotta, I’m a sports commentator by trade. Be it combat sports, pro wrestling, death-matches or martial arts tournaments, I’ve done it all.
I won’t lie; Work can be hard to come by. I’ve spent months struggling for rent and resorting to less tried-and-true commentary methods in order to survive. That has, at times, involved trying my hand at some of the more underground competitions; unregulated fights, sick, illegal games bet on by people on the dark web and worse… Things I’m not going to detail here. Things I’m not proud to have taken a hefty pay-check for from greasy, sweaty fucks in Armani tracksuits and stinking of cheap booze and coke all the way up to well-dressed bitcoin farmers in their 20s who probably own child slaves.
In short, I’m no stranger to the grim underworld or the secrecies with which they conduct their work. I see money and an easy way to make it with my voice; I don’t ask questions.
So when I received an email the day of my discharge from the hospital and I’m told “you’ll receive a letter from Mr. Watson, take it and follow the instructions to the venue. Pay up front as agreed.”, I don’t question it. Especially when the note is personalised, and the doctor says my medical fees were covered.
We drove past numerous landscapes, vistas and neighbourhoods before veering off into an industrial estate and entering an underground tunnel. Half a mile in, Watson stops the car and peers back, smiling.
He directs a thumb to the service door in the side tunnel and rubs his neck, a scar running from ear to ear. Was he a former fighter? Gangster?
I sighed and got out, still in my medical gown, and hauled ass to the door. It opened before I could reach out and a tall, muscular woman in her late 30s greeted me with a smile. She was imposing, powerful in her gait; a black eyepatch with several seals adorning the sides, accompanied by a thick scar down her face, did nothing to diminish her beauty. She wore a tank top with a black cloak with white fur on the tops and sleeves, a thick black chain clasp around the neck. I won’t lie; she looked badass. Terrifying, but badass.
“‘Bout time ya showed up, Sabotta!” She grinned and put a cinderblock of a hand on my shoulder. I’m 5’10 and 180lbs, but she made me feel like a child in front of her. The power emanating from her fist was unbelievable. “C’mon, the trial match is starting and I don’t want no tourney without a broken in commentator! You gotta know the ropes of this place!”
“You know your driver was standing right outside when I was discharged, right? Couldn’t think to give me an extra day or two to freshen up?” I frowned. This wasn’t normal protocol, even for back-alley promotions like this. She just laughed at me and slapped my shoulder.
“The tournament waits for nobody, Sal. Time’s a-wasting.”
The hallway is dimly lit and the sounds of a ruckus above us are as impossible to ignore as the sounds of thudding, screaming and snapping. As we pass several doors with one-way mirrors on the front panes, I hear sounds I could have never placed in the animal kingdom or otherwise; gurgles, clicks, grunts and even otherworldly whispers.
“What the fuck is that? You guys doing animal fights down here? I mean, I called a monkey fight once, but it’s not exactly… pleasant.” I shuddered, thinking of the violence chimpanzees can inflict on one another, let alone humans. She never stopped walking or staring directly ahead when she responded. “Those ain’t animals. Not by a long shot.”
Before I can probe further, I’m hurried into a changing room and practically swept off my feet by her strength. I turn back and she’s already poking her head out the door.
“You’ve got 5 minutes, get your shit and head up the left stairs, Watson will guide you.” She grinned, and I saw gold filings in her teeth that glinted as much as her bedazzled eye patch. “Ya came highly recommended… I expect good things!”
I do as instructed and within 5 minutes I’m back in my commentary clothes; an open buttoned Hawaiian shirt with my old Hotel Inertia shirt underneath, skinny black jeans and shimmering black shoes. I found some old slick gorilla powder in my hair and dusted it up, opting for the dishevelled look as I knew I’d be sweating by the end of the ordeal.
“You shouldn’t bother putting in so much effort, y’know. They’re not gonna care how good you look, only how well you talk.”
Standing in the doorway was a woman in her 40s, dark-skinned and hair clad in meticulous dreadlocks, tied back into a large bun with a pair draped down the sides of her head. She held a thick book in one hand and pocketed a serrated blade in the other before motioning to me.
“We’ll have to do the pleasantries on the way, the match is starting and you don’t wanna miss that. The commissioner isn’t the type you want to upset. Especially when you’re not here by choice.” I looked for a moment, dumbfounded.
“I’m here because I was invited, already got my pay from the woman who let me in.” I shrugged, pocketing the envelope and getting my equipment from the suitcase. The woman gave a sad smile and shook her head.
“Of course you’d think that. She likes it that way. Bet she didn’t introduce herself either, did she? C’mon.”
I follow her down and after a few minutes we come to a fork in the hallway, an elevator system to our right and a stairway to the left. Dutifully, Watson stood patiently, still grinning and motioning us to go up.
Once we’re situated in our booth upstairs, I set my equipment up and look down at the table, expecting a slew of papers and fighter information in front of me. I look to the woman to ask, but she doesn’t break her stare in the darkness, looking down at the arena floor some 100ft below us. “You won’t need that. Not for this match.”
The lights flicker on and the enormity of this venue reveals itself to me. It’s a structure of imposing steel, dried blood, claw marks and other unknown substances that littered the 40ft-wide circular pit the fighters contested in, a black lift on either side at the fighters’ corners that I can only assume ascended up from their locker room area. Around them were chain-link fences that rose up to the audience stands above, seating around 300 people across all four sides. At the very top sat our booth, the commissioner’s office directly opposite, the judges’ booth to our right and the fight analysts/medical area to our left. Standing in the centre with a spotlight over them was the commissioner, microphone in hand and an energy that was almost palpable.
“Ladies, Gentlemen and Freaks of all kinds out there in the universe. I welcome you once more to the annual Nightmare Fighting Championship Tournament! It’s been a long year, but we have new blood to pit against our resident night terrors and some fresh fears to feast on the fortuitous soul that frolics into their den. As always, our contestants will be fighting for their freedom, a chance to get their wish or to fight for the ultimate prize.” The crowd cheers and the majority are hidden behind thick plexiglass and lighting, but I can see some have Karate Gi’s, weapons in hand and others with demon masks as they whoop and holler. The clientele here were, at least in my estimation, experienced. But I was feeling a lump in my throat at that one phrase The Commissioner so surreptitiously added in without issue;
“As always, our contestants will be fighting for their freedom*”*
I leaned to the woman next to me and, as if she knew what I was going to ask, she put a finger up and shook her head. Eyes awash with fear and a grimness I had only seen in trainers who knew their fighter was not ready for the bout ahead. She pointed the finger down to my machine, then to the pit. Turning it on, I looked down as the commissioner began to talk, readying myself to commentate on whatever weirdos came up to battle.
“But before we get to that, we have an exciting exhibition match for our loyal supporters who bankroll this event every year. Without you elite few, we could not do this. You are the pound-for-pound goats of support! Now, without further ado; let’s get this show on the road!” The rest of the lights clicked on and spun around the venue as they raised the profile of the bout, the elevators both whirring into action as the right one rose first.
“In this corner, from the marionettes shop and accompanied by his Bunraku doll “Mr. Stares”, it’s the man who pulls the strings… THE PUPPET MAN!”
Out steps a tall, thin Japanese man in full clown makeup. His head shaven save for two ridiculous strands of hair stretched out and fluffed up to their limits, like red antennae. His eyebrows large m’s that practically cover his forehead, the nose a completely vacant slot with a black hole drawn in and the mouth… the fucking mouth was nailed shut. Literally. Sharp rusted nails had been hammered down through the lips with such force that they’d bent. A sickening crimson red face-paint stretched across the entire bottom half of his face, making it seem far larger by comparison. He carefully held a small bundle underneath a sheet and bowed deeply to the audience before standing at his designated spot.
“In the other corner, from the streets of god knows where and the womb of someone who misses him… "Hulked Up" Michael O’Donnell!”
I watched with wide eyes and a stomach threatening to evacuate its contents at any moment as the smoke cleared and a boy no older than 17 rushed out, beating his chest and screaming to the crowd as if he was the Incredible Hulk. I don’t know if they drugged the poor kid, but he clearly had no idea where he was.
“There are no rules, no referees and judges only exist in case of a draw or unclear victory. Our commentary team will take over and we wish you a phenomenal match.” She drools a little before she speaks again, looking up at me and winking. “Let’s make this a violent one.”
She snaps her fingers and leaps for the fence, climbing up with ungodly ease before sitting on her makeshift chair in her office.
I have no idea what I’m seeing but every cell in my body is urging me to run; I feel my knees tense and my frame rise ever so slightly before the woman next to me puts her hand on my thigh, pushing me down with great force.
“You have a job to do, so do I. Trust me, you think you can leave but if you get out of this chair, not only will YOUR life end. Mine will too.” She unsheathes the serrated blade and looks at me with pity. “We both have a part to play here, so put the headset on and let’s do our job, no matter how hard it is.”
Hands shaking, I pick up the headset and connect it to the portable recorder and take a breath.
“I… I need your name. What is it you do?” I stutter, trying to calm myself. She hands me a bottle of water as the surrounding lights dim and the spotlight focuses on the spectacle below.
“I’m Madame Nelle Lockwood, cryptid hunter and your co-host to guide you through tonight. Good to meet you, Sal.”
-
NFC EXHIBITION MATCH: "Hulked Up" Michael O’Donnell vs The Puppet Man w/ Mr. Stares
“Welcome fight fans from around the world, god knows how you’re listening to this or WHY, but here we are. I’m your host Sal “MotorMouth” Sabotta, wishing this was all a bad dream. Joining me this evening is our cryptid specialist and all round badass Madame Nelle Lockwood. How are you doing, Nelle?”
She looks at me with a bewildered look on her face before blinking and coming to her senses.
“Uhh… good! All things considered… boy, you really have a professional knack for this, huh? I can see why Commissioner Alduin brought you in."
“Ahh, yes. That’s right, folks! NFC Commissioner Alduin invited me here personally and our exhibition match proves to be… challenging. Let’s check in on the action below.”
I look down and see The Puppet Man sat down and gesturing to the figure under the sheet, like he’s got a negotiation going on. The boy, undeterred and furious, rushes towards him and takes his back, slapping his head and even pulling on his hair with extreme prejudice.
“Well take a gander at that, that kid has absolutely NO fear. When I was his age, I would have stayed FAR the fuck away from a nightmare spectre like that. But hell, this is all part of the show, right? Hope they’re paying that poor guy down there a sizeable sum to throw a fight to a child. What do you think, Nelle; is this the weirdest make-a-wish fulfilment task or what?”
I look over to her, hoping she’d indulge me and that I could believe this was just going to end with a pissed off actor storming away when the child hit him too hard. But Nelle was scanning her now open book and looking for information on dolls.
“He’s talking to his doll because it’s desperate to be let loose. He’s trying to bargain with it to spare him. This is the nature of the puppeteer and his master.” She pushes the book to the centre of the table and shows me a faded illustration of a pristine Bunraku doll; a kind of meticulously crafted Japanese take on the ventriloquist doll. The limbs are thinner and the face is more minimalist, but still no more frightening. “They usually have a symbiotic relationship, but it seems this one obeys the doll and will not want to face more punishment.”
“What do you mean more punishment?” I ask, looking back down at the feverish puppet man as he tries signing frantically under the sheet, even putting his head under as the kid bites his arm and kicks him, screeching.
“The nails, Sal. Those aren’t to silence him, they’re to punish him.”
The rest happened in slow motion; the sheet fell down. The puppet man stood up and walked to his side of the fighters corner, facing the elevator and placing his face into his forearms as he shook. The boy followed to keep attacking, but with one swift kick to the midsection, the boy was propelled back to the centre of the pit where the doll sat.
If there was a human face, I didn’t see it. Instead, I was staring down at a small wood carved spider, the head sporting black geisha hair and the makeup still present, but rows of sharpened black teeth protruded from the clicking mouth and two larger eyes jutted out from the base of the skull, smaller ones dotted closely around it. It was like seeing a puppet ogre spider.
“Looks like The Puppet Man has let Mr. Stares out to say hi and I can certainly see why he was under that sheet, this one isn’t pretty folks! The face doth fit the name. The question is, what’s he going to do ne-”
I didn’t need to finish the question. My hands shook, and the world spun around me as this creature crawled towards the still wheezing boy with ungodly speed and perched itself expertly beside him. I don’t know if it was my eyes or the distance from where I sat, but this was NOT a small puppet. He was easily half of the boy’s height and that became more unnerving when he reared up on his back legs, the head clicking up and the raspy voice hissing out like a gas leak in a building.
“Hey, hey, kid! Wanna make a deal?” The kid rubbed his eyes, seemingly realising where he was as he calmed down and an air of utter confusion around him.
“If you let me be your new master and you promise to take care of me, I’ll let you go!” His head spun around and the jaw clicked ferociously as he giggled, extending out a clawed paw. “Whaddya say?”
The boy, still confused, slowly reached out his hand and the moment immediately reminded me of a slew of nature shows I’d seen as a kid; where a predator waits until the prey is lulled before striking. I felt the chill up my spine as he extended his hand and grabbed Mr. Stares.
In that moment, he leapt up the arm and bored his way into the boy’s mouth, down his throat, and shredded his flesh. The sound was so horrifying, so visceral that it outshines any backyard stabbing, joint snap or broken nose. The boy didn’t even have time to scream, he simply looked up with tear-stained eyes as the puppet disappeared.
Then he started walking without him realising. He looked down at his limbs, terrified, looked over at The Puppet Master, who still had his head to the elevator and pleaded with someone, anyone to help him. I looked to Nelle who refused to take her eyes away, studying the battle in an almost morbid scientific curiosity, detached entirely from the scenario.
I couldn’t fathom how she did it, how she ignored this boy begging us to get him out of there.
I wanted to. Every instinct in me as a fight fan and a decent human was to scream “STOP THE FIGHT!”.
But clearly, when my own life is at risk and money is involved...
I am not a decent human.
Instead, with bile in my throat and a sweating forehead, I did my job.
“M-My goodness! The p-puppet, I mean, “Mr. Stares” has BECOME the puppet master! Surely the fight will be over with our young competitor incapacitated? What does our commissioner have to say about this?”
She stared at me, her one eye gleaming and her face elated with the violence.
“It ain’t over yet, church boy. We haven’t even seen the finale, have we Puppet Master?!” She laughs and slaps her knee, the puppet master sobbing as he sinks to the floor and she continues.
“He ain’t done feeding, not yet.”
The way she said that word “feeding” nearly made me lose what food I had in me. That was a young man, somebody's baby boy…
“What does she mean by that, Nelle? What is the strategy to victory here?”
Nelle looked down at her book and traced her finger across a passage before wiping her forehead and pushing the locks aside. If her composure wasn’t breaking yet, it would do soon.
“This kind of parasitic doll feasts on its prey and targets non-essential organs first, controls the host with the neurotoxin in its tail and then, when it’s finally content, it gives the brain a second injection.”
“What happens then?” I asked, my own professionalism hanging on by a fucking thread at this point. She shook her head and pinched the bridge of her nose. “I guess you’ll see in a moment, I sure as hell don’t want to. Not again.”
Before I can prompt her further, the boy lets out an ear-piercing shriek and falls to his knees, gripping at his head before it turned red, then purple and finally an ugly shade of puce before…
The sound of a watermelon hitting the ground from a great height is the best comparison you’re going to get without making me want to rush to the toilet to puke for a third time. But that’s what happened. His head burst and chunks of his skull, flesh and brain matter sprayed the pit and the walls, some hitting my desk and making me audibly shriek, much to the commissioner's delight.
“HA! You didn’t run! I like you, Sal. You pass for the tournament!” She hauls her body up and slams down to the pit, applauding as the microphone descends from the heavens. “And your winner; The Puppet Man and Mr. Stares!”
The crowd erupts with applause as the weeping puppet man pulls the blood-soaked puppet out, places him under the sheet and silently begins to walk back to the elevator while attendees clear up the boy’s corpse.
“What… what the fuck IS this place?” I ask Nelle, pausing my recording.
“This is where nightmares are kept and set upon mostly unwilling competitors for the world’s amusement. You HAVE done dark web fights before, right? Mafia snitches being put into lions pits, bum fights, addicts fighting women to score… this can’t be THAT unusual to you?”
I stared at her incredulously. Was that even a question?
“I did the dark web ONCE and it damn sure didn’t involve monsters!”
She scoffs and closes her book, stretching before looking at me with contempt.
“Oh, it did. Just not the ones you hear about in fairytales. Good luck with the selection process. I’ll be back for the opening round. Don’t try to run, they’ll devour us both in minutes, if you think this is the pinnacle of what lurks beneath this club, you're in for a rough night.” She sauntered off, leaving me deflated, sickened and terrified. Unable to leave and frustrated to the point of tears that I couldn’t express that concoction of emotions, I did what I always do; I regressed and pressed “record” on the device as Commissioner Alduin continued.
At that moment, however, I was deaf to it all. The gravity of the situation had fully enveloped me…
They weren’t kidding about the unwilling participants, I just didn’t realise I would be one of them. On every side of me sit men and women with a desire for violence that goes beyond the norm, beyond the sane and beyond the boundaries of humanity. Below me are an untold number of creatures rattling their cages and howling for blood.
Across from me is a woman so powerful she could crush my skull beneath her boot with the utmost ease if it so amused her.
That invitation was nothing more than my own ransom note in pretty colours and flattering platitudes.
I was in a tournament housing nightmares incarnate.
And it would only get more violent from here on out.
-
The opening round was a blood bath.
submitted by tjaylea to nosleep [link] [comments]

Eth 2.0 vs Polkadot and other musings by a fundamental investor

Spent about two hours on this post and I decided it would help the community if I made it more visible. Comment was made as a response to this
I’m trying to avoid falling into a maximalist mindset over time. This isn’t a 100% ETH question, but I’m trying to stay educated about emerging tech.
Can someone help me see the downsides of diversifying into DOTs?
I know Polkadot is more centralized, VC backed, and generally against our ethos here. On chain governance might introduce some unknown risks. What else am I missing?
I see a bunch of posts about how Ethereum and Polkadot can thrive together, but are they not both L1 competitors?
Response:
What else am I missing?
The upsides.
Most of the guys responding to you here are full Eth maxis who drank the "Parity is bad" koolaid. They are married to their investment and basically emotional / tribal in an area where you should have a cool head. Sure, you might get more upvotes on Reddit if you do and say what the crowd wants, but do you want upvotes and fleeting validation or do you want returns on your investment? Do you want to be these guys or do you want to be the shareholder making bank off of those guys?
Disclaimer: I'm both an Eth whale and a Dot whale, and have been in crypto for close to a decade now. I originally bought ether sub $10 after researching it for at least a thousand hours. Rode to $1500 and down to $60. Iron hands - my intent has always been to reconsider my Eth position after proof of stake is out. I invested in the 2017 Dot public sale with the plan of flipping profits back to Eth but keeping Dots looks like the right short and long term play now. I am not a trader, I just take a deep tech dive every couple of years and invest in fundamentals.
Now as for your concerns:
I know Polkadot is more centralized
The sad truth is that the market doesn't really care about this. At all. There is no real statistic to show at what point a coin is "decentralized" or "too centralized". For example, bitcoin has been completely taken over by Chinese mining farms for about five years now. Last I checked, they control above 85% of the hashing power, they just spread it among different mining pools to make it look decentralized. They have had the ability to fake or block transactions for all this time but it has never been in their best interest to do so: messing with bitcoin in that way would crash its price, therefore their bitcoin holdings, their mining equipment, and their company stock (some of them worth billions) would evaporate. So they won't do it due to economics, but not because they can't.
That is the major point I want to get across; originally Bitcoin couldn't be messed with because it was decentralized, but now Bitcoin is centralized but it's still not messed with due to economics. It is basically ChinaCoin at this point, but the market doesn't care, and it still enjoys over 50% of the total crypto market cap.
So how does this relate to Polkadot? Well fortunately most chains - Ethereum included - are working towards proof of stake. This is obviously better for the environment, but it also has a massive benefit for token holders. If a hostile party wanted to take over a proof of stake chain they'd have to buy up a massive share of the network. The moment they force through a malicious transaction a proof of stake blockchain has the option to fork them off. It would be messy for a few days, but by the end of the week the hostile party would have a large amount of now worthless tokens, and the proof of stake community would have moved on to a version of the blockchain where the hostile party's tokens have been slashed to zero. So not only does the market not care about centralization (Bitcoin example), but proof of stake makes token holders even safer.
That being said, Polkadot's "centralization" is not that far off from Ethereum's. The Web3 Foundation kept 30% of the Dots while the Ethereum Foundation kept 17%. There are whales in Polkadot but Ethereum has them too - 40% of all genesis Ether went to 100 wallets, and many suspect that the original Ethereum ICO was sybiled to make it look more popular and decentralized than it really was. But you don't really care about that, do you? Neither do I. Whales are a fact of life.
VC backed
VCs are part of the crypto game now. There is no way to get rid of them, and there is no real reason why you should want to get rid of them. They put their capital at risk (same as you and me) and seek returns on their investment (same as you and me). They are both in Polkadot and Ethereum, and have been for years now. I have no issue with them as long as they don't play around with insider information, but that is another topic. To be honest, I would be worried if VCs did not endorse chains I'm researching, but maybe that's because my investing style isn't chasing hype and buying SUSHI style tokens from anonymous (at the time) developers. That's just playing hot potato. But hey, some people are good at that.
As to the amount of wallets that participated in the Polkadot ICO: a little known fact is that more individual wallets participated in Polkadot's ICO than Ethereum's, even though Polkadot never marketed their ICO rounds due to regulatory reasons.
generally against our ethos here
Kool aid.
Some guy that works(ed?) at Parity (who employs what, 200+ people?) correctly said that Ethereum is losing its tech lead and that offended the Ethereum hivemind. Oh no. So controversial. I'm so personally hurt by that.
Some guy that has been working for free on Ethereum basically forever correctly said that Polkadot is taking the blockchain tech crown. Do we A) Reflect on why he said that? or B) Rally the mob to chase him off?
"I did not quit social media, I quit Ethereum. I did not go dark, I just left the community. I am no longer coordinating hard forks, building testnets, or contributing otherwise. I did not work on Polkadot, I never did, I worked on Ethereum. I did not hate Ethereum, I loved it."
Also, Parity locked their funds (and those of about 500+ other wallets not owned by them) and proposed a solution to recover them. When the community voted no, they backed off and did not fork the chain, even though they had the influence to do so. For some reason this subreddit hates them for that, even though Parity did the 100% moral thing. Remember, 500+ other teams or people had their funds locked, so Parity was morally bound to try its best to recover them.
It's just lame drama to be honest. Nothing to do with ethos, everything to do with emotional tribalism.
Now for the missing upsides (I'll also respond to random fragments scattered in the thread):
This isn’t a 100% ETH question, but I’m trying to stay educated about emerging tech.
A good quick intro to Eth's tech vs Polkadot's tech can be found on this thread, especially this reply. That thread is basically mandatory reading if you care about your investment.
Eth 2.0's features will not really kick in for end users until about 2023. That means every dapp (except DeFi, where the fees make sense due to the returns and which is leading the fee market) that built on Eth's layer 1 is dead for three years. Remember the trading card games... Gods Unchained? How many players do you think are going to buy and sell cards when the transaction fee is worth more than the cards? All that development is now practically worthless until it can migrate to its own shard. This story repeats for hundreds of other dapp teams whose projects are now priced out for three years. So now they either have to migrate to one of the many unpopulated L2 options (which have their own list of problems and risks, but that's another topic) or they look for another platform, preferably one interoperable with Ethereum. Hence Polkadot's massive growth in developer activity. If you check out https://polkaproject.com/ you'll see 205 projects listed at the time of this post. About a week ago they had 202 listed. That means about one team migrated from another tech stack to build on Polkadot every two days, and trust me, many more will come in when parachains are finally activated, and it will be a complete no-brainer when Polkadot 2.0 is released.
Another huge upside for Polkadot is the Initial Parachain Offerings. Polkadot's version of ICOs. The biggest difference is that you can vote for parachains using your Dots to bind them to the relay chain, and you get some of the parachain's tokens in exchange. After a certain amount of time you get your Dots back. The tokenomics here are impressive: Dots are locked (reduced supply) instead of sold (sell pressure) and you still earn your staking rewards. There's no risk of scammers running away with your Ether and the governance mechanism allows for the community to defund incompetent devs who did not deliver what was promised.
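To make the flow concrete, here's a minimal Python sketch of the lock-and-receive mechanics described above. The reward rate and lease length are made-up placeholder numbers, not actual Polkadot parameters, and real crowdloans obviously involve on-chain logic rather than a toy function:

```python
# Toy model of an Initial Parachain Offering as described above. The reward rate
# and lease length are hypothetical placeholders, not real Polkadot parameters.

def ipo_contribution(dots_locked: float, reward_rate: float, lease_period_days: int) -> dict:
    """Lock DOTs to vote a parachain in; earn its tokens; DOTs unlock after the lease."""
    parachain_tokens = dots_locked * reward_rate   # paid in the parachain's own token
    return {
        "dots_locked": dots_locked,                # taken out of circulation, not sold
        "parachain_tokens_earned": parachain_tokens,
        "dots_returned_after_days": lease_period_days,
    }

print(ipo_contribution(dots_locked=100, reward_rate=5.0, lease_period_days=730))
# {'dots_locked': 100, 'parachain_tokens_earned': 500.0, 'dots_returned_after_days': 730}
```

The point of the sketch is simply that the contributor's Dots are locked rather than spent, which is why the mechanism reduces circulating supply instead of creating sell pressure.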
Wouldn’t an ETH shard on Polkadot gain a bunch of scaling benefits that we won’t see natively for a couple years?
Yes. That is correct. Both Edgeware and Moonbeam are EVM compatible. And if the original dapp teams don't migrate their projects someone else will fork them, exactly like SUSHI did to Uniswap, and how Acala is doing to MakerDao.
Although realistically Ethereum has a 5 yr headstart and devs haven't slowed down at all
Ethereum had a five year head start but it turns out that Polkadot has a three year tech lead.
Just because it's "EVM compatible" doesn't mean you can just plug Ethereum into Polkadot or vice versa. It just means they both understand Ethereum bytecode and you can potentially copy/paste contracts from Ethereum to Polkadot, but you'd still need to add a "bridge" between the two chains, so it adds additional complexity and extra steps compared to using any of the existing L2 scaling solutions.
That only applies if you are thinking from an Eth maximalist perspective. But if you think from Polkadot's side, why would you need to use the bridge back to Ethereum at all? Everything will be seamless, cheaper, and quicker once the ecosystem starts to flourish.
I see a bunch of posts about how Ethereum and Polkadot can thrive together, but are they not both L1 competitors?
They are competitors. Both have their strategies, and both have their strengths (tech vs time on the market) but they are clearly competing in my eyes. Which is a good thing, Apple and Samsung competing in the cell phone market just leads to more innovation for consumers. You can still invest in both if you like.
Edit - link to post and the rest of the conversation: https://www.reddit.com/ethfinance/comments/iooew6/daily_general_discussion_september_8_2020/g4h5yyq/
Edit 2 - one day later PolkaProject count is 210. Devs are getting the hint :)
submitted by redditsucks_goruqqus to polkadot_market [link] [comments]

Cryptocurrency pegged to electricity price

Meter.io aims to create a low-volatility currency that follows the price of 10 kWh of electricity.
Meter uses a hybrid PoW/PoS design: PoW mining for stablecoin creation and PoS for transaction ordering.
  1. MTR is a stablecoin soft-pegged to the globally competitive price of 10 kWh of electricity.
  2. MTRG is the finite-supply governance token, which is used by PoS validators to validate transactions.
PoW mining in Meter is as open and decentralized as in Bitcoin, but differs from Bitcoin's in two fundamental ways:
  1. Block rewards are dynamic. The reward is determined as a function of PoW difficulty: the winning Meter miner earns more MTR when hash rate is high and less MTR when hash rate is low, ensuring a stable cost of production of roughly 10 kWh of electricity per MTR on mainstream mining equipment (a toy calculation of this follows right after this list).
  2. Miners don't validate transactions. They simply compete to solve PoW. Transaction ordering is done by PoS validators, who secure the network and in return earn transaction fees.
All stablecoins essentially need stability mechanisms for both the case where demand is high and the case where demand is low. MTR has two stability mechanisms designed for this.
Supply side stability mechanism (long term)
First and foremost, MTR can't be produced out of thin air. Its issuance follows a disciplined monetary policy that depends solely on the profit-seeking behavior of miners. The only way to issue MTR is via PoW mining. When miners notice that the price of MTR is getting higher than the cost to produce it (remember, the cost of production is always fixed at the price of 10 kWh of electricity, around 0.9-1.2 USD), they will turn on their equipment and start creating new supply. If demand keeps increasing, more miners will join and more MTR will be printed to keep up with demand. Eventually supply will catch up with demand and the price will return to equilibrium.
When demand is low and the MTR price is dropping below the 10 kWh electricity price, miners will not let their profit margins shrink and will switch to mining other coins instead of MTR. As a result, MTR production stops and no additional MTR enters circulation. Given that mining is a competitive, open environment, the price of MTR will eventually equal the cost to produce it (Marginal Revenue = Marginal Cost).
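That supply-side loop boils down to a simple decision rule for a profit-seeking miner, sketched below. The $0.10/kWh electricity price is just an assumed number chosen to land in the 0.9-1.2 USD range mentioned above:

```python
# Toy version of the supply-side arbitrage above: profit-seeking miners only point
# hash rate at MTR when the market price exceeds the fixed production cost of
# 10 kWh of electricity. The $0.10/kWh figure is an assumed electricity price.

PRODUCTION_COST_USD = 10 * 0.10  # 10 kWh at an assumed $0.10/kWh = $1.00 per MTR

def miners_mint_new_mtr(mtr_price_usd: float) -> bool:
    return mtr_price_usd > PRODUCTION_COST_USD  # MR > MC -> mine, otherwise switch coins

for price in (0.90, 1.00, 1.20):
    print(price, miners_mint_new_mtr(price))  # 0.9 False, 1.0 False, 1.2 True
```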
Long-term stability is achieved through this unique and simple layer 1 mechanism, which doesn't require capital-inefficient collateral, complicated oracles, seigniorage shares or algorithmic rebasing mechanisms.
Relative to nation-based fiat currencies, the switching cost between cryptocurrencies is significantly lower. Sudden demand changes are therefore very common in crypto and must be addressed. A huge drop in demand may temporarily cause MTR to trade below its cost of production, making PoW mining a losing game. How can the system recover from that and restart production? Conversely, a sudden increase in demand may cause MTR to trade at a premium, making mining temporarily very profitable. Meter has a second-layer stability mechanism to absorb such sudden demand changes.
Demand side stability mechanism (short term)
An on-chain auction (will become live in October 2020) resets every 24 hours, offering a newly minted, fixed number of MTRG in exchange for bids in MTR. Participants bid at no specific price and at the end of the auction receive MTRG proportional to their percentage of the total bid. The main purpose of this auction is to consume MTR. A portion of the MTR bid in the auction (initially 60%) ends up going to a reserve that is collectively owned by MTRG holders, essentially taking it out of circulation. Future use of the MTR in the reserve can be decided by governance. The remaining 40% gets gradually distributed to PoS validators as block rewards. This reserve allocation ratio can be adjusted via governance depending on how much MTR needs to be removed from circulation at any point in time.
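A small Python sketch of how such an auction settles, using the pro-rata allocation and the initial 60/40 reserve/validator split described above (the bid amounts themselves are made up):

```python
# Sketch of one daily MTRG auction settlement: bids in MTR, a fixed MTRG batch
# split pro rata, and the collected MTR split 60/40 between the reserve and the
# PoS validators (the initial ratio mentioned above). Bid amounts are made up.

def settle_auction(bids_mtr: dict, mtrg_for_sale: float, reserve_ratio: float = 0.60):
    total_bid = sum(bids_mtr.values())
    allocations = {who: mtrg_for_sale * bid / total_bid for who, bid in bids_mtr.items()}
    mtr_to_reserve = total_bid * reserve_ratio        # effectively out of circulation
    mtr_to_validators = total_bid - mtr_to_reserve    # paid out as block rewards over time
    return allocations, mtr_to_reserve, mtr_to_validators

allocs, to_reserve, to_validators = settle_auction({"alice": 300, "bob": 700}, mtrg_for_sale=100)
print(allocs)                       # {'alice': 30.0, 'bob': 70.0}
print(to_reserve, to_validators)    # 600.0 400.0
```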
The Meter team is working to make Meter compatible with other blockchains. In fact, both MTR and MTRG can currently be bridged 1:1 to their Ethereum versions, eMTR and eMTRG respectively. In the near term, the stablecoin MTR is set out on a mission to serve as collateral and a crypto-native unit of account for DeFi.
submitted by cangurel to CryptoMoonShots [link] [comments]

Since they're calling for r/btc to be banned...

Maybe it's time to discuss bitcoin's history again. Credit to u/singularity87 for the original post over 3 years ago.

People should get the full story of bitcoin because it is probably one of the strangest of all reddit subs.
r/bitcoin, the main sub for the bitcoin community, is held and run by a person who goes by the pseudonym u/theymos. Theymos not only controls r/bitcoin, but also bitcoin.org and bitcointalk.org. These are the top three communication channels for the bitcoin community, all controlled by just one person.
For most of bitcoin's history this did not create a problem (at least not an obvious one anyway), until around mid 2015. This happened to be around the time a new player appeared on the scene, a for-profit company called Blockstream. Blockstream was made up of/hired many (but not all) of the main bitcoin developers. (To be clear, Blockstream was founded before mid 2015 but did not become publicly active until then.) A lot of people, including myself, tried to point out that there were some very serious potential conflicts of interest that could arise when one single company controls most of the main developers for the biggest decentralised and distributed cryptocurrency. There were a lot of unknowns but people seemed to give them the benefit of the doubt because they were apparently about to release some new software called "sidechains" that could offer some benefits to the network.
Not long after Blockstream came on the scene, the issue of bitcoin's scalability once again came to the forefront of the community. This issue had come up within the community a number of times since bitcoin's inception. Bitcoin, as dictated in the code, cannot handle any more than around 3 transactions per second at the moment. To put that in perspective, Paypal handles around 15 transactions per second on average and VISA handles something like 2000 transactions per second. The discussion in the community has been around how best to allow bitcoin to scale to allow a higher number of transactions in a given amount of time. I suggest that if anyone is interested in learning more about this problem from a technical angle, they go to r/btc and do a search. It's a complex issue but for many who have followed bitcoin for many years, the possible solutions seem relatively obvious. Essentially, the limit is currently put in place in just a few lines of code. This was not originally present when bitcoin was first released. It was in fact put in place afterwards as a measure to stop a bloating attack on the network. Because all bitcoin transactions have to be stored forever on the bitcoin network, someone could theoretically simply transmit a large number of transactions which would have to be stored by the entire network forever. When bitcoin was released, transactions were actually free as the only people running the network were enthusiasts. In fact a single bitcoin did not even have any specific value, so it would be impossible to set a fee value. This meant that a malicious person could make the size of the bitcoin ledger grow very rapidly without much/any cost, which would stop people from wanting to join the network due to the resource requirements needed to store it, which at the time would have been for very little gain.
Towards the end of the summer last year, this bitcoin scaling debate surfaced again as it was becoming clear that the transaction limit for bitcoin was semi-regularly being reached and that it would not be long until it would be regularly hit and the network would become congested. This was a very serious issue for a currency. Bitcoin had made progress over the years to the point of retailers starting to offer it as a payment option. Companies like Microsoft, Paypal, Steam and many more had begun to adopt it. If the transaction limit were constantly maxed out, the network would become unreliable and slow for users. Users and businesses would not be able to make a reliable estimate of when their transaction would be confirmed by the network.
Users, developers and businesses on r/bitcoin (which at the time was pretty much the only real bitcoin subreddit) started to discuss how we should solve the problem. There was significant support from the users and businesses behind a simple solution put forward by the developer Gavin Andresen. Gavin was the lead developer after Satoshi Nakamoto left bitcoin and handed the project over to him. Gavin initially proposed a very simple solution of increasing the limit, which was to change the few lines of code to increase the maximum number of transactions that are allowed. For most of bitcoin's history the transaction limit had been set far, far higher than the number of transactions that could potentially happen on the network. The concept of increasing the limit one time was based on the fact that history had proven that no issue had been caused by this in the past.
A certain group of bitcoin developers decided that increasing the limit by this amount was too much and that it was dangerous. They said that the increased use of resources by the network would create centralisation pressures which could destroy the network. The theory was that a miner with more resources could publish many more transactions than a competing small miner could handle and therefore the network would tend towards a few large miners rather than many small miners. The group of developers who supported this theory were all developers who worked for the company Blockstream. The argument from people in support of increasing the transaction capacity by this amount was that there are always inherent centralisation pressures with bitcoin mining. For example, miners who can access the cheapest electricity will tend to succeed, and bigger miners will be able to find this cheaper electricity more easily. Miners who have access to the most efficient computer chips will tend to succeed, and larger miners are more likely to be able to afford their development. The argument from Gavin and others who supported increasing the transaction capacity by this method was essentially that there are economies of scale in mining and that these economies create far bigger centralisation pressures than the increased resource cost of a larger number of transactions (up to the new limit proposed). For example, at the time the total size of the blockchain was around 50GB. A 500GB SSD costs only about $150 and would last a number of years. This is in comparison to the $100,000s in revenue per day a miner would be making.
Various developers put forth various other proposals, including Gavin Andresen who put forth a more conservative increase that would then continue to grow over time in line with technological improvements. Some of the employees of Blockstream also put forth proposals, but all were so conservative that it would take bitcoin many decades before it could reach the scale of VISA. Even though there was significant support from the community behind Gavin's simple proposal of increasing the limit, it was becoming clear that certain members of the bitcoin community who were part of Blockstream were starting to become increasingly vitriolic and divisive. Gavin then teamed up with one of the other main bitcoin developers, Mike Hearn, and released a coded (i.e. working) version of the bitcoin software that would only activate if it was supported by a significant majority of the network. What happened next was where things really started to get weird.
After this free and open source software was released, Theymos, the person who controls all the main communication channels for the bitcoin community, implemented a new moderation policy that disallowed any discussion of this new software. Specifically, if people were to discuss this software, their comments would be deleted and ultimately they would be banned temporarily or permanently. This caused chaos within the community as there was very clear support for this software at the time and it seemed our best hope for finally solving the problem and moving on. Instead a censorship campaign was started. At first 'all' they were doing was banning and removing discussions, but after a while it turned into actively manipulating the discussion. For example, if a thread was created where there was positive sentiment for increasing the transaction capacity, or negative sentiment about the moderation policies or about the actions of certain bitcoin developers, the mods of r/bitcoin would selectively change the sorting order of threads to 'controversial' so that the most supported opinions would be sorted to the bottom of the thread and the most vitriolic would be sorted to the top. This was initially very transparent as it was possible to see that the most downvoted comments were at the top and some of the most upvoted were at the bottom. So they then implemented hiding the voting scores next to the users' names. This made it impossible to work out the sentiment of the community, and when combined with selectively setting the sorting order to controversial it was possible to control what information users were seeing. Also, due to the very, very large number of removed comments and users, the scale of the censorship going on was becoming obvious. To hide this they implemented code in the CSS for the sub that completely hid comments they had removed, so that the censorship itself was hidden. Anyone in support of scaling bitcoin was removed from the main communication channels. Theymos even proudly announced that he didn't care if he had to remove 90% of the users. He also later acknowledged that he knew he had the ability to block support of this software using the control he had over the communication channels.
While this was all going on, Blockstream and its employees started lobbying the community by paying for conferences about scaling bitcoin, but with the very, very strange rule that no decisions could be made and no complete solutions could be proposed. These conferences were likely strategically (and successfully) created to stunt support for the scaling software Gavin and Mike had released by forcing the community to take a "let's wait and see what comes from the conferences" kind of approach. Since no final solutions were allowed at these conferences, they only served to hinder and splinter the community's efforts to find a solution. As the software Gavin and Mike released, called BitcoinXT, gained support it started to be attacked. Users of the software were attacked by DDoS. Employees of Blockstream were recommending attacks against the software, such as faking support for it only to then drop support at the last moment to put the network in disarray. Blockstream employees were also publicly talking about suing Gavin and Mike from various different angles simply for releasing this open source software that no one was forced to run. In the end Mike Hearn decided to leave due to the way many members of the bitcoin community had treated him. This was due to the massive disinformation campaign against him on r/bitcoin. One of the many tactics used against anyone who does not support Blockstream and the bitcoin developers who work for them is that you will be targeted in a smear campaign. This has happened to a number of individuals and companies who showed support for scaling bitcoin. Theymos has threatened companies that he will ban any discussion of them on the communication channels he controls (i.e. all the main ones) for simply running software that he disagrees with (i.e. any software that scales bitcoin).
As time passed, more and more proposals were offered, all against the backdrop of ever increasing censorship in the main bitcoin communication channels. It finally came down to the smallest and most conservative solution. This solution was much smaller than even the employees of Blockstream had proposed months earlier. As usual there were enormous attacks from all sides and the most vocal opponents were the employees of Blockstream. These attacks are still ongoing today. As this software started to gain support, Blockstream organised more meetings, especially with the biggest bitcoin miners, and made a pact with them. They promised that they would release code offering an on-chain scaling hardfork within about 4 months, but if the miners wanted this they would have to commit to running their software and only their software. The miners agreed and they ended up not running the most conservative proposal possible. This was in February last year. There is no hardfork proposal in sight from the people who agreed to this pact and bitcoin is still stuck with the exact same transaction limit it has had since the limit was put in place about 6 years ago. Gavin has also been publicly smeared by the developers at Blockstream and a plot was made against him to have him removed from the development team. Gavin has now been, for all intents and purposes, expelled from bitcoin development. This has meant that all control of bitcoin development is in the hands of the developers working at Blockstream.
There is a new proposal that offers a market-based approach to scaling bitcoin. This essentially lets the market decide. Of course, as usual there have been attacks against it, including verbal attacks from the employees of Blockstream. This has the biggest chance of gaining wide support and solving the problem for good.
To give you an idea of Blockstream: it has hired most of the main and active bitcoin developers and is now synonymous with the "Core" bitcoin development team. They have, AFAIK, no products at all. They have received around $75m in funding. Every single thing they do is supported by theymos. They have started implementing an entirely new economic system for bitcoin against the will of its users and have blocked any and all attempts to scale the network in line with the original vision.
Although this comment is ridiculously long, it really only covers the tip of the iceberg. You could write a book on the last two years of bitcoin. The things that have been going on have been mind blowing. One last thing that I think is worth talking about is u/bashco's claim of vote manipulation.
The users that the video talks about have very, very large numbers of downvotes, mostly due to them having a very, very high chance of being astroturfers. Around about the same time last year when Blockstream became active on the scene, every single bitcoin troll disappeared, and I mean literally every single one. In the years before that there were a large number of active anti-bitcoin trolls. They even have an active sub, r/buttcoin. Up until last year you could go down to the bottom of pretty much any thread in r/bitcoin and see many of the usual trolls who were heavily downvoted for saying something along the lines of "bitcoin is shit", "You guys and your tulips" etc. But suddenly last year they all disappeared. Instead a new type of bitcoin user appeared. Someone who said they were fully in support of bitcoin but just so happened to support every single thing Blockstream and its employees said and did. They had the exact same tone as the trolls who had disappeared. Their way of talking to people was aggressive, they'd call people names, and they had a relatively poor understanding of how bitcoin fundamentally worked. They were extremely argumentative. These users make up the majority of the list in that video. When the tens of thousands of users were censored and expelled from r/bitcoin they ended up congregating in r/btc. The strange thing was that the users listed in that video also moved over to r/btc and spend all day every day posting troll-like comments and misinformation. Naturally they get heavily downvoted by the real users in r/btc. They spend their time constantly causing as much drama as possible. At every opportunity they scream about "censorship" in r/btc while they are happy about the censorship in r/bitcoin. These people are astroturfers. What someone somewhere worked out is that all you have to do to take down a community is say that you are on their side. It is an astoundingly effective form of psychological attack.
submitted by CuriousTitmouse to btc [link] [comments]

Two Prime, under the radar coin worth looking into.

Two Prime has released their FF1 MacroToken.
"We show how this methodology can be applied as an Open Source application, in the vein of BTC and ETH, with all the creative and value generative potential that comes along with it. We leverage store of value functions of cryptocurrencies to arrive at value creation and accretion in the real economy by the intermediary of crypto exchanges on which we propose to provide protective measures. We detail treasury and reserve formation for the Open Source Finance Foundation, describe its relation to Two Prime and detail the emission of a new crypto-asset called the FF1 Token.
We seek liquidity for the FF1 treasury within the secondary exchanges for the purpose of applying M4 in the real world, both in the private and public sector. We first apply this to the vertical of cryptocurrencies while outlining the genericity and stability of the model, which we intend to apply to esoteric financial needs (e.g. Smart City financing). In so doing, we extend the scope and control of applications that a system of digital units of value stored on decentralized, public ledgers can aim to advance. We call this approach Open Source Finance and the resulting coin class a MacroToken.
MODERN MONETARY THEORY FRAMEWORK
Modern Monetary Theory states two interdependent phenomenological axioms, and the banking system operates on a resulting syllogism:
In the past 10 years, the formation and emergence of BTC and ETH has verifiably falsified Axiom 2 [1]. The phenomenon of crypto-currencies has created ab-initio global stores of value of type 1a. Crypto-currencies have displaced trust backed by government violence and associated, implied violence with, instead, open source distribution, cloud computing, objective mathematics, and the algorithmic integrity of blockchain ledgers. The first "killer app" of these open source ledgers is stores of value, e.g. Bitcoin, or "open source money" as it was first characterized by its semi-anonymous creators. Leading crypto-currencies have proven themselves as viable global stores of value. They are regulated as Gold is in the United States. However, as type 1a units of value, they have tended towards high volatility, inevitably leading to speculative market behavior and near-zero "real" asset-backing or floor price [2], albeit with an aggregate value of $350bn from ab-initio creation.
We therefore advance Axiom 2 to Axiom 2’
At N < 1 we have dilutive debasement of fungible units of value, aka inflation. At N = 1, the new monies are stablecoins. At N > 1, these tokens are designed to grow with demand. Axiom Two Prime (or 2') displaces government-endorsed violence as our macro-socio organizing principle with algorithmic objectivity and verifiable transparency. This occurs within the landscape we call Open Source Finance.
THE TWO PRIME MODEL
Two Prime refers to the financial management company managing the OSFF. FF1 refers to the Macro Token of the OSFF. The first stage is reserve and treasury formation, the second stage describes the mechanics of the public markets and the protective measures of the reserves and third stage is treasury liquidity via the Continuous Token Offering both in public and private markets. We will now describe these in more detail.
MACRO INVESTMENT THESIS AND RATIONALE FOR FF1
The FF1 MacroToken is a synthetic token based on the proven killer applications of crypto-currencies. More than 10 years since the inception of blockchain technology, the killer apps of crypto are already here and they are primarily financial, not technical. The historical killer apps are:
The FF1 MacroToken is a pot-pourri of these features: a synthetic token that mixes the best-of-breed practices of crypto, combining Store-of-Value, Capital Formation and Fractional Asset-Backing.
Treasury Generation: Ab-Initio Store of Value
On the supply side, the OSFF is creating 100,000,000 FF1 Macro Tokens, which it keeps in treasury. They are pure stores of value, for they have no assets backing them at birth. They are ab-initio instruments. The FF1 Macro Tokens are listed on public crypto exchanges. Two Prime manages market-making for these stores of value.
Treasury Management: Supply-Side Tokenomics
All FF1 are held in the Open Source Finance Foundation treasury. Assets that enter the treasury are, at first, not traded. The FF1 supply will be offered upon sufficient demand, which Two Prime generates publicly and privately. The total supply will be finite in total units (100,000,000), but variable in its aggregate value, for supply and demand will make the price move. The proceeds are the property of the OSFF (not Two Prime) and Two Prime invests the liquid treasury (post FF1 liquidation) in crypto assets to protect against depreciation and create a macro-hedge reserve and/or floor for the price. It should be noted that the price and the NAV of assets are, by design, not equal. In other words, the additional OSFF treasury is locked and can enter circulation if, and only if, there is a corresponding demand, which is then invested in crypto assets with a target value N > 1. This results in fractional asset-backing at first.
EXCHANGES, CONTINUOUS TOKEN OFFERING, AND DEMAND-SIDE TOKENOMICS
Public Exchanges
Two Prime will maintain listings for the FF1 Tokens on behalf of the OSFF. Two Prime maintains market-making operations on public crypto exchanges on behalf of the OSFF.
Continuous Token Offering
Two Prime works on creating new liquidity for the FF1 Macro Tokens to comply with the supply-side constraints detailed above, namely that a token enters circulation when matched by demand. Two Prime does demand generation in public as above as well as in private. This CTO results in something akin to a reverse ICO, letting the reserves be set by public trading and then marketing to private investors (accredited US, for example) after the public liquidity event. Demand generation is done via marketing to relevant audiences, e.g. as a macro way to HODL with exclusive private equity investments for crypto holders, and as a diversified and de-risked way to gain crypto exposure for FIAT holders (Sharpe ratio: 1.55, Beta to BTC: 0.75).
PARTNER NETWORK, USE OF PROCEEDS, ACCRETION AND FLOOR PROTECTION
Though this mathematical approach allows for a broad and differentiated set of financial applications and outcomes, Two Prime founding Members will first apply this work to the realm of project finance within the Blockchain space via algorithmic balancing of an equity- and debt-based treasury consisting of real crypto assets and future cash flows.
Proof of Value Mining in Partner Network
Funds and projects can apply to the foundation for financing. This is the partner network and is akin to the way a network of miners secures the chain. Here a network of partners protects the value. The Foundation invests the proceeds in liquid crypto assets, interest-bearing crypto assets and equity crypto assets via partner funds, creating a bridge to the real economy (crypto companies) in the last step. The foundation holds these (real economic) assets.
M4 Asset Mix
The funds raised are invested in public and private sector projects. We consider the following mix:
This completes the M4 step and the flow of funds for the FF1 Token. It shows a feedback loop, for the Foundation can buy back its token, leading to idiosyncratic tokenomics: the FF1 Token has a fixed (and potentially diminishing) SUPPLY alongside (potentially increasing) endogenous and exogenous DEMAND."
This seems pretty interesting imo, thoughts?
submitted by Stock-Accountant to CryptoMoonShots [link] [comments]

IPFS— The New-gen Tech Revolution, or Another Illusion?

IPFS— The New-gen Tech Revolution, or Another Illusion?
Founded in 2014, after 6 years of R&D and expansion and nearly a year of extensive testing and preparation, IPFS (InterPlanetary File System) was officially launched on the afternoon of October 15, 2020, UTC time. Twelve hours after the mainnet went online, its token price fluctuated between 50~70 USD/FIL. However, panic and pessimism began to spread through the IPFS community and among FIL token holders. Based on the total amount of FIL (2 billion) and an estimated unit price of 50 US dollars, its market value has exceeded 100 billion US dollars, second only to BTC. With such a highly valued IPFS/FIL going online, if there are not enough applications to support it, the selling pressure as FIL is gradually unlocked will become huge. According to calculations, on the first day of the mainnet launch there will be 239,000 FIL to be sold. Assuming a unit price of 30 USD, the released circulation will be around 8 million USD. Assuming the unit price remains unchanged, the market will see a similar release on the 10th day, with a release amount of 15 million USD; it is very likely that the corresponding token price can be supported.
IPFS is a network transmission protocol designed to create persistent, distributed storage and sharing of files. In terms of current active projects and companies, more than 5 billion files have been added to IPFS, spanning multiple industries, and many blockchain companies are also using this technology. Now that the IPFS mainnet and its Filecoin token are going online, the market value will depend on the applications brought by the IPFS network. After Bitcoin and Ethereum, Filecoin is an upstart in the blockchain industry with a revolutionary technological breakthrough. The market predicts that Filecoin's market value will surpass Bitcoin's. Now let us analyze this project, which has been given such high hopes:
IPFS major features and disadvantages
The basic application of IPFS, Filecoin’s financial attributes and its incentive mechanism make it a very exciting global collaborative open source project. On this basis, the data of all mankind is stored in the IPFS network, and no one can tamper with it.
This magnificent vision provides at least three kinds of value for us:
  1. It creates a storage network service that is license-free and trust-free. This is very important. When you want to access a digital file, you don't need approval or filing from any organization, nor strict certification. As for trust, it does not require the user to trust the supplier that provides the storage service, which significantly reduces costs.
  2. The successful application of IPFS will most likely enable all idle storage resources in the world to be gathered to form a network and be effectively used, and such a network is unprecedented.
  3. Through such a model, network redundancy can be effectively reduced, and the complete separation of data can be achieved. There is no need to store files in a fixed location, only the content needs to be stored in an IPFS and Filecoin network.
However, the ideal is rich while the reality is harsh. The design flaws of the IPFS project make it difficult to truly apply it in a practical environment. Its design flaws mainly lie in the following areas:
  1. Cannot support hot data storage.
Based on the principle of data timeliness, the higher the frequency of data access, the greater the value the data possesses. At present, IPFS only supports cold data storage scenarios. The lack of support for network transmission makes it impossible to establish a transmission network for hot data, which means the network lacks its most valuable kind of usage.
  2. The disaster tolerance mechanism is missing.
Disaster tolerance means that when an IT system stops working due to an accident (such as fire, earthquake, etc.), the entire application system can be switched to another location so that its functions continue to work normally. IPFS/Filecoin does not provide reliable disaster recovery and restoration mechanisms for storage users. Storage miners handle disaster recovery backup and restoration in a centralized manner, resulting in storage space redundancy increasing by 2-3 times.
  3. Storage performance is reduced by more than 60 times.
The IPFS data verification mechanism is too idealistic and complex, and its storage performance is more than 60 times lower than that of a traditional centralized storage system. A 1TB file usually needs more than 10 hours to be verified and cannot be stored normally and efficiently.
  4. Centralized technology architecture.
IPFS requires fairly advanced hardware, which creates a very high threshold for joining its storage network. At present, only specialized storage devices can join the IPFS network as storage nodes. This means that what IPFS initially advertised to users, connecting ordinary idle storage and reducing storage costs, has turned out to be empty talk. The network ends up stored in a centralized structure in disguise, which can neither greatly reduce the storage cost of the whole network nor ensure its security.
  5. Due to the lack of a decentralized governance mechanism, its governance is too autocratic, which causes a certain amount of harm to the participating communities.
The above are the main obstacles currently hindering IPFS and Filecoin. The good news is that some of them can be improved and perfected, while others are design problems that cannot be fixed. Now let's take a look at another project, initiated in the tech circle in 2017: HOP.
What is HOP
The HOP protocol provides a decentralized and completely anonymous traffic service for people all over the world, based on blockchain. HOP combines P2P network transmission and blockchain technology to establish a blockchain micro-payment protocol, based on an encrypted transmission protocol, between P2P network bandwidth contributors and bandwidth users, and merges it into traffic mining. On the mining pool side, the whole protocol is built on the mainnet, which has micro-payment and mining functions. In addition, HOP also supports traffic mining with any ERC20 token. So far, HOP is the only protocol that combines the above functions and is officially available in commercial applications. It can provide terminal nodes for secure access to decentralized networks.
HOP features and comparison with IPFS
HOP and IPFS have certain similarities. The following table is a comparison of the two projects in terms of technology and application characteristics:

https://preview.redd.it/d4klovzngmt51.png?width=1178&format=png&auto=webp&s=f14f0b2290430f6861f6da27f2d1e47ed196b741
Why HOP might be a phenomenal project in the future
Compared with the disadvantages of IPFS (no support for hot data storage, low storage efficiency, low disaster tolerance and a high threshold for providing storage capacity), the advantages of HOP can be summarized as follows:
  1. High operating efficiency.
The smart micro-payment system processes payments with unlimited TPS, and its efficiency is 90% higher than Ethereum's.
  2. High level of open source. Supports access for all ERC-20 tokens.
  3. Low threshold of participation.
Any participant who has a certain fundamental knowledge of computer science and network technology can set up mining pools and miners.
  4. High scalability.
In actual application scenarios it can be combined with Starlink satellites, repeaters, SIM/eSIM cards and mobile phones to form a next-generation, globally distributed and interconnected communication network.
At present, most of the participants in HOP are top-tier tech specialists, well-known investors and politicians with a global vision. We believe that projects like HOP, thanks to their open source and far-sighted technical foundation, can not only achieve an internally self-consistent circulation but also integrate well with external ecosystems. Like water, HOP has unlimited inclusiveness and scalability, and strong platform-level vitality!
We are looking forward to the growth of HOP, bringing a revolution in technology and applications to the blockchain world and beyond!
submitted by Hayley_HOP to u/Hayley_HOP [link] [comments]

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter and from that moment I have a keen interest in smart contact platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio. Especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on 'silica', silicon dioxide, as in "silicon for the high-throughput consensus computer."
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in the white paper and also created their own open source intermediate level smart contract language called Scilla (functional programming language similar to OCaml) too.
 
Mainnet is live since the end of January 2019 with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions, 500.000+ addresses in total along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion with annual inflation being 7.13% currently and will only decrease with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding onto a public smart contract blockchain where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralised, secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
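Purely to illustrate the idea of network sharding (this is not Zilliqa's actual assignment algorithm, which is driven by PoW results and the DS committee), here's a toy Python sketch that splits a node set into shards:

```python
# Illustration only: splitting a node set into shards. Zilliqa's real assignment
# is done by the DS committee based on PoW submissions; this toy just shows the
# "divide miners into smaller committees" idea behind network sharding.
import random

def assign_to_shards(node_ids, num_shards):
    shuffled = list(node_ids)
    random.shuffle(shuffled)                       # unbiased membership
    shards = {i: [] for i in range(num_shards)}
    for idx, node in enumerate(shuffled):
        shards[idx % num_shards].append(node)      # spread nodes evenly across shards
    return shards

shards = assign_to_shards([f"node{i}" for i in range(12)], num_shards=3)
print({shard_id: len(members) for shard_id, members in shards.items()})  # {0: 4, 1: 4, 2: 4}
```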
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned, there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is determined by the result of a PoW cycle (Ethash) at the beginning of each DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty will be allowed on the network. To put it in perspective: the average difficulty for one DS node is ~2 TH/s, equaling 2,000,000 MH/s, or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 MH/s each. Each DS Block, 10 new DS nodes are allowed in. And a shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
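A quick sanity check on those hardware numbers, taking the quoted 35.4 MH/s per GTX 1070 at face value:

```python
# Sanity check on the quoted hardware figures, assuming 35.4 MH/s per GTX 1070.
MHS_PER_GTX_1070 = 35.4

ds_node_mhs = 2_000_000   # ~2 TH/s expressed in MH/s
shard_node_mhs = 8_530    # ~8.53 GH/s expressed in MH/s

print(round(ds_node_mhs / MHS_PER_GTX_1070))     # ~56497 GPUs -> "55 thousand+"
print(round(shard_node_mhs / MHS_PER_GTX_1070))  # ~241 GPUs   -> "around 240"
```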
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with these identities. And after every 100 Tx Blocks which corresponds to roughly 1,5 hour this PoW process repeats. In between these 1,5 hour, no PoW needs to be done meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take a traffic light as an example: all its states (red, amber, and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light's state going from green to red (via amber) and another light's from red to green.
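To make the state machine idea concrete, here's a minimal Python sketch of the traffic light example (nothing blockchain-specific, just predefined states and valid transitions):

```python
# A traffic light as a tiny state machine: every state and every valid transition
# is known in advance, so anyone can check whether a proposed transition is legal.
VALID_TRANSITIONS = {"green": "amber", "amber": "red", "red": "green"}

def next_state(current: str) -> str:
    if current not in VALID_TRANSITIONS:
        raise ValueError(f"unknown state: {current}")
    return VALID_TRANSITIONS[current]

state = "green"
for _ in range(4):
    state = next_state(state)
    print(state)  # amber, red, green, amber
```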
 
With public blockchains like Zilliqa, this isn't so straightforward and simple. It started with block #1 almost 1.5 years ago and every 45 seconds or so a new block linked to the previous block is added. The result is a chain of blocks with transactions in them that everyone can verify from block #1 to the current #647,000+ block. The state is ever changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that's rather insignificant compared to a public blockchain. Because Zilliqa consists of 2400 nodes who need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, the block and DS height, and then hit refresh. Obviously, as expected, you see new incremented values for one or all parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, and as such there is no GPU involved (only CPU), resulting in a low total energy consumption to keep the blockchain secure, decentralized and scalable.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) and the consensus protocol will keep functioning without stalling or hiccups. Once there are more than ⅓ dishonest nodes but no more than ⅔, the network will stall and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (66%+) do double-spend attacks become possible.
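Applied to a 600-node committee and using the standard BFT bound of n >= 3f + 1, those thresholds look roughly like this (a hedged sketch, not Zilliqa's actual implementation):

```python
# Hedged sketch of the pBFT thresholds above for a 600-node committee,
# using the classic BFT bound n >= 3f + 1 (not Zilliqa's actual code).
import math

def pbft_thresholds(n: int):
    max_faulty_tolerated = (n - 1) // 3        # consensus keeps running up to here
    safety_lost_above = math.floor(2 * n / 3)  # beyond 2/3 dishonest: double-spends possible
    return max_faulty_tolerated, safety_lost_above

tolerated, unsafe_above = pbft_thresholds(600)
print(tolerated)     # 199 dishonest nodes tolerated without stalling
print(unsafe_above)  # above 400 dishonest nodes, safety guarantees are gone
```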
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it's done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we only skimmed through some of the inner workings of Zilliqa's consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven't looked at yet is the amount of decentralization.
 
Decentralisation
 
Currently, there are four shards, each of them consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The amount of shard guards has been steadily declining from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT. There is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of a maximum of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them for the greater public. They were centralised at first. Decentralisation at the seed nodes level has been steadily rolled out since March 2020 ( Zilliqa Improvement Proposal 3 ). Currently the amount of seed nodes is being increased, they are public-facing and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes since the beginning in 2019, and those are used to pay out ZIL stakers. At an annual yield of 10.03%, that 5% of block rewards translates into roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZIL to a smart contract created by Zilliqa and audited by Quantstamp.
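A quick consistency check of those staking figures (plain Python; the implied annual emission is derived from the quoted numbers, not an official figure):

```python
# Back-of-the-envelope check of the staking figures quoted above (illustrative only).
seed_node_share = 0.05        # 5% of block rewards go to seed nodes / stakers
target_yield = 0.1003         # advertised annual staking yield
stakeable_zil = 610_000_000   # ~610 MM ZIL quoted as the total stakeable amount

annual_staking_rewards = stakeable_zil * target_yield          # ZIL paid to stakers per year
implied_annual_emission = annual_staking_rewards / seed_node_share

print(f"Annual staking rewards: {annual_staking_rewards:,.0f} ZIL")          # ~61 MM ZIL
print(f"Implied total annual block rewards: {implied_annual_emission:,.0f} ZIL")  # ~1.2 B ZIL
```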
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralised too, Zilliqa qualifies for the label of decentralised in my opinion.
 
Smart contracts
 
Let me start by saying I'm not a developer and my programming skills are quite limited. So I'm taking the ELI5 route (maybe ELI12), but if you are familiar with JavaScript, Solidity or specifically OCaml, please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa's smart contract language Scilla works - and if you ask yourself "why another programming language?" check this article. And if you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. And more information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: *“all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”*
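A toy contrast of the two styles in Python (not Scilla or OCaml): the same counter modelled once as an object that bundles data and behaviour, and once as a pure function that keeps them separate:

```python
# Object-oriented style: data (count) and behaviour (increment) live together in one object.
class Counter:
    def __init__(self) -> None:
        self.count = 0

    def increment(self) -> None:
        self.count += 1          # behaviour mutates the object's own data


# Functional style: data is a plain value; behaviour is a pure function returning new data.
def increment(count: int) -> int:
    return count + 1             # no hidden state; same input always gives the same output


c = Counter()
c.increment()
print(c.count)                   # 1  (state lives inside the object)
print(increment(0))              # 1  (state is passed in and returned explicitly)
```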
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics and won a so-called Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain - programming on it - means you cannot afford to make mistakes. Otherwise it could cost you dearly. It's all great and fun that blockchains are immutable, but updating your code because you found a bug isn't as simple as it is with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
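A minimal sketch of that fee model (plain Python; the gas-used/gas-price split shown here is a made-up illustration - only the 0.001 ZIL total for a simple transfer is taken from the text):

```python
# Illustrative fee arithmetic: total fee = gas used * gas price.
def tx_fee(gas_used: int, gas_price_zil: float) -> float:
    return gas_used * gas_price_zil

# Hypothetical split that reproduces the ~0.001 ZIL simple-transfer fee quoted above.
simple_transfer_fee = tx_fee(gas_used=50, gas_price_zil=0.00002)
print(f"Simple transfer: {simple_transfer_fee} ZIL")          # 0.001 ZIL

# A contract call touching more state burns more gas, so it costs proportionally more.
contract_call_fee = tx_fee(gas_used=5_000, gas_price_zil=0.00002)
print(f"Contract call:   {contract_call_fee} ZIL")            # 0.1 ZIL
```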
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
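As a toy illustration of the "unexpected change to a critical state variable" example (plain Python, not Scilla): the unguarded version lets anyone hijack ownership, while the guarded version checks the sender first:

```python
class ToyContract:
    """Toy model of a contract with an owner field (not real Scilla semantics)."""

    def __init__(self, owner: str) -> None:
        self.owner = owner

    def set_owner_unsafe(self, sender: str, new_owner: str) -> None:
        # Vulnerable: the sender is never checked, so anyone can take over the contract.
        self.owner = new_owner

    def set_owner_safe(self, sender: str, new_owner: str) -> None:
        # Guarded: only the current owner may transfer ownership.
        if sender != self.owner:
            raise PermissionError("sender is not the owner")
        self.owner = new_owner


c = ToyContract(owner="alice")
c.set_owner_unsafe(sender="mallory", new_owner="mallory")   # silently succeeds: the bug
print(c.owner)                                              # mallory

c = ToyContract(owner="alice")
try:
    c.set_owner_safe(sender="mallory", new_owner="mallory") # rejected
except PermissionError as e:
    print(e)
```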
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
Scilla is being developed hand-in-hand with the formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.
 
Simply put, with Scilla and its accompanying tooling, developers can be mathematically sure - and can prove - that the smart contract they've written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define three categories of transactions: a transfer from address A to B (Category 1), a transaction where a user interacts with a single smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions it works if the address is in the same shard as the smart contract, but with Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
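A simplified sketch of those three categories and when cross-shard communication is needed (plain Python; the shard-assignment rule here is a toy assumption, not Zilliqa's actual protocol):

```python
NUM_SHARDS = 4

def shard_of(address: str) -> int:
    """Toy shard assignment by hashing the address (illustration only, not the real rule)."""
    return hash(address) % NUM_SHARDS

def category(contracts: list[str]) -> int:
    """Category 1: plain transfer; 2: one contract involved; 3: multiple contracts."""
    if not contracts:
        return 1
    return 2 if len(contracts) == 1 else 3

def needs_cross_shard(sender: str, contracts: list[str]) -> bool:
    """True when the addresses involved do not all live in the same shard."""
    shards = {shard_of(sender)} | {shard_of(c) for c in contracts}
    return len(shards) > 1

print(category([]), needs_cross_shard("0xA", []))                      # Category 1, False
print(category(["0xC1"]), needs_cross_shard("0xA", ["0xC1"]))          # Category 2, shard-dependent
print(category(["0xC1", "0xC2"]), needs_cross_shard("0xA", ["0xC1", "0xC2"]))  # Category 3
```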
 
And this is where the trade-off around state sharding currently comes in. All shards in Zilliqa have access to the complete state. Yes, the state (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means they don't need to shop around for information held on other shards, which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It's not only technology in which Zilliqa seems to be excelling: their ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PWC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought those initiatives live to market. There is also a growing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developers Building Cities Beyond have recently created a $15 million innovation grant to encourage development on their ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody to read this text that Amrit Kumar wrote (one of the co-founders). These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies will for example start to use stablecoins for payments or remittances, instead of it solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dated November 2019) which seems to align well with their OpFi strategy. A non-custodial DEX made by Switcheo is coming to Zilliqa, allowing cross-chain trading (atomic swaps) between ETH-, EOS- and ZIL-based tokens. They have also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulation and compliance, I speculate it will be a regulated USD stablecoin. Furthermore, XSGD has already been created and is visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not only from ASEAN or Singapore but global: see the Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions onchain due to its ability to scale and its resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, having several different fiat-gateways and recently have been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”. They have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been somewhat frozen in the last months, Zilliqa seems to be on its own bull run. It was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k members and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It's a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos, etc.), none of them seems to have started a similar initiative (correct me if I'm wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’ and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have thousands of signups and they plan to keep upgrading it so more and more people will use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real-time. It's a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven't covered everything (I'm also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance Margin, Futures, the Widget, entering the Indian market, and more. The Head of Marketing Colin Miles has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Syscoin Platform’s Great Reddit Scaling Bake-off Proposal


https://preview.redd.it/rqt2dldyg8e51.jpg?width=1044&format=pjpg&auto=webp&s=777ae9d4fbbb54c3540682b72700fc4ba3de0a44
We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution!
Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This will provide a scalable token layer built specifically for speed and high volumes of simple value transfers at a very low cost, while providing sovereign ownership and onchain finality.
Token transfers scale by taking advantage of a globally sorting mempool that provides for probabilistically secure assumptions of “as good as settled”. The opportunity here for token receivers is to have an app-layer interactivity on the speed/security tradeoff (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high-throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full-nodes. Similar to Bitcoin, however, these nodes are incentivized to run full-nodes for the benefit of network security, through a bonded validator scheme. These nodes do not participate in the consensus of transactions or block validation any differently than other nodes and therefore do not degrade the security model of Bitcoin’s validate first then trust, across every node. Each token transfer settles on-chain. The protocol follows Bitcoin core policies so it has adequate code coverage and protocol hardening to be qualified as production quality software. It shares a significant portion of Bitcoin’s own hashpower through merged-mining.
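One way to picture the app-layer speed/security tradeoff described above is as a receiver-side acceptance policy; a minimal sketch in Python (not Syscoin code - the 50-token threshold and the confidence figure are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class TokenTransfer:
    amount: float
    zdag_confidence: float   # relay-level assurance, e.g. 0.999999 within ~10 seconds
    onchain_confirms: int    # blocks since on-chain settlement (60-second block target)

def accept(tx: TokenTransfer, small_payment_limit: float = 50.0) -> bool:
    """Receiver policy: take Z-DAG assurance for small amounts, wait for a block otherwise."""
    if tx.amount <= small_payment_limit:
        return tx.zdag_confidence >= 0.999999      # treat as 'as good as settled'
    return tx.onchain_confirms >= 1                # larger value: wait for on-chain finality

print(accept(TokenTransfer(5, 0.999999, 0)))       # True  - tip accepted within seconds
print(accept(TokenTransfer(500, 0.999999, 0)))     # False - wait for the next block
print(accept(TokenTransfer(500, 0.999999, 1)))     # True  - settled on-chain
```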
This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability whilst remaining decentralized according to Bitcoin design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions. The bridge and token platform are currently available on the Syscoin mainnet. This has been gaining recent attention for use by loyalty point programs and stablecoins such as Binance USD.

Solutions

Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost-savings and scaling benefits at some sacrifice of token autonomy. The second offers more preservation of autonomy with a more narrow scope of cost savings than the first option, but savings even so. The third introduces more complexity than the previous two yet provides the most overall benefits. We consider the third as most viable as it enables Reddit to benefit even while retaining existing smart contract functionality. We will focus on the third option, and include the first two for good measure.
  1. Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is distribution and subscription handling likely migrating away from smart contracts into the application layer.
  2. The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers. Tips redeemable in ten seconds or less, a high-throughput relay network, and onchain settlement at a block target of 60 seconds.
  3. Integration between Matic Network and Syscoin Platform - similar to Syscoin’s current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It’s “the best of both worlds”. The trade-off is more complex interoperability.

Syscoin + Matic Integration

Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both.
Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.

Proving Performance & Cost Savings

Our POC focuses on 100,000 on-chain settlements of token transfers on the Syscoin Core blockchain. Transfers and burns perform equally with Syscoin. For POCs related to smart contracts (subscriptions, etc), refer to the Matic Network proposal.
On-chain settlement of 100k transactions was accomplished within roughly twelve minutes, well-exceeding Reddit’s expectation of five days. This was performed using six full-nodes operating on compute-optimized AWS c4.2xlarge instances which were geographically distributed (Virginia, London, Sao Paulo Brazil, Oregon, Singapore, Germany). A higher quantity of settlements could be reached within the same time-frame with more broadcasting nodes involved, or using hosts with more resources for faster execution of the process.
Addresses used: 100,014
The demonstration was executed using this tool. The results can be seen in the following blocks:
612722: https://sys1.bcfn.ca/block/6d47796d043bb4c508d29123e6ae81b051f5e0aaef849f253c8f3a6942a022ce
612723: https://sys1.bcfn.ca/block/8e2077f743461b90f80b4bef502f564933a8e04de97972901f3d65cfadcf1faf
612724: https://sys1.bcfn.ca/block/205436d25b1b499fce44c29567c5c807beaca915b83cc9f3c35b0d76dbb11f6e
612725: https://sys1.bcfn.ca/block/776d1b1a0f90f655a6bbdf559ff5072459cbdc5682d7615ff4b78c00babdc237
612726: https://sys1.bcfn.ca/block/de4df0994253742a1ac8ac9eec8d2a8c8b0a6d72c53d6f3caa29bb6c171b0a6b
612727: https://sys1.bcfn.ca/block/e5e167c52a9decb313fbaadf49a5e34cb490f8084f642a850385476d4ef10d70
612728: https://sys1.bcfn.ca/block/ab64d989edc71890e7b5b8491c20e9a27520dc45a5f7c776d3dae79057f59fe7
612729: https://sys1.bcfn.ca/block/5e8b7ecd0e36f99d07e4ea6e135fc952bf7ec30164ab6f4d1e98b0f2d405df6d
612730: https://sys1.bcfn.ca/block/d395df3d31dde60bbb0bece6bd5b358297da878f0beb96be389e5f0e043580a3
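For context, here is the settlement rate implied by the POC figures above (simple arithmetic only):

```python
# Throughput implied by the POC figures quoted above.
transactions = 100_000
minutes = 12                      # "roughly twelve minutes"
tps = transactions / (minutes * 60)
print(f"~{tps:.0f} settled transactions per second")    # ~139 TPS

# Reddit's stated expectation was 5 days for the same volume:
expected_tps = transactions / (5 * 24 * 3600)
print(f"Reddit baseline: ~{expected_tps:.2f} TPS")      # ~0.23 TPS
```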
It is important to note that this POC is not focused on Z-DAG. The performance of Z-DAG has been benchmarked within realistic network conditions: Whiteblock’s audit is publicly available. Network latency tests showed an average TPS around 15k with burst capacity up to 61k. Zero-latency control group exhibited ~150k TPS. Mainnet testing of the Z-DAG network is achievable and will require further coordination and additional resources.
Even further optimizations are expected in the upcoming Syscoin Core release which will implement a UTXO model for our token layer bringing further efficiency as well as open the door to additional scaling technology currently under research by our team and academic partners. At present our token layer is account-based, similar to Ethereum. Opt-in compliance structures will also be introduced soon which will offer some positive performance characteristics as well. It makes the most sense to implement these optimizations before performing another benchmark for Z-DAG, especially on the mainnet considering the resources required to stress-test this network.

Cost Savings

Total cost for these 100k transactions: $0.63 USD
See the live fee comparison for savings estimation between transactions on Ethereum and Syscoin. Below is a snapshot at time of writing:
ETH price: $318.55; ETH gas price: 55.00 Gwei ($0.37)
Syscoin price: $0.11
Snapshot of live fee comparison chart
Z-DAG provides a more efficient fee-market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds or allowed to settle on-chain beforehand. The costs should remain about this low for microtransactions.
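A quick check of these cost figures using the snapshot numbers (plain Python; the 21,000 gas for a simple Ethereum transfer is a standard figure assumed here to reproduce the $0.37 quoted in the snapshot):

```python
# Syscoin side: per-transaction cost in USD at the quoted prices.
sys_price_usd = 0.11
zdag_fee_sys = 0.0000582
print(f"Z-DAG tx: ${zdag_fee_sys * sys_price_usd:.7f}")        # ~$0.0000064 per transfer

poc_total_usd = 0.63
print(f"POC per-tx cost: ${poc_total_usd / 100_000:.7f}")      # ~$0.0000063 per settled tx

# Ethereum side: a simple transfer burns 21,000 gas (standard assumption, not from the snapshot).
eth_price_usd = 318.55
gas_price_gwei = 55.0
eth_fee_usd = 21_000 * gas_price_gwei * 1e-9 * eth_price_usd
print(f"ETH simple transfer: ${eth_fee_usd:.2f}")              # ~$0.37, matching the snapshot
```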
Syscoin will achieve further reduction of fees and even greater scalability with offchain payment channels for assets, with Z-DAG as a resilience fallback. New payment channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, without requiring anyone to enter a fee market (aka “wait for a block”). This exceeds the minimum LN expectation of two transactions per person, per year: one to exist on-chain and one to settle aggregated value.
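That capacity claim can be roughly sanity-checked against the block-bandwidth figures listed in the specs further below (16 MB per minute, ~200-byte transactions); a back-of-the-envelope sketch:

```python
# Rough capacity check for the payment-channel claim above (order-of-magnitude only).
block_bandwidth_bytes_per_min = 16 * 1024 * 1024    # 16 MB per minute (spec figure below)
avg_tx_bytes = 200
onchain_tps = block_bandwidth_bytes_per_min / avg_tx_bytes / 60
print(f"On-chain capacity: ~{onchain_tps:.0f} TPS")               # ~1,400 TPS

population = 7.8e9
tx_per_person_per_year = 5
required_tps = population * tx_per_person_per_year / (365 * 24 * 3600)
print(f"Needed for 5 tx/person/year: ~{required_tps:.0f} TPS")    # ~1,237 TPS
```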

Tools, Infrastructure & Documentation

Syscoin Bridge

Mainnet Demonstration of Syscoin Bridge with the Basic Attention Token ERC-20
A two-way blockchain interoperability system that uses Simple Payment Verification to enable:
  • Any Standard ERC-20 token to be moved from Ethereum to the Syscoin blockchain as a Syscoin Platform Token (SPT), and back to Ethereum
  • Any SPT to be moved from Syscoin to the Ethereum blockchain as an ERC-20 token, and back to Syscoin

Benefits

  • Permissionless
  • No counterparties involved
  • No trading mechanisms involved
  • No third-party liquidity providers required
  • Cross-chain Fractional Supply - 2-way peg - Token supply maintained globally
  • ERC-20s gain vastly improved transactionality with the Syscoin Token Platform, along with the security of bitcoin-core-compliant PoW.
  • SPTs gain access to all the tooling, applications and capabilities of Ethereum for ERC-20, including smart contracts.
https://preview.redd.it/l8t2m8ldh8e51.png?width=1180&format=png&auto=webp&s=b0a955a0181746dc79aff718bd0bf607d3c3aa23
https://preview.redd.it/26htnxzfh8e51.png?width=1180&format=png&auto=webp&s=d0383d3c2ee836c9f60b57eca35542e9545f741d

Source code

https://github.com/syscoin/?q=sysethereum
Main Subprojects

API

Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.

Documentation

Syscoin Bridge & How it Works (description and process flow)
Superblock Validation Battles
HOWTO: Provision the Bridge for your ERC-20
HOWTO: Setup an Agent
Developer & User Diligence

Trade-off

The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model that involves roles of Superblock challengers and submitters. This model is open to participation. The benefits here are trust-minimization, permissionless-ness, and potentially less legal/regulatory red-tape than interop mechanisms that involve liquidity providers and/or trading mechanisms.
The trade-off is that due to the decentralized nature there are cross-chain settlement times of one hour to cross from Ethereum to Syscoin, and three hours to cross from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zkp. Even so, an “instant bridge” experience could be provided by means of a third-party liquidity mechanism. That option exists but is not required for bridge functionality today. Typically bridges are used with batch value, not with high frequencies of smaller values, and generally it is advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is good to mention here.

Cost

Ethereum -> Syscoin: Matic or Ethereum transaction fee for bridge contract interaction, negligible Syscoin transaction fee for minting tokens
Syscoin -> Ethereum: Negligible Syscoin transaction fee for burning tokens, 0.01% transaction fee paid to Bridge Agent in the form of the ERC-20, Matic or Ethereum transaction fee for contract interaction.

Z-DAG

Zero-Confirmation Directed Acyclic Graph is an instant settlement protocol that is used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) where validating nodes verify the sequential ordering of transactions that are received in their memory pools. Z-DAG is used by the validating nodes across the network to ensure that there is absolute consensus on the ordering of transactions and no balances are overflowed (no double-spends).
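A heavily simplified model of the "no balances are overflowed" check (plain Python, not the actual Z-DAG implementation): each node tracks the amounts a sender has already committed in its mempool view and rejects a conflicting transfer before it ever reaches a block:

```python
class MempoolGuard:
    """Toy per-node view: reject transfers that would overspend a sender's balance."""

    def __init__(self, balances: dict[str, float]) -> None:
        self.balances = dict(balances)   # confirmed balances
        self.pending_out = {}            # sender -> total amount already promised in the mempool

    def admit(self, sender: str, amount: float) -> bool:
        already_pending = self.pending_out.get(sender, 0.0)
        if already_pending + amount > self.balances.get(sender, 0.0):
            return False                 # would overflow the balance: a double-spend attempt
        self.pending_out[sender] = already_pending + amount
        return True


node = MempoolGuard({"alice": 10.0})
print(node.admit("alice", 6.0))   # True  - first spend fits the balance
print(node.admit("alice", 6.0))   # False - conflicting second spend is rejected
```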

Benefits

  • Unique fee-market that is more efficient for microtransaction redemption and settlement
  • Uses decentralized means to enable tokens with value transfer scalability that is comparable or exceeds that of credit card networks
  • Provides high throughput and secure fulfillment even if blocks are full
  • Probabilistic and interactive
  • 99.9999% security assurance within 10 seconds
  • Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
  • Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus
https://preview.redd.it/pgbx84jih8e51.png?width=1614&format=png&auto=webp&s=5f631d42a33dc698365eb8dd184b6d442def6640

Source code

https://github.com/syscoin/syscoin

API

Syscoin-js provides tooling for all Syscoin Core RPCs including interactivity with Z-DAG.

Documentation

Z-DAG White Paper
Useful read: An in-depth Z-DAG discussion between Syscoin Core developer Jag Sidhu and Brave Software Research Engineer Gonçalo Pestana

Trade-off

Z-DAG enables the ideal speed/security tradeoff to be determined per use-case in the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This is supported on the premise that a Reddit user receiving points does need security yet generally doesn’t want nor need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles onchain at a block target of 60 seconds.

Syscoin Specs

Syscoin 3.0 White Paper
(4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
  • 16MB block bandwidth per minute assuming segwit witness carrying transactions, and transactions ~200 bytes on average
  • SHA256 merge mined with Bitcoin
  • UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
  • Z-DAG on asset layer, bridge to Ethereum on asset layer
  • On-chain scaling with prospect of enabling enterprise grade reliable trustless payment processing with on/offchain hybrid solution
  • Focus only on Simple Value Transfers. MVP of blockchain consensus footprint is balances and ownership of them. Everything else can reduce data availability in exchange for scale (Ethereum 2.0 model). We leave that to other designs, we focus on transfers.
  • Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
  • Zero-knowledge Proofs are a cryptographic new frontier. We are dabbling here to generalize the concept of bridging and also verify the state of a chain efficiently. We also apply it in our Digital Identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin softwares for clients). We are also looking to integrate privacy preserving payment channels for off-chain payments through zkSNARK hub & spoke design which does not suffer from the HTLC attack vectors evident on LN. Much of the issues plaguing Lightning Network can be resolved using a zkSNARK design whilst also providing the ability to do a multi-asset payment channel system. Currently we found a showstopper attack (American Call Option) on LN if we were to use multiple-assets. This would not exist in a system such as this.

Wallets

Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications and expected for release not long after mainnet deployment of Syscoin Core 4.2. Both of these will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask.
Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware wallet support currently exists via ElectrumSys. First-class HW wallet integration through apps such as Ledger Live will exist after 4.2.
Current supported wallets
Syscoin Spark Desktop
Syscoin-Qt

Explorers

Mainnet: https://sys1.bcfn.ca (Blockbook)
Testnet: https://explorer-testnet.blockchainfoundry.co

Thank you for close consideration of our proposal. We look forward to feedback, and to working with the Reddit community to implement an ideal solution using Syscoin Platform!

submitted by sidhujag to ethereum [link] [comments]

Related video titles: Bitcoin Sidechains & SPV Proofs · Sidechains, Halong Mining, Roger Ver and More Hard Forks - Off Chain Daily, 2017.11.23 · How to send and receive bitcoin in your block chain wallet · How to get Bitcoins and Block chain explained 2019 [Mining, Free Bitcoins etc.] · Let the REAL Bitcoin please stand up (1st chain split): BITCOIN CASH

Sidechains cannot create the same asset that is pegged to Bitcoin to use as a reward for mining. Obviously, it would be problematic if anyone could put a certain amount of Bitcoin into a sidechain and get out more or less than what was originally put in. However, there are many other ways to reward miners; transaction fees are one example ... This describes the computing power that miners put into Bitcoin mining. After shooting upward for a good year as if there were no tomorrow, growth has slowed noticeably since mid-September. Yesterday there was a small drop that pushed the hashrate from 291 million gigahash per second down to 232 million GH/s. With that, the difficulty could ... A sidechain is a method of separating blockchains: instead of using only the primary blockchain, a user can transfer his digital assets to a supplementary one. Various blockchain platforms operate with sidechains for faster transactions and payments: Ardor, RSK, etc. This is particularly attractive in the merged-mining scenario, as it presents only upside for the existing bitcoin mining network, no matter how small the commercial upside might be. An alternative method, called demurrage, charges users a fee for holding and storing their tokens on the sidechain, and this fee is distributed amongst miners based on their contribution. Conclusion. There will ... It has a two-way peg with the Bitcoin blockchain and rewards Bitcoin miners via merged mining. RSK's goal is to enable the Bitcoin blockchain to have smart contract capabilities and make payments much faster. 2. Ardor's blockchain-as-a-service platform for business: Ardor uses the Proof of Stake consensus mechanism. Ardor calls its sidechains 'childchains', and they are tightly ...


Bitcoin Sidechains & SPV Proofs

# Blockstream Sidechain Proposal: https://blockstream.com/sidechains.pdf # RSK Bitcoin Mainnet Sidechain w/ Solidity EVM: https://media.rsk.co/bamboo-mainnet-... · Bitcoin Backed Altcoins, Side Chains - Bitcoin and Cryptocurrency Technologies Part 10 - Altcoins and the Cryptocurrency Ecosystem: Hundreds of altcoins, or alternative cryptocurrencies, have been ... · Join us to discuss Blockstream, Sidechains and Bitcoin 2.0. · Top Bitcoin Core Dev Greg Maxwell DevCore: Must watch talk on mining, block size, and more - Duration: 55:04. The Bitcoin Foundation ... · Sidechains taxonomy article, Halong Mining's new DragonMint miner, Roger Ver holding more BCH than BTC? More Hard Forks, Bitcoin Silver and Bitcoin Diamond Si... · This video tells you everything about Bitcoin: how to get free bitcoins, what Bitcoin is, whether it is safe, why you should buy, how the mining process works, why only ...
